2. The AIC criterion and symmetrizing the Kullback-Leibler divergence. Seghouane AK, Amari S. IEEE Trans Neural Netw. 2007 Jan;18(1):97-106. PubMed ID: 17278464.
3. Entropy estimation in Turing's perspective. Zhang Z. Neural Comput. 2012 May;24(5):1368-89. PubMed ID: 22295985.
4. Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence. Noh YK, Sugiyama M, Liu S, Plessis MCD, Park FC, Lee DD. Neural Comput. 2018 Jul;30(7):1930-1960. PubMed ID: 29902113.
5. Model averaging based on Kullback-Leibler distance. Zhang X, Zou G, Carroll RJ. Stat Sin. 2015;25:1583-1598. PubMed ID: 27761098.
6. Entropy production and Kullback-Leibler divergence between stationary trajectories of discrete systems. Roldán E, Parrondo JM. Phys Rev E Stat Nonlin Soft Matter Phys. 2012 Mar;85(3 Pt 1):031129. PubMed ID: 22587060.
7. Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors. Camaglia F, Nemenman I, Mora T, Walczak AM. Phys Rev E. 2024 Feb;109(2-1):024305. PubMed ID: 38491647.
8. A mutual information estimator with exponentially decaying bias. Zhang Z, Zheng L. Stat Appl Genet Mol Biol. 2015 Jun;14(3):243-52. PubMed ID: 25941916.
9. Nonparametric identification and maximum likelihood estimation for hidden Markov models. Alexandrovich G, Holzmann H, Leister A. Biometrika. 2016 Jun;103(2):423-434. PubMed ID: 27279667.
10. Asymptotic optimality of likelihood-based cross-validation. van der Laan MJ, Dudoit S, Keles S. Stat Appl Genet Mol Biol. 2004;3:Article 4. PubMed ID: 16646820.
11. Integration of stochastic models by minimizing alpha-divergence. Amari S. Neural Comput. 2007 Oct;19(10):2780-96. PubMed ID: 17716012.
12. Distributions of the Kullback-Leibler divergence with applications. Belov DI, Armstrong RD. Br J Math Stat Psychol. 2011 May;64(Pt 2):291-309. PubMed ID: 21492134.
13. Information estimators for weighted observations. Hino H, Murata N. Neural Netw. 2013 Oct;46:260-75. PubMed ID: 23859828.
14. Investigating the performance of AIC in selecting phylogenetic models. Jhwueng DC, Huzurbazar S, O'Meara BC, Liu L. Stat Appl Genet Mol Biol. 2014 Aug;13(4):459-75. PubMed ID: 24867284.
15. Minimax Estimation of Functionals of Discrete Distributions. Jiao J, Venkat K, Han Y, Weissman T. IEEE Trans Inf Theory. 2015 May;61(5):2835-2885. PubMed ID: 29375152.
16. Identification of directed influence: Granger causality, Kullback-Leibler divergence, and complexity. Seghouane AK, Amari S. Neural Comput. 2012 Jul;24(7):1722-39. PubMed ID: 22428593.
18. Stratified doubly robust estimators for the average causal effect. Hattori S, Henmi M. Biometrics. 2014 Jun;70(2):270-7. PubMed ID: 24571129.
19. A Kullback-Leibler Divergence for Bayesian Model Diagnostics. Wang CP, Ghosh M. Open J Stat. 2011 Oct;1(3):172-184. PubMed ID: 25414801.
20. Nonparametric estimation of the concordance correlation coefficient under univariate censoring. Guo Y, Manatunga AK. Biometrics. 2007 Mar;63(1):164-72. PubMed ID: 17447941.