These tools will no longer be maintained as of December 31, 2024.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

117 related articles for article (PubMed ID: 33716357)

  • 1. Comparing and weighting imperfect models using D-probabilities.
    Li M; Dunson DB
    J Am Stat Assoc; 2020; 115(531):1349-1360. PubMed ID: 33716357

  • 2. Bayesian Data Selection.
    Weinstein EN; Miller JW
    J Mach Learn Res; 2023; 24(23):. PubMed ID: 37206375

  • 3. Applications of a Kullback-Leibler Divergence for Comparing Non-nested Models.
    Wang CP; Jo B
    Stat Modelling; 2013 Dec; 13(5-6):409-429. PubMed ID: 24795532

  • 4. Computation of Kullback-Leibler Divergence in Bayesian Networks.
    Moral S; Cano A; Gómez-Olmedo M
    Entropy (Basel); 2021 Aug; 23(9):. PubMed ID: 34573747

  • 5. A Dirichlet Process Prior Approach for Covariate Selection.
    Cabras S
    Entropy (Basel); 2020 Aug; 22(9):. PubMed ID: 33286717

  • 6. Kullback-Leibler Divergence Based Probabilistic Approach for Device-Free Localization Using Channel State Information.
    Gao R; Zhang J; Xiao W; Li Y
    Sensors (Basel); 2019 Nov; 19(21):. PubMed ID: 31684166

  • 7. Systematic Bayesian posterior analysis guided by Kullback-Leibler divergence facilitates hypothesis formation.
    Huber HA; Georgia SK; Finley SD
    J Theor Biol; 2023 Feb; 558():111341. PubMed ID: 36335999

  • 8. A Kullback-Leibler Divergence for Bayesian Model Diagnostics.
    Wang CP; Ghosh M
    Open J Stat; 2011 Oct; 1(3):172-184. PubMed ID: 25414801

  • 9. The AIC criterion and symmetrizing the Kullback-Leibler divergence.
    Seghouane AK; Amari S
    IEEE Trans Neural Netw; 2007 Jan; 18(1):97-106. PubMed ID: 17278464

  • 10. A Bayesian Motivated Two-Sample Test Based on Kernel Density Estimates.
    Merchant N; Hart JD
    Entropy (Basel); 2022 Aug; 24(8):. PubMed ID: 36010735

  • 11. An evaluation of prior influence on the predictive ability of Bayesian model averaging.
    St-Louis V; Clayton MK; Pidgeon AM; Radeloff VC
    Oecologia; 2012 Mar; 168(3):719-26. PubMed ID: 21947451

  • 12. Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors.
    Camaglia F; Nemenman I; Mora T; Walczak AM
    Phys Rev E; 2024 Feb; 109(2-1):024305. PubMed ID: 38491647

  • 13. Nonparametric estimation of Küllback-Leibler divergence.
    Zhang Z; Grabchak M
    Neural Comput; 2014 Nov; 26(11):2570-93. PubMed ID: 25058703

  • 14. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.
    Karabatsos G
    Behav Res Methods; 2017 Feb; 49(1):335-362. PubMed ID: 26956682

  • 15. Precise periodic components estimation for chronobiological signals through Bayesian Inference with sparsity enforcing prior.
    Dumitru M; Mohammad-Djafari A; Sain SB
    EURASIP J Bioinform Syst Biol; 2016 Dec; 2016(1):3. PubMed ID: 26834783

  • 16. Posterior Averaging Information Criterion.
    Zhou S
    Entropy (Basel); 2023 Mar; 25(3):. PubMed ID: 36981356

  • 17. Nonparametric identification and maximum likelihood estimation for hidden Markov models.
    Alexandrovich G; Holzmann H; Leister A
    Biometrika; 2016 Jun; 103(2):423-434. PubMed ID: 27279667

  • 18. Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions.
    Nielsen F
    Entropy (Basel); 2021 Oct; 23(11):. PubMed ID: 34828115

  • 19. Principles of Bayesian Inference Using General Divergence Criteria.
    Jewson J; Smith JQ; Holmes C
    Entropy (Basel); 2018 Jun; 20(6):. PubMed ID: 33265532

  • 20. Knot selection in sparse Gaussian processes with a variational objective function.
    Garton N; Niemi J; Carriquiry A
    Stat Anal Data Min; 2020 Aug; 13(4):324-336. PubMed ID: 32742538
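Most of the entries above apply the Kullback-Leibler divergence to Bayesian model comparison or selection. As a quick reference for the quantity these papers share, here is a minimal sketch of computing D_KL(P || Q) for two discrete distributions; the example probabilities are illustrative, not drawn from any of the cited works.

```python
import math

def kl_divergence(p, q):
    """Compute D_KL(P || Q) = sum_i p_i * log(p_i / q_i) for discrete
    distributions given as probability lists of equal length.
    Assumes q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0.
    Nonnegative, and zero exactly when P and Q coincide."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (hypothetical values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive number: P and Q are close
```

Note that D_KL is asymmetric, so kl_divergence(p, q) generally differs from kl_divergence(q, p); entry 9 above (Seghouane and Amari) discusses symmetrized variants in the context of the AIC criterion.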
