4. Compression for Similarity Identification: Computing the Error Exponent. Ingber A; Weissman T. Proc Data Compress Conf. 2015 Apr;2015:413-422. PubMed ID: 29046895.
5. Detection Games under Fully Active Adversaries. Tondi B; Merhav N; Barni M. Entropy (Basel). 2018 Dec;21(1). PubMed ID: 33266739.
6. Distributed Hypothesis Testing over a Noisy Channel: Error-Exponents Trade-Off. Sreekumar S; Gündüz D. Entropy (Basel). 2023 Feb;25(2). PubMed ID: 36832670.
7. Optimum Achievable Rates in Two Random Number Generation Problems with. Nomura R; Yagi H. Entropy (Basel). 2024 Sep;26(9). PubMed ID: 39330099.
8. Network Compression: Worst Case Analysis. Asnani H; Shomorony I; Avestimehr AS; Weissman T. IEEE Trans Inf Theory. 2015 Jul;61(7):3980-3995. PubMed ID: 29375153.
19. Anomalous scaling, nonlocality, and anisotropy in a model of the passively advected vector field. Adzhemyan LT; Antonov NV; Runov AV. Phys Rev E Stat Nonlin Soft Matter Phys. 2001 Oct;64(4 Pt 2):046310. PubMed ID: 11690149.
20. A Survey on Error Exponents in Distributed Hypothesis Testing: Connections with Information Theory, Interpretations, and Applications. Espinosa S; Silva JF; Céspedes S. Entropy (Basel). 2024 Jul;26(7). PubMed ID: 39056958.