BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

111 related articles for article (PubMed ID: 38935471)

  • 21. Non-Structured DNN Weight Pruning-Is It Beneficial in Any Platform?
    Ma X; Lin S; Ye S; He Z; Zhang L; Yuan G; Tan SH; Li Z; Fan D; Qian X; Lin X; Ma K; Wang Y
    IEEE Trans Neural Netw Learn Syst; 2022 Sep; 33(9):4930-4944. PubMed ID: 33735086

  • 22. Resource-constrained FPGA/DNN co-design.
    Zhang Z; Kouzani AZ
    Neural Comput Appl; 2021; 33(21):14741-14751. PubMed ID: 34025038

  • 23. TECO: A Unified Feature Map Compression Framework Based on Transform and Entropy.
    Shi Y; Wang M; Cao T; Lin J; Wang Z
IEEE Trans Neural Netw Learn Syst; 2023 Sep; PP (Epub ahead of print). PubMed ID: 37703155

  • 24. PCA driven mixed filter pruning for efficient convNets.
    Ahmed W; Ansari S; Hanif M; Khalil A
    PLoS One; 2022; 17(1):e0262386. PubMed ID: 35073373

  • 25. Divergences in color perception between deep neural networks and humans.
    Nadler EO; Darragh-Ford E; Desikan BS; Conaway C; Chu M; Hull T; Guilbeault D
    Cognition; 2023 Dec; 241():105621. PubMed ID: 37716312

  • 26. Compression of Deep Neural Networks based on quantized tensor decomposition to implement on reconfigurable hardware platforms.
    Nekooei A; Safari S
    Neural Netw; 2022 Jun; 150():350-363. PubMed ID: 35344706

  • 27. Lite It Fly: An All-Deformable-Butterfly Network.
    Lin R; Li JCL; Zhou J; Huang B; Ran J; Wong N
IEEE Trans Neural Netw Learn Syst; 2023 Nov; PP (Epub ahead of print). PubMed ID: 38015682

  • 28. An effective low-rank compression with a joint rank selection followed by a compression-friendly training.
    Eo M; Kang S; Rhee W
    Neural Netw; 2023 Apr; 161():165-177. PubMed ID: 36745941

  • 29. CRESPR: Modular sparsification of DNNs to improve pruning performance and model interpretability.
    Kang T; Ding W; Chen P
    Neural Netw; 2024 Apr; 172():106067. PubMed ID: 38199151

  • 30. Compact Model Training by Low-Rank Projection With Energy Transfer.
    Guo K; Lin Z; Chen C; Xing X; Liu F; Xu X
IEEE Trans Neural Netw Learn Syst; 2024 Jun; PP (Epub ahead of print). PubMed ID: 38843062

  • 31. Dynamic Spatial Sparsification for Efficient Vision Transformers and Convolutional Neural Networks.
    Rao Y; Liu Z; Zhao W; Zhou J; Lu J
    IEEE Trans Pattern Anal Mach Intell; 2023 Sep; 45(9):10883-10897. PubMed ID: 37030709

  • 32. Weak sub-network pruning for strong and efficient neural networks.
    Guo Q; Wu XJ; Kittler J; Feng Z
    Neural Netw; 2021 Dec; 144():614-626. PubMed ID: 34653719

  • 33. A Progressive Subnetwork Searching Framework for Dynamic Inference.
    Yang L; He Z; Cao Y; Fan D
    IEEE Trans Neural Netw Learn Syst; 2024 Mar; 35(3):3809-3820. PubMed ID: 36063528

  • 34. Kronecker CP Decomposition With Fast Multiplication for Compressing RNNs.
    Wang D; Wu B; Zhao G; Yao M; Chen H; Deng L; Yan T; Li G
    IEEE Trans Neural Netw Learn Syst; 2023 May; 34(5):2205-2219. PubMed ID: 34534089

  • 35. Feature flow regularization: Improving structured sparsity in deep neural networks.
    Wu Y; Lan Y; Zhang L; Xiang Y
    Neural Netw; 2023 Apr; 161():598-613. PubMed ID: 36822145

  • 36. A Little Energy Goes a Long Way: Build an Energy-Efficient, Accurate Spiking Neural Network From Convolutional Neural Network.
    Wu D; Yi X; Huang X
    Front Neurosci; 2022; 16():759900. PubMed ID: 35692427

  • 37. Improved Dynamic Graph Learning through Fault-Tolerant Sparsification.
    Zhu CJ; Storandt S; Lam KY; Han S; Bi J
    Proc Mach Learn Res; 2019 Jun; 97():7624-7633. PubMed ID: 35814489

  • 38. Multidimensional Data Processing With Bayesian Inference via Structural Block Decomposition.
    Luo Q; Yang M; Li W; Xiao M
    IEEE Trans Cybern; 2024 May; 54(5):3132-3145. PubMed ID: 37022029

  • 39. Theory-Inspired Deep Network for Instantaneous-Frequency Extraction and Subsignals Recovery From Discrete Blind-Source Data.
    Han N; Mhaskar HN; Chui CK
IEEE Trans Neural Netw Learn Syst; 2021 Feb; PP (Epub ahead of print). PubMed ID: 33566766

  • 40. On the combinatorics of sparsification.
    Huang FW; Reidys CM
    Algorithms Mol Biol; 2012 Oct; 7(1):28. PubMed ID: 23088372
