107 related articles for article (PubMed ID: 35813017)
1. Role of Data Augmentation Strategies in Knowledge Distillation for Wearable Sensor Data.
Jeon ES; Som A; Shukla A; Hasanaj K; Buman MP; Turaga P
IEEE Internet Things J; 2022 Jul; 9(14):12848-12860. PubMed ID: 35813017
2. Topological Knowledge Distillation for Wearable Sensor Data.
Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
Conf Rec Asilomar Conf Signals Syst Comput; 2022; 2022:837-842. PubMed ID: 37583442
3. Memory-Replay Knowledge Distillation.
Wang J; Zhang P; Li Y
Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068
4. Comparison of End-to-End Neural Network Architectures and Data Augmentation Methods for Automatic Infant Motility Assessment Using Wearable Sensors.
Airaksinen M; Vanhatalo S; Räsänen O
Sensors (Basel); 2023 Apr; 23(7). PubMed ID: 37050833
5. Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data.
Jeon ES; Choi H; Shukla A; Wang Y; Lee H; Buman MP; Turaga P
Eng Appl Artif Intell; 2024 Apr; 130. PubMed ID: 38282698
6. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
Sensors (Basel); 2021 Sep; 21(19). PubMed ID: 34640843
7. Constrained Adaptive Distillation Based on Topological Persistence for Wearable Sensor Data.
Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
IEEE Trans Instrum Meas; 2023; 72. PubMed ID: 38818128
8. Feature Representation and Data Augmentation for Human Activity Classification Based on Wearable IMU Sensor Data Using a Deep LSTM Neural Network.
Steven Eyobu O; Han DS
Sensors (Basel); 2018 Aug; 18(9). PubMed ID: 30200377
9. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
Liu Y; Wang K; Li G; Lin L
IEEE Trans Image Process; 2021; 30:5573-5588. PubMed ID: 34110991
10. w-HAR: An Activity Recognition Dataset and Framework Using Low-Power Wearable Devices.
Bhat G; Tran N; Shill H; Ogras UY
Sensors (Basel); 2020 Sep; 20(18). PubMed ID: 32962046
11. Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks.
Wang L; Yoon KJ
IEEE Trans Pattern Anal Mach Intell; 2022 Jun; 44(6):3048-3068. PubMed ID: 33513099
12. Model-based data augmentation for user-independent fatigue estimation.
Jiang Y; Malliaras P; Chen B; Kulić D
Comput Biol Med; 2021 Oct; 137:104839. PubMed ID: 34520991
13. DARE: Distill and Reinforce Ensemble Neural Networks for Climate-Domain Processing.
Xiang K; Fujii A
Entropy (Basel); 2023 Apr; 25(4). PubMed ID: 37190431
14. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
Zhao H; Sun X; Dong J; Chen C; Dong Z
IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909
15. Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.
Cho J; Lee M
Sensors (Basel); 2019 Oct; 19(19). PubMed ID: 31590266
16. Deep Transfer Learning for Time Series Data Based on Sensor Modality Classification.
Li F; Shirahama K; Nisar MA; Huang X; Grzegorzek M
Sensors (Basel); 2020 Jul; 20(15). PubMed ID: 32751855
17. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
Sensors (Basel); 2022 Mar; 22(5). PubMed ID: 35271074
18. The effects of physics-based data augmentation on the generalizability of deep neural networks: Demonstration on nodule false-positive reduction.
Omigbodun AO; Noo F; McNitt-Gray M; Hsu W; Hsieh SS
Med Phys; 2019 Oct; 46(10):4563-4574. PubMed ID: 31396974
19. Spot-Adaptive Knowledge Distillation.
Song J; Chen Y; Ye J; Song M
IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832
20. Deep neural network and data augmentation methodology for off-axis iris segmentation in wearable headsets.
Varkarakis V; Bazrafkan S; Corcoran P
Neural Netw; 2020 Jan; 121():101-121. PubMed ID: 31541879