EEG and Sleep Staging
EEG is classified by electrode placement into scalp EEG and intracranial EEG; in most contexts, scalp EEG is meant.
Scalp EEG is the trace obtained by amplifying and recording the brain's spontaneous electrical activity from the scalp. The EEG signal is extremely weak, on the order of mV or µV, and it is further attenuated by the skull and scalp, so it must be amplified several million times before it can be displayed; after passing through filters (to reduce interference), it becomes the waveform we see.
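The two steps above (large-gain amplification, then filtering) can be sketched in pure Python. This is only an illustration: the moving-average low-pass filter stands in for the real analog filter chain of an EEG amplifier, and the gain, window size, and sampling rate are assumed values.

```python
import math

def amplify_and_filter(signal_uv, gain=1_000_000, window=5):
    """Amplify a microvolt-level signal, then smooth it with a
    moving-average low-pass filter (a toy stand-in for the analog
    filter chain in a real EEG amplifier)."""
    # Step 1: amplify the weak signal so it is large enough to display
    amplified = [s * gain for s in signal_uv]
    # Step 2: moving average to suppress high-frequency interference
    half = window // 2
    filtered = []
    for i in range(len(amplified)):
        lo, hi = max(0, i - half), min(len(amplified), i + half + 1)
        filtered.append(sum(amplified[lo:hi]) / (hi - lo))
    return filtered

# A 10 Hz alpha-like wave with 50 µV amplitude, sampled at 250 Hz
raw = [50e-6 * math.sin(2 * math.pi * 10 * t / 250) for t in range(250)]
display = amplify_and_filter(raw)
```

The smoothing slightly attenuates the waveform's amplitude; a real recording chain uses proper band-pass and notch filters instead of a moving average.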
Original Paper Reference: HiddenCut: Simple Data Augmentation for Natural Language Understanding with Better Generalizability (Chen et al., ACL 2021)
Source Code: GT-SALT/HiddenCut on GitHub
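The core HiddenCut operation can be sketched as below, assuming plain Python lists for the hidden states. This is a simplification: the actual method selects informative spans (e.g. guided by attention weights) and applies the mask to hidden representations inside Transformer layers during training.

```python
import random

def hidden_cut(hidden_states, cut_ratio=0.2, seed=None):
    """Zero out one contiguous span of token hidden states,
    a simplified sketch of HiddenCut's span masking."""
    rng = random.Random(seed)
    seq_len = len(hidden_states)
    cut_len = max(1, int(seq_len * cut_ratio))
    start = rng.randrange(0, seq_len - cut_len + 1)
    zero = [0.0] * len(hidden_states[0])
    return [zero if start <= i < start + cut_len else h
            for i, h in enumerate(hidden_states)]

# 6 tokens, hidden size 4: exactly one span gets masked
h = [[1.0] * 4 for _ in range(6)]
out = hidden_cut(h, cut_ratio=0.3, seed=0)
```

Unlike dropout, which zeroes dimensions independently at random, masking a contiguous span forces the model not to rely on any single local region of the representation.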
Original Paper Reference: MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification (Chen et al., ACL 2020)
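TMix, the hidden-space interpolation at the heart of MixText, can be sketched in a few lines. This is a plain-Python stand-in: the real method interpolates at a randomly chosen Transformer layer and mixes the corresponding label vectors with the same λ.

```python
def tmix(h1, h2, lam=0.7):
    """Linearly interpolate two hidden-state vectors, as in
    MixText's TMix: h = lam * h1 + (1 - lam) * h2.
    The label vectors are mixed with the same lam."""
    return [lam * a + (1 - lam) * b for a, b in zip(h1, h2)]

# Interpolating two one-hot label vectors gives a soft label
mixed = tmix([1.0, 0.0], [0.0, 1.0], lam=0.7)  # approximately [0.7, 0.3]
```

Interpolating in hidden space rather than raw text sidesteps the problem that a convex combination of two discrete sentences is not itself a valid sentence.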
Original Paper Reference: An Empirical Survey of Data Augmentation for Limited Data Learning in NLP
How this survey differs from earlier NLP data-augmentation surveys is stated clearly in the abstract:
we provide an empirical survey of recent progress on data augmentation for NLP in the limited labeled data setting, summarizing the landscape of methods (including token-level augmentations, sentence-level augmentations, adversarial augmentations and hidden-space augmentations) and carrying out experiments on 11 datasets covering topics/news classification, inference tasks, paraphrasing tasks, and single-sentence tasks.
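Two of the token-level operations the survey covers can be sketched in the style of EDA-like augmentation: random deletion and random swap. Whitespace tokenization and the probability values here are illustrative assumptions.

```python
import random

def random_deletion(tokens, p=0.1, seed=None):
    """Token-level augmentation: drop each token with probability p;
    never return an empty sentence."""
    rng = random.Random(seed)
    kept = [t for t in tokens if rng.random() > p]
    return kept or [rng.choice(tokens)]

def random_swap(tokens, n=1, seed=None):
    """Token-level augmentation: swap n random pairs of tokens."""
    rng = random.Random(seed)
    out = list(tokens)
    for _ in range(n):
        i, j = rng.randrange(len(out)), rng.randrange(len(out))
        out[i], out[j] = out[j], out[i]
    return out

sent = "data augmentation helps limited data learning".split()
aug = random_swap(random_deletion(sent, p=0.2, seed=1), n=1, seed=2)
```

Sentence-level augmentations (e.g. back-translation), adversarial augmentations, and hidden-space augmentations, the other families the abstract lists, operate on whole sentences or on model representations rather than on individual tokens like this.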
Original Paper Reference: A Fourier Perspective on Model Robustness in Computer Vision (arXiv:1906.08988), NeurIPS 2019
This paper analyzes model robustness from a frequency-domain perspective, offering a new lens for examining robustness.
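The paper's frequency-domain probing can be illustrated in 1-D: perturb an input along a single normalized Fourier basis direction and compare how the model behaves at low versus high frequencies. The paper itself uses 2-D Fourier basis images on image inputs; this pure-Python 1-D version only shows the idea, and the signal and ε value are assumptions.

```python
import math

def fourier_perturb(signal, freq, eps=0.1):
    """Add an L2-normalized single-frequency cosine perturbation of
    size eps to a 1-D signal. Sweeping freq and measuring the model's
    error rate reveals whether it relies on low- or high-frequency
    information (the core probe in the Fourier-perspective paper)."""
    n = len(signal)
    wave = [math.cos(2 * math.pi * freq * i / n) for i in range(n)]
    norm = math.sqrt(sum(w * w for w in wave))
    return [s + eps * w / norm for s, w in zip(signal, wave)]

x = [0.5] * 64
x_low = fourier_perturb(x, freq=1)    # low-frequency perturbation
x_high = fourier_perturb(x, freq=30)  # high-frequency perturbation
```

Because every perturbation has the same L2 norm, differences in model error across frequencies reflect the model's frequency sensitivity rather than perturbation size.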
Original Paper Reference: Text AutoAugment: Learning Compositional Augmentation Policy for Text Classification (Ren et al., EMNLP 2021)
This paper clearly draws on the idea of the vision-domain RandAugment paper, searching for augmentation policies tailored to a specific dataset and task.
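For contrast, the RandAugment idea it builds on can be sketched for text: apply N operations sampled uniformly from a small pool, all sharing one magnitude M. Text AutoAugment instead learns the composition and per-operation magnitudes for each dataset; the two toy operations in the pool here are illustrative assumptions, not the paper's actual operation set.

```python
import random

def delete_tokens(tokens, m, rng):
    """Drop each token with probability m; never return empty."""
    kept = [t for t in tokens if rng.random() > m]
    return kept or list(tokens)

def swap_tokens(tokens, m, rng):
    """Perform about m * len adjacent swaps."""
    out = list(tokens)
    if len(out) < 2:
        return out
    for _ in range(max(1, int(m * len(out)))):
        i = rng.randrange(len(out) - 1)
        out[i], out[i + 1] = out[i + 1], out[i]
    return out

OPS = [delete_tokens, swap_tokens]

def rand_augment_text(tokens, n=2, m=0.2, seed=None):
    """RandAugment-style policy: n randomly chosen operations,
    one shared magnitude m -- the fixed-random baseline that
    Text AutoAugment replaces with a learned policy."""
    rng = random.Random(seed)
    for op in rng.choices(OPS, k=n):
        tokens = op(tokens, m, rng)
    return tokens

sent = "a simple example sentence for augmentation".split()
aug = rand_augment_text(sent, seed=0)
```

The appeal of the RandAugment formulation is its tiny search space (just N and M), versus searching over full operation compositions.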
Original Paper Reference: Improving Robustness Without Sacrificing Accuracy with Patch Gaussian Augmentation
This paper offers another possible approach to data augmentation: combining Gaussian noise with a patch so as to improve model robustness without sacrificing accuracy.
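A sketch of the Patch Gaussian idea, assuming a grayscale image as a nested list with pixel values in [0, 1]: Gaussian noise is added only inside one randomly placed square patch, which interpolates between Cutout (corrupting a patch) and full-image Gaussian augmentation. The patch size and σ here are assumed values.

```python
import random

def patch_gaussian(image, patch=8, sigma=0.5, seed=None):
    """Add Gaussian noise only inside one randomly centered
    patch x patch square, clipping pixels back to [0, 1]."""
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    cy, cx = rng.randrange(h), rng.randrange(w)
    half = patch // 2
    out = [row[:] for row in image]
    for y in range(max(0, cy - half), min(h, cy + half)):
        for x in range(max(0, cx - half), min(w, cx + half)):
            # clip noisy pixels to the valid intensity range
            out[y][x] = min(1.0, max(0.0, out[y][x] + rng.gauss(0, sigma)))
    return out

img = [[0.5] * 16 for _ in range(16)]
noisy = patch_gaussian(img, patch=8, sigma=0.8, seed=0)
```

Taking patch size to the full image recovers plain Gaussian augmentation, and taking σ very large makes the patch pure noise like Cutout, which is why the method can trade off between the robustness of one and the accuracy of the other.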
The paper's two paragraphs summarizing research on deep-learning robustness are worth reading: