CrossRef Text and Data Mining
The CrossRef Text and Data Mining search returns articles related to the entitled article. Clicking link1 or link2 takes you to the full-text site of the selected article, although some links do not display the full text immediately. Clicking the CrossRef Text and Data Mining download icon retrieves the whole list of articles from the literature included in CrossRef Text and Data Mining.
Protected Health Information Recognition by Fine-Tuning a Pre-training Transformer Model |
Seo Hyun Oh, Min Kang, Youngho Lee |
Healthc Inform Res. 2022;28(1):16-24. Published online January 31, 2022. DOI: https://doi.org/10.4258/hir.2022.28.1.16
- Investigation of Improving the Pre-Training and Fine-Tuning of BERT Model for Biomedical Relation Extraction
- Adversarial Robustness: From Self-Supervised Pre-Training to Fine-Tuning
- Knowledge Distillation from BERT in Pre-Training and Fine-Tuning for Polyphone Disambiguation
- A multilingual offensive language detection method based on transfer learning from transformer fine-tuning model
- Robust Face Tracking Using Siamese-VGG with Pre-training and Fine-tuning
- Output Layer Go First: Better Fine-tuning by Bridging the Gap with Pre-training
- Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning
- Fine-Tuning of Pre-Trained End-to-End Speech Recognition with Generative Adversarial Networks. ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2021;