CrossRef Text and Data Mining
The CrossRef Text and Data Mining search returns articles related to the entitled article. Clicking link1 or link2 takes you to the full-text site of the selected article; however, some links may not show the full text immediately. Clicking the CrossRef Text and Data Mining Download icon retrieves the whole list of articles from the literature included in CrossRef Text and Data Mining.
Protected Health Information Recognition by Fine-Tuning a Pre-training Transformer Model |
Seo Hyun Oh, Min Kang, Youngho Lee |
Healthc Inform Res. 2022;28(1):16-24. Published online January 31, 2022. DOI: https://doi.org/10.4258/hir.2022.28.1.16
Related articles returned by the CrossRef Text and Data Mining search:

- Protected Health Information Recognition by Fine-Tuning a Pre-training Transformer Model
- Investigation of Improving the Pre-Training and Fine-Tuning of BERT Model for Biomedical Relation Extraction
- Adversarial Robustness: From Self-Supervised Pre-Training to Fine-Tuning
- Improving Pre-Training and Fine-Tuning for Few-Shot SAR Automatic Target Recognition
- Transformer Model Fine-Tuning for Indonesian Automated Essay Scoring with Semantic Textual Similarity. 2022 5th International Seminar on Research of Information Technology and Intelligent Systems (ISRITI); 2022.
- Knowledge Distillation from BERT in Pre-Training and Fine-Tuning for Polyphone Disambiguation
- Fine-Tuning Pre-trained Vision Transformer Model for Anomaly Detection in Video Sequences
- Robust Face Tracking Using Siamese-VGG with Pre-training and Fine-tuning
- Output Layer Go First: Better Fine-tuning by Bridging the Gap with Pre-training