Healthc Inform Res > Volume 29(4); 2023 > Article
Kyyhkynen, Peltonen, and Smed: Videoconferencing Applications for Training Professionals on Nonverbal Communication in Online Clinical Consultations

Abstract

Objectives

The use of videoconferencing technologies for clinician-patient online consultations has become increasingly popular. Training on online communication competence through a videoconferencing application that integrates nonverbal communication detection with feedback is one way to prepare future clinicians to conduct effective online consultations. This case report describes and evaluates two such applications designed for healthcare professionals and students in healthcare-related fields.

Methods

We conducted a literature review of six databases (Web of Science, Scopus, PubMed, ACM, IEEE, and CINAHL) in the spring of 2022.

Results

We identified seven studies on two applications, ReflectLive and EQClinic. These studies were conducted by two research groups from the USA and Australia and were published between 2016 and 2020. Both detected nonverbal communication from video and audio and provided computer-generated feedback on users’ nonverbal communication. The studies evaluated usability, effectiveness in learning communication skills, and changes in the users’ awareness of their nonverbal communication. The developed applications were deemed feasible. However, the feedback given by the applications needs improvement to be more beneficial to the user. The applications were primarily evaluated with medical students, with limited or no attention given to questions regarding ethics, information security, privacy, sustainability, and costs.

Conclusions

Current research on videoconferencing systems for training online consultation skills is very limited. Future research is needed to develop more user-centered solutions, focusing on a multidisciplinary group of students and professionals, and to explore the implications of these technologies from a broader perspective, including ethics, information security, privacy, sustainability, and costs.

I. Introduction

Videoconferencing has grown in popularity in recent years due to improved network speed and advancements in videoconferencing technologies. This trend was accelerated further by the coronavirus disease 2019 (COVID-19) pandemic internationally [1]. Telehealth refers to the delivery of a wide range of remote health services using technology, such as conducting online clinical consultations. Telehealth has been shown to improve access to healthcare [2]. Despite the potential benefits of telehealth, there has been a limited focus on preparing clinicians for online consultations [3,4].
A clinician’s compassionate nonverbal communication increases patient satisfaction related to the care provided [5]. Guidelines for clinicians’ online consultations have been developed to maximize the empathy and rapport experienced by patients [6]. However, learning to be compassionate in an online consultation situation is difficult and time-consuming, as it requires monitoring and assessing the communication to address potential areas for improvement.
Recent advances in machine learning have made it possible to detect nonverbal communication during videoconferences. This technology can be used to provide automatic computer-generated feedback in online consultations and potentially improve the communication skills of students and healthcare professionals [7–11].
The aim of this case report is to describe and evaluate two videoconferencing applications, namely ReflectLive and EQClinic, identified through a literature review. These applications, which are designed for healthcare students and professionals, detect nonverbal communication and provide feedback on it during simulated online consultations with patients.

II. Methods

We conducted a literature review by searching six databases (Web of Science, Scopus, PubMed, ACM, IEEE, and CINAHL) for peer-reviewed articles related to videoconferencing applications with nonverbal communication detection and a feedback option developed for healthcare professionals or students pursuing a healthcare-related degree, such as a medical or nursing degree. We employed the PICo (population, phenomenon of interest, and context) strategy to identify relevant search terms. The search was conducted in the spring of 2022, using the title, abstract, and keyword fields. We also reviewed the references of the included articles.

III. Results

We identified seven peer-reviewed articles reporting two applications: ReflectLive [11] and EQClinic [7–10,12,13]. We describe these systems' basic characteristics, the technology used for detection, the users and settings of implementation, the form of feedback provided by each system, and the outcomes of effectiveness evaluations. The summarized results are presented in Table 1.
  • ReflectLive is a videoconferencing application that detects clinicians’ nonverbal communication during online consultations with patients. It provides real-time formative feedback and summative feedback after the clinical consultation [11].

  • EQClinic is a videoconferencing platform that detects nonverbal communication and provides feedback on a student’s nonverbal communication during a clinical consultation session with a simulated patient [7–10,12,13]. EQClinic has also been tested to identify nonverbal mimicking behavior between students and the simulated patient [12,13].

1. Basic Characteristics of Studies Included in the Evaluation

The study reporting the ReflectLive application was conducted in the United States in 2017 by a group of experts in computer science and human-computer interaction [11]. The six EQClinic studies, conducted in Australia between 2016 and 2020 [7–10,12,13], involved multidisciplinary teams of experts in computer science [7–10,12,13], human-computer interaction [7–10,12,13], medicine [7–10,13], education [7,9,10,13], and psychology [13].

2. Nonverbal Communication Detection

ReflectLive and EQClinic both use video to detect nonverbal communication, but they differ in what is detected from the video image. EQClinic detects facial expressions and head and body movements, while ReflectLive focuses on how the user is positioned in the image, such as centeredness on the screen and gaze. Both systems also analyze audio to detect conversational aspects of speech, including turn-taking, overlaps, and speaking contributions, and EQClinic additionally detects users’ volume and pitch variations [7–10]. EQClinic relies exclusively on video to detect mimicry between users [12,13].
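
To make the video-based detection concrete, ReflectLive's centeredness check can be approximated as testing whether a detected face's bounding-box center stays near the frame center. The sketch below is a minimal illustration under that assumption; the bounding box would come from a face detector, and the function name and tolerance value are hypothetical, not the published implementation:

```python
def is_centered(face_box, frame_w, frame_h, tolerance=0.15):
    """Return True if the face bounding box's center lies within
    `tolerance` (as a fraction of frame size) of the frame center.

    face_box is (x, y, w, h) in pixels, as a face detector might report.
    """
    x, y, w, h = face_box
    cx, cy = x + w / 2, y + h / 2          # center of the detected face
    dx = abs(cx - frame_w / 2) / frame_w   # horizontal offset, normalized
    dy = abs(cy - frame_h / 2) / frame_h   # vertical offset, normalized
    return dx <= tolerance and dy <= tolerance
```

A real-time system would evaluate this per frame and raise an alert only after the user stays off-center for some duration, to avoid the false-alert problem discussed below.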

3. Users and Settings

ReflectLive was tested with medical doctors and a psychologist, and EQClinic was tested with medical students. The context of use is similar for both applications: both are designed for practicing nonverbal communication during a videoconference with a simulated patient in an online clinical setting and for increasing awareness of the importance of nonverbal communication during clinical consultations.

4. Feedback Provided by Systems

The applications provide computer-generated feedback in two ways: real-time formative feedback and summative feedback. Real-time formative feedback is given during an ongoing consultation. In ReflectLive, the real-time feedback alerts the user if they are not centered on the screen or if they have interrupted the patient [11]. The alert is visual; for example, the user's thumbnail is highlighted in red. Challenges with real-time formative feedback have been reported: it can be distracting, especially when it gives false alerts [11].
Summative feedback is a set of visualized information that shows the user how they used nonverbal communication during the session. In EQClinic, the summative feedback includes both computer-generated feedback and commentary from the simulated patient during the session [7–10,12,13]. The computer-generated feedback includes simple graphs showing factors such as the user’s speaking ratio or smile intensity over specific periods. Simulated patients can provide feedback during the session by using thumbs-up or thumbs-down icons and brief comments. ReflectLive summarizes the real-time feedback to produce its summative feedback [11].
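
As an illustration of how such summative metrics could be derived (a simplified sketch, not the published pipeline): given per-speaker speech segments, such as might come from voice-activity detection, the speaking ratio and a crude interruption count follow directly. The segment representation and the interruption definition below are assumptions made for this sketch:

```python
def summative_feedback(clinician, patient):
    """Compute a speaking ratio and interruption count from per-speaker
    speech segments, each given as a list of (start, end) times in seconds.

    An 'interruption' is counted when the clinician starts speaking while
    a patient segment is still in progress (a simplifying assumption).
    """
    def talk_time(segments):
        return sum(end - start for start, end in segments)

    total = talk_time(clinician) + talk_time(patient)
    ratio = talk_time(clinician) / total if total else 0.0
    interruptions = sum(
        1 for cs, ce in clinician
        if any(ps < cs < pe for ps, pe in patient)
    )
    return {"speaking_ratio": round(ratio, 2), "interruptions": interruptions}
```

For example, if the clinician speaks during 0–10 s and 18–22 s while the patient speaks during 10–20 s, the clinician's speaking ratio is 14/24 and the 18 s start counts as one interruption.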

5. Evaluation

We collected information on the feasibility and usefulness of the applications, such as how well the nonverbal communication detection functions, whether users improved their communication skills through using the system, and what users thought about the feedback provided. The ReflectLive study used semi-structured interviews to assess the application's feasibility and usability for online clinical consultations and its impact on users’ nonverbal communication during and after consultations [11]. The authors concluded that the application can be useful for monitoring and providing feedback on nonverbal communication in online clinical settings.
EQClinic has been evaluated regarding its usability, students’ learning, and the accuracy of the nonverbal communication detection [7–10]. These outcomes were based on usability scores from surveys [8–10] and an adapted version of the Student-Patient Observed Communication Assessment instrument [7–9,12,13]. The students’ awareness of their nonverbal communication was compared using pre- and posttest surveys [7–10]. Mimicry detection has been evaluated for its potential to improve students’ communication skills [12,13]. Ethical and privacy concerns were raised only in the ReflectLive study, which reported on clinicians’ acceptance of an application that tracks their nonverbal communication [11]. Some users expressed concern about potential misuse, while others anticipated using it in every future videoconference. Information security, sustainability, and costs were not addressed in any of the articles reviewed.

IV. Discussion

We identified and evaluated two videoconferencing applications—namely, ReflectLive and EQClinic—that have been designed to detect nonverbal communication and give feedback on it to the user during and after a simulated online consultation. To our knowledge, there are no other reviews of these aspects. The main takeaway from this case report is that the solutions presented are deemed feasible and users may benefit from them by learning to pay more attention to their nonverbal communication. However, studies evaluating the performance and impact of these applications are scarce, and those that exist are mainly descriptive.
Evaluating nonverbal communication in online consultations is a complicated task. For example, what do gaze and head-orientation detection convey to the user? During clinical encounters, it is sometimes necessary to look away from the screen, such as when taking notes. Furthermore, as pointed out by Liu et al. [10], the systems developed can only identify nonverbal communication, not evaluate the behavior. The applications provide a descriptive summary of the user’s communication, but not the feedback users might want, such as how to improve their communication. These are examples of questions that need to be explored in more depth in the future.
Ethical and privacy issues were only briefly discussed by Faucett et al. [11], and none of the articles addressed information security, sustainability, or economic issues. These issues warrant much more attention both in future research and in the implementation of safe and sustainable videoconferencing technologies for learning nonverbal communication.
A limitation of this study is that our inclusion criteria for peer-reviewed research may have resulted in the exclusion of some relevant literature. Hence, future studies should consider broader search strategies and literature beyond peer-reviewed articles.
Universities have recognized the importance of educating students about digital health services [14]. This has spurred interest in developing innovative solutions that improve the videoconferencing experience with patients and offer users valuable information about their communication during online clinical consultations. Systems for detecting and providing feedback on the nonverbal communication of students and professionals during online consultations are still in their infancy. The idea is promising, but creating a practical application design requires collaboration among stakeholders from diverse disciplines. Relying solely on medical doctors or students as test users may limit the perspective, given the variety of professionals who use online consultations. Our review covered only reports on the two studied technologies, leaving open the question of whether others have been scientifically evaluated.

Notes

Conflict of Interest

No potential conflict of interest relevant to this article was reported.

Table 1
Characteristics of ReflectLive and EQClinic and findings of the included studies
Faucett et al. [11] (ReflectLive, USA)
Detection: video (centeredness on screen, gaze); audio (speaking contribution, conversational overlaps)
Users: 9 medical doctors, 1 psychologist
Primary data: semi-structured interviews to understand feasibility and usability; collected NVC analytics
Feedback: real-time (speaking contribution, interruption count, whether the user watched the screen); summative (speaking contribution, interruptions)
Evaluation: the application can inform clinicians of their NVC during clinician-patient communication and give feedback for the clinician to change or become aware of their NVC

Wu et al. [12] (EQClinic, Australia)
Detection: NVC mimicry; video (smiling, frowning, nodding, head shaking)
Users: 91 second-year medical students, 2 SPs
Primary data: SOCA assessments completed by the SPs, compared with NVC mimicry analytics
Feedback: automatic NVC summative feedback and SOCA scores from SPs
Evaluation: medical students can use the application to improve clinician-patient communication in an online clinical setting

Wu et al. [13] (EQClinic, Australia)
Detection: NVC mimicry; video (smiling, frowning, nodding, head shaking)
Users: 130 second-year medical students, 29 SPs
Primary data: SOCA assessments completed by the SPs, compared with NVC mimicry analytics
Feedback: automatic NVC summative feedback and SOCA assessments from SPs
Evaluation: provided evidence that NVC mimicry detection can impact communication outcomes in telehealth; may be helpful for those involved in communication training and the development of digital educational platforms

Liu et al. [10] (EQClinic, Australia)
Detection: video (nodding, head shaking and tilting, smiling, frowning, body leaning, face-touch gestures); audio (volume, pitch, turn-taking, speaking ratio)
Users: 8 medical students in years 1–4, 3 SPs
Primary data: pre- and post-consultation surveys measuring differences in student confidence and learning aspects such as NVC awareness; usability scores
Feedback: automatic NVC summative feedback and SP comments
Evaluation: the system is feasible, stable, and usable; it has the potential to teach NVC skills to medical students, help automatically identify students’ NVC, and make it easier to organize meetings with SPs

Liu et al. [9] (EQClinic, Australia)
Detection: video (nodding, head shaking and tilting, smiling, frowning, body leaning); audio (volume, pitch, turn-taking, speaking ratio)
Users: 135 second-year medical students, 83 SPs, 28 tutors
Primary data: SOCA assessment by the SP after each consultation; student awareness of NVC before and after clinical consultations, measured by pre- and post-use assessments; usability scores
Feedback: automatic NVC summative feedback plus SOCA assessments and comments from SPs
Evaluation: medical students can use the system to become more aware of their NVC during consultations, thereby improving their communication skills

Liu et al. [8] (EQClinic, Australia)
Detection: video (nodding, head shaking and tilting, smiling, frowning, body leaning, face-touch gestures); audio (volume, pitch, turn-taking, speaking ratio)
Users: 18 medical students in years 1–4, 3 SPs
Primary data: comparison of pre- and post-consultation surveys indicating student learning; usability scores
Feedback: automatic NVC summative feedback plus SOCA assessments and comments from SPs
Evaluation: medical students can use the system to train and learn NVC; others who consider NVC important can also use it to learn these aspects of communication

Liu et al. [7] (EQClinic, Australia)
Detection: video (nodding, head shaking and tilting, smiling, frowning, body leaning, hand gestures); audio (volume, pitch, turn-taking, speaking ratio)
Users: 268 second-year medical students
Primary data: SOCA scores collected from tutors observing F2F consultations and from SPs in EQClinic; tutor-rated SOCA scores from two F2F consultations were compared
Feedback: automatic NVC summative feedback plus SOCA assessments by SPs (in EQClinic) and tutors (in F2F consultations)
Evaluation: medical students can use the system to train and learn NVC in a clinical setting with SPs

NVC: nonverbal communication, SP: simulated patient, SOCA: Student-Patient Observed Communication Assessment, F2F: face-to-face.

References

1. Kissi J, Annobil C, Mensah NK, Owusu-Marfo J, Osei E, Asmah ZW. Telehealth services for global emergencies: implications for COVID-19: a scoping review based on current evidence. BMC Health Serv Res 2023;23(1):567. https://doi.org/10.1186/s12913-023-09584-4
2. Gajarawala SN, Pelkowski JN. Telehealth benefits and barriers. J Nurse Pract 2021;17(2):218-21. https://doi.org/10.1016/j.nurpra.2020.09.013
3. Budakoglu Iİ, Sayilir MU, Kiyak YS, Coskun O, Kula S. Telemedicine curriculum in undergraduate medical education: a systematic search and review. Health Technol (Berl) 2021;11(4):773-81. https://doi.org/10.1007/s12553-021-00559-1
4. Edirippulige S, Armfield NR. Education and training to support the use of clinical telehealth: a review of the literature. J Telemed Telecare 2017;23(2):273-82. https://doi.org/10.1177/1357633X16632968
5. Henry SG, Fuhrel-Forbis A, Rogers MA, Eggly S. Association between nonverbal communication during clinical interactions and outcomes: a systematic review and meta-analysis. Patient Educ Couns 2012;86(3):297-315. https://doi.org/10.1016/j.pec.2011.07.006
6. Gustin T. Telemedicine etiquette. In: Atanda A, Lovejoy JF, editors. Telemedicine in Orthopedic Surgery and Sports Medicine. Cham, Switzerland: Springer; 2021. p. 65-80. https://doi.org/10.1007/978-3-030-53879-8_6
7. Liu C, Lim RL, McCabe KL, Taylor S, Calvo RA. A web-based telehealth training platform incorporating automated nonverbal behavior feedback for teaching communication skills to medical students: a randomized crossover study. J Med Internet Res 2016;18(9):e246. https://doi.org/10.2196/jmir.6299
8. Liu C, Calvo RA, Lim R. Improving medical students’ awareness of their non-verbal communication through automated non-verbal behavior feedback. Front ICT 2016;3:11. https://doi.org/10.3389/fict.2016.00011
9. Liu C, Calvo RA, Lim R, Taylor S. EQClinic: a platform for improving medical students’ clinical communication skills. In: Yin X, Geller J, Li Y, Zhou R, Wang H, Zhang Y, editors. Health Information Science. Cham, Switzerland: Springer; 2016. p. 73-84. https://doi.org/10.1007/978-3-319-48335-1_8
10. Liu C, Scott KM, Lim RL, Taylor S, Calvo RA. EQClinic: a platform for learning communication skills in clinical consultations. Med Educ Online 2016;21:31801. https://doi.org/10.3402/meo.v21.31801
11. Faucett HA, Lee ML, Carter S. I should listen more: real-time sensing and feedback of non-verbal communication in video telehealth. Proc ACM Hum Comput Interact 2017;1(CSCW):Article 44. https://doi.org/10.1145/3134679
12. Wu K, Liu C, Taylor S, Atkins PW, Calvo RA. Automatic mimicry detection in medical consultations. Proceedings of the 2017 IEEE Life Sciences Conference (LSC); 2017 Dec 13-15; Sydney, Australia. p. 55-8. https://doi.org/10.1109/LSC.2017.8268142
13. Wu K, Liu C, Calvo RA. Automatic nonverbal mimicry detection and analysis in medical video consultations. Int J Hum Comput Interact 2020;36(14):1379-92. https://doi.org/10.1080/10447318.2020.1752474
14. Jonas CE, Durning SJ, Zebrowski C, Cimino F. An interdisciplinary, multi-institution telehealth course for third-year medical students. Acad Med 2019;94(6):833-7. https://doi.org/10.1097/ACM.0000000000002701
Copyright © 2024 by Korean Society of Medical Informatics.
