Emotion detection in real-time
VERN Health analyzes the emotions in communications from your patients, detecting discrete emotional cues in real time. That means VERN Health can serve as an analytical tool that helps you reach a diagnosis faster. Our voice, word, and facial expression analysis all works in real time.
Here are some frequently asked questions:
How fast is VERN Health?
We're averaging a response time of about 0.06 seconds, which is very fast. Adding voice and other analysis to VERN Health may slow the analysis down somewhat, but it should still be fast enough to deliver results in real time. The biggest limitation on VERN Health's speed is the quality of the network your VERN Health instance runs on.
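As a rough illustration, the sketch below times a single request so you can see what your own network adds on top of that baseline. The endpoint URL, authorization header, and payload fields are placeholders rather than the documented VERN Health API; swap in the values for your own instance.

```python
# Minimal round-trip timing check against a VERN Health-style analysis endpoint.
# NOTE: the URL, header, and payload fields are placeholders for illustration,
# not the documented VERN Health API -- use your instance's actual values.
import time

import requests

ENDPOINT = "https://example-vern-instance/api/analyze"  # placeholder URL
API_KEY = "YOUR_API_KEY"                                # placeholder credential


def time_analysis(text: str) -> float:
    """Send one utterance for analysis and return the round-trip time in seconds."""
    start = time.perf_counter()
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=5,
    )
    response.raise_for_status()
    return time.perf_counter() - start


if __name__ == "__main__":
    elapsed = time_analysis("I'm feeling much better since we changed the medication.")
    # Anything well above the ~0.06 s average is likely network overhead.
    print(f"Round trip took {elapsed:.3f} seconds")
```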
Can I use the real-time features during an exam?
You can use the real-time features in an exam as long as you have an interface running between the patient and the provider. The provider will have to use the camera on the device for any visual analysis, and keep the microphone free from obstruction to properly capture the audio data.
We are working on more tools for more applications, so if you have a request for a product, please get a hold of us here.
VERN Health analyzes the emotions in the conversation in real time and displays on screen the emotions detected, along with a percentage indicating confidence/strength. As someone speaks, the on-screen analysis updates as you move through the conversation.
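As a sketch of how that per-utterance output could be consumed, the snippet below assumes each analysis result is a list of emotions with confidence scores (a hypothetical response shape; the exact fields depend on your VERN Health configuration) and prints the strongest signals as the conversation moves along.

```python
# Sketch: display per-utterance emotions and confidence as a conversation progresses.
# The response shape ({"emotions": [{"name": ..., "confidence": ...}]}) and the
# sample values are assumptions for illustration; your instance may differ.
from typing import Dict, List


def format_utterance(speaker: str, analysis: Dict) -> str:
    """Return a one-line display: speaker plus the top emotions by confidence."""
    emotions: List[Dict] = sorted(
        analysis.get("emotions", []),
        key=lambda e: e["confidence"],
        reverse=True,
    )
    top = ", ".join(f"{e['name']} {e['confidence']:.0%}" for e in emotions[:3])
    return f"{speaker}: {top or 'no strong signal'}"


# Two utterances scored during a visit (made-up values, for illustration only).
conversation = [
    ("Patient", {"emotions": [{"name": "sadness", "confidence": 0.72},
                              {"name": "fear", "confidence": 0.41}]}),
    ("Provider", {"emotions": [{"name": "affection", "confidence": 0.55}]}),
]

for speaker, analysis in conversation:
    print(format_utterance(speaker, analysis))
```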
At the end of each interaction, you'll receive a transcribed report scored by VERN AI and by any additional partner application analysis you have chosen: an easy-to-read, easy-to-use insight into the patient interaction for filing in your EHR or PM software of choice.
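The report itself comes from VERN Health, but as a sketch of how per-utterance scores and the transcript might be rolled into one record on your side before filing, here is one possible approach (the field names and report structure are assumptions, not the product's actual report format).

```python
# Sketch: combine per-utterance scores and text into one end-of-visit record.
# Field names and the report structure are assumptions for illustration only;
# VERN Health's actual transcribed report may be formatted differently.
import json
from collections import defaultdict
from statistics import mean

# Made-up per-utterance results standing in for the real-time analysis output.
utterances = [
    {"speaker": "Patient", "text": "I haven't been sleeping well.",
     "emotions": {"sadness": 0.72, "fear": 0.41}},
    {"speaker": "Patient", "text": "The new routine is helping a little.",
     "emotions": {"sadness": 0.35}},
]


def build_report(utterances):
    """Average each emotion's confidence across the visit and keep the transcript."""
    scores = defaultdict(list)
    for u in utterances:
        for emotion, confidence in u["emotions"].items():
            scores[emotion].append(confidence)
    return {
        "transcript": [{"speaker": u["speaker"], "text": u["text"]} for u in utterances],
        "average_scores": {e: round(mean(v), 2) for e, v in scores.items()},
    }


# The resulting JSON could be attached or filed in your EHR or PM system.
print(json.dumps(build_report(utterances), indent=2))
```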