Our CLVR app can currently record audio and video interviews on smartphones. It needs to be enhanced to support emotion extraction from the video content. Emotions can be extracted using Microsoft Cognitive Services or IBM Watson services. This functionality has many applications, such as recruitment interviews, healthcare, and other fields.
The following tasks need to be completed in this research project:
1. The current Android app records the videos of an interview and stores them in Firebase. These videos need to be passed to Microsoft or Watson cognitive services to obtain the emotion-extraction results.
2. The current system generates a PDF of the emotion and personality statistics. This project will capture these statistics in Firebase and show them on a dashboard for clients.
3. Currently, the app processes the data returned by Watson on the phone. The architecture needs to be enhanced so that this processing happens outside the app, e.g. on a web server in the cloud.
4. If time permits, we can also develop our own API to map video to emotions, which would save the cost of API calls to Watson and Microsoft.
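Task 1 could be sketched as below. Note that the endpoint URL, API-key header, and JSON body here are illustrative assumptions only, not the actual Microsoft Cognitive Services or Watson API; the real request formats must be taken from the providers' official documentation.

```python
import json
import urllib.request

# Hypothetical endpoint and key, for illustration only.
EMOTION_ENDPOINT = "https://example-cognitive-service/v1/emotion"
API_KEY = "YOUR_API_KEY"

def build_emotion_request(video_url: str) -> urllib.request.Request:
    """Build a POST request that points the emotion service at an
    interview video stored in Firebase (passed as a downloadable URL)."""
    body = json.dumps({"url": video_url}).encode("utf-8")
    return urllib.request.Request(
        EMOTION_ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            # Microsoft-style subscription header; an assumption here.
            "Ocp-Apim-Subscription-Key": API_KEY,
        },
        method="POST",
    )

def extract_emotions(video_url: str) -> dict:
    """Send the request and return the parsed emotion results."""
    with urllib.request.urlopen(build_emotion_request(video_url)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Passing a download URL rather than uploading the video file keeps the app's network usage low: the cloud service fetches the video directly from Firebase storage.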
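For task 2, the per-frame emotion scores returned by the service need to be reduced to the summary statistics stored in Firebase and shown on the dashboard. A minimal sketch, assuming each frame result is a mapping from emotion name to confidence score (the real response shapes differ per provider):

```python
from collections import defaultdict

def summarise_emotions(frames: list[dict[str, float]]) -> dict[str, float]:
    """Average per-frame emotion confidences into one score per emotion,
    suitable for storing in Firebase and charting on the dashboard."""
    totals: dict[str, float] = defaultdict(float)
    counts: dict[str, int] = defaultdict(int)
    for frame in frames:
        for emotion, score in frame.items():
            totals[emotion] += score
            counts[emotion] += 1
    return {emotion: totals[emotion] / counts[emotion] for emotion in totals}
```

For example, `summarise_emotions([{"happiness": 0.8, "anger": 0.1}, {"happiness": 0.6, "anger": 0.3}])` averages the two frames into a single score per emotion. Running this aggregation on the server (task 3) rather than on the phone keeps the app thin.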
Our current app is part of the IBM entrepreneurship program, so we can provide expert support on technology use.
See project-related links:
Expected outcome: The end result will be an app that companies can use to view video interviews and see statistics about the interviewees' personality and emotions.
Lab allocations have not been finalised.