This project will develop a verbally interactive companion robot using an android-type robot, EveR4, a humanlike robot head system driven by more than 20 motors. We already have a working robot that recognizes human faces and voices and reacts by speaking, but the current system uses simple methods. Many new systems, such as intelligent speakers (e.g., Amazon Alexa), are now available, and we will compare them with existing systems to find better methods for ours. The missions are 1) to compare existing chatbots and intelligent speakers, 2) to identify the best methods for our system, 3) to design and develop a reaction system, and 4) to apply it to our robot.
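The perceive-and-react behaviour described above (detect a face or an utterance, then speak) can be sketched as a simple rule-based mapping. This is a minimal illustrative sketch only; the `Perception` type, `choose_reaction` function, and the rules themselves are assumptions for exposition, not part of the actual EveR4 software, and the real project would replace such rules with methods selected from the chatbot and intelligent-speaker comparison.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Perception:
    """Hypothetical snapshot of what the robot currently senses."""
    face_detected: bool
    utterance: Optional[str]  # recognized speech, if any

def choose_reaction(p: Perception) -> str:
    """Map a perception to a spoken reaction (rule-based placeholder)."""
    if p.utterance:
        # Speech takes priority over mere face detection.
        return f"I heard you say: {p.utterance}"
    if p.face_detected:
        return "Hello! Nice to see you."
    return ""  # no stimulus, stay silent

print(choose_reaction(Perception(face_detected=True, utterance=None)))
```

A fuller system would run this in a loop, feeding it outputs from the face-recognition and speech-recognition modules and sending the chosen reaction to the robot's speech and motor controllers.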
Deliverables: a working interactive robot system and its software, and a report on the comparison.
Lab allocations have not been finalised.