Human-Computer Interaction

Human-Computer Interaction (HCI) is concerned with the study, design, construction, and implementation of human-centric interactive computer systems. Work in the Rochester HCI group (ROC HCI) includes human nonverbal behavior analysis, social skills training, applied machine learning, educational technology, accessible computing, and ubiquitous computing. Visit the HCI website.


Zhen Bai
Assistant Professor of Computer Science

Interests: Human-Computer Interaction; Augmented Reality; Tangible User Interfaces; Embodied Conversational Agents; Technology-Enhanced Learning; Computer-Supported Cooperative Work; Assistive Technology

Zhen Bai is an Assistant Professor in the Department of Computer Science at the University of Rochester. Her research focuses on creating embodied and intelligent interfaces that enhance learning, communication, and wellbeing for people with diverse abilities and backgrounds. Her main research fields include human-computer interaction, augmented reality, tangible user interfaces, embodied conversational agents, technology-enhanced collaborative learning, and assistive technology. Her work is published in premier human-computer interaction and learning sciences conferences such as CHI, ISMAR, IDC, IVA, and AIED.

Zhen received her Ph.D. from the Graphics & Interaction Group at the Computer Laboratory, University of Cambridge, in 2015 and was a postdoctoral fellow in the Human-Computer Interaction Institute and the Language Technologies Institute at Carnegie Mellon University before joining the University of Rochester.


M. Ehsan Hoque
Associate Professor of Computer Science

Interests: Human-Computer Interaction; Artificial Intelligence; Interactive Machine Learning; Health and Wellbeing; Future of Skills

Ehsan Hoque is an associate professor in the Department of Computer Science at the University of Rochester. His research focuses on (1) designing and implementing new algorithms to sense subtle human nonverbal behavior, (2) enabling new behavior sensing for human-computer interaction, and (3) inventing new applications of emotion technology in high-impact social domains such as social skills training, public speaking, and assisting individuals who experience difficulties with social interactions.

Hoque received his Ph.D. in 2013 from the Massachusetts Institute of Technology. His Ph.D. thesis, "Computers to Help with Conversations: Affective Framework to Enhance Human Nonverbal Skills," was the first of its kind to demonstrate that people can learn and improve their social skills by interacting with an automated system; the thesis was showcased at the MIT Museum as one of the most unconventional inventions at MIT. Dr. Hoque has won a number of awards, including the IEEE Gold Humanitarian Fellowship, a Best Paper Award at Ubiquitous Computing (UbiComp), Best Paper nominations at Automated Face and Gesture Recognition (FG) and Intelligent Virtual Agents (IVA), and an NSF CRII (Pre-CAREER) Award.


Yukang Yan
Assistant Professor of Computer Science

Interests: Human-Computer Interaction; Virtual/Augmented Reality; Human Behavior Modeling

Yukang Yan is an Assistant Professor in the ROC HCI group in the Department of Computer Science at the University of Rochester. Before that, he was a postdoctoral researcher in the Augmented Perception Lab at the Human-Computer Interaction Institute at Carnegie Mellon University. He earned his Ph.D. and bachelor's degrees from Tsinghua University. His research lies in Human-Computer Interaction and Mixed Reality. He has published papers at ACM CHI, ACM UIST, ACM IMWUT, and IEEE VR, receiving Honorable Mention Awards at ACM CHI 2020 and ACM CHI 2023 and a Best Paper Nominee Award at IEEE VR. His thesis won the Outstanding Doctoral Thesis award at Tsinghua University.

His research focuses on understanding, predicting, and enhancing user behavior in Mixed Reality, following three connected threads: capturing users' behavioral and perceptual patterns with computational methods; developing input techniques and adaptive user interfaces to facilitate two-way human-computer communication; and exploring unique behavioral enhancements enabled by Mixed Reality.