With the technological advancements of the last year, artificial intelligence, or AI, has undoubtedly emerged as a key player in the future of technology. And it is progressing quickly. Today, devices equipped with AI can hear and see us and, in some cases, help us make decisions. But to make machines truly helpful to humans tomorrow, says Dr. Rana el Kaliouby, they also need to sense non-verbal behavior in real time.

With a Ph.D. in computer vision and machine learning from the University of Cambridge and postdoctoral research at the MIT Media Lab, Dr. el Kaliouby, inventor and CEO of the Emotion AI tech company Affectiva, develops technology that recognizes patterns in human emotions and reacts accordingly. Already used by 1,400 brands – including Amazon and Kellogg – to understand audience reactions to content, Emotion AI opens exciting new opportunities for AI, ranging from increasing safety in automobiles to personalizing learning experiences. In a recent interview with VentureBeat, the MIT scientist-turned-entrepreneur elaborated on how her work can be applied across a wide range of industries where AI has, until now, been largely ineffective.

Dr. el Kaliouby focuses on programming devices to detect and respond to emotional cues carried by our natural human signals, including facial expressions, speech patterns, gestures and tone of voice. “Facial expressions are the building blocks of different emotional states,” she explains in VentureBeat, “so we can read over different tiny facial expressions, then combine these in different ways to represent seven different emotional states plus age, gender, and ethnicity.”

Best known for developing AI that senses emotions in videos, Affectiva, under Dr. el Kaliouby’s leadership, is evolving the technology to also detect emotion from audio cues. When we communicate, facial expressions account for 55% of what we convey, how we say the words accounts for 35%, and the actual words for only 10%. Humans naturally combine all these channels to accurately read other people’s emotional and cognitive states. Dr. el Kaliouby’s team is taking a similar approach, building multi-modal Emotion AI that combines these different channels of information.

Technology that humanizes AI will be especially in demand as humanity comes to terms with the arrival of intelligent machines in our daily lives. With AI expected to create as many new jobs as it will destroy, business leaders, health care practitioners and education providers have an immense opportunity to be the first in their industries to capitalize on the technology. And as Emotion AI plays a progressively greater role in society, utilizing emotion-infused technologies will be ever more important – and lucrative.

Dr. el Kaliouby has bold predictions for AI in 2018. To see Emotion AI in action, watch how she teaches machines to feel.