Leah Ferentinos/Contributing Videographer Pictured is a 3-D model of a human face. A Binghamton University professor is attempting to develop software that can detect emotion in human facial expressions.

A professor at Binghamton University is working toward a universal algorithm that can create a digital reading of the human face and identify specific human emotions.

Lijun Yin, an associate professor of computer science, has been sponsored by the National Science Foundation, the Air Force Research Laboratory and the New York State Office of Science, Technology and Academic Research to develop and refine facial recognition technologies.

“We have developed several different things in the past few years regarding a camera’s ability to analyze the human face,” Yin explained. “Working with [BU associate professor of psychology] Peter Gerhardstein, we are trying to get the computer to distinguish between basic human emotions consistently and reliably.”

This technology could have applications in the future development of everything from hospital rooms to PowerPoint presentations.

According to Yin, the possible applications are widespread. He and his colleagues have tested uses ranging from cameras in a hospital room that gauge a patient's pain to operating a basic computer without a mouse or keyboard.

He said getting a computer to understand emotion is a complex challenge.

“Even humans have significant difficulty being able to visually identify the emotional state of their counterparts quickly and effectively,” Yin said. “Different people display emotions differently, and a computer can only understand a certain pattern.”

He added that while his laboratory results have been very encouraging, successfully transferring such technologies to the commercial market will be an enormous task.

“Many scientists are working on similar issues, [but] we are unique in our three-dimensional approach,” Yin said.

His laboratory uses six standard cameras and a suite of sophisticated software to analyze a human face.
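The article does not describe how Yin's software maps facial measurements to emotions, but the general idea of matching measured features against learned patterns can be illustrated with a toy sketch. Everything below is hypothetical: the feature names, template values, and nearest-template approach are illustrative assumptions, not details of the Binghamton system.

```python
import math

# Hypothetical facial-geometry features: (mouth_corner_lift, brow_raise, eye_openness).
# These templates are invented for illustration only, not taken from Yin's research.
EMOTION_TEMPLATES = {
    "happy":    (0.8, 0.2, 0.5),
    "surprise": (0.3, 0.9, 0.9),
    "sad":      (-0.6, -0.3, 0.4),
    "neutral":  (0.0, 0.0, 0.5),
}

def classify_emotion(features):
    """Return the emotion whose template is nearest (Euclidean) to the measurements."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EMOTION_TEMPLATES, key=lambda e: dist(features, EMOTION_TEMPLATES[e]))

# A face with raised mouth corners lands closest to the "happy" template.
print(classify_emotion((0.7, 0.1, 0.5)))  # prints "happy"
```

This nearest-template matching also hints at the limitation Yin describes: a face that expresses an emotion differently from the stored pattern will be matched to the wrong template.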

Jonathan McMahon, a freshman majoring in integrative neuroscience, discussed what he saw as the challenges of creating such a technology.

“I don’t think computers [will] be able to efficiently understand the human element,” he said. “There are so many differences between people that it seems impossible.”

Yin acknowledged the difficulties but remained confident in the future success of his project.

“There is still a long way to go; just because something happens in the lab does not mean it will suddenly appear in the commercial market. We have merely proved the concept. Microsoft was developing the Kinect camera for years before they formally introduced it. Furthermore, it only looks through a two-dimensional lens and has very basic capabilities,” he said, comparing it to his own three-dimensional technology.