Computers can Tell if You're Bored

Computers are able to read a person's body language to tell whether they are bored or interested in what they see on the screen, according to a new study led by body-language expert Dr Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School (BSMS).

The research shows that by measuring a person's movements as they use a computer, it is possible to judge their level of interest by monitoring whether they display the tiny movements that people usually constantly exhibit, known as non-instrumental movements.

If someone is absorbed in what they are watching or doing - what Dr Witchel calls 'rapt engagement' - there is a decrease in these involuntary movements.

Dr Witchel said: "Our study showed that when someone is really highly engaged in what they're doing, they suppress these tiny involuntary movements. It's the same as when a small child, who is normally constantly on the go, stares gape-mouthed at cartoons on the television without moving a muscle."

The discovery could have a significant impact on the development of artificial intelligence. Future applications could include the creation of online tutoring programmes that adapt to a person's level of interest, in order to re-engage them if they are showing signs of boredom. It could even help in the development of companion robots, which would be better able to estimate a person's state of mind.

Also, for experienced designers such as movie directors or game makers, this technology could provide complementary moment-by-moment reading of whether the events on the screen are interesting. While viewers can be asked subjectively what they liked or disliked, a non-verbal technology would be able to detect emotions or mental states that people either forget or prefer not to mention.

"Being able to 'read' a person's interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process," Dr Witchel said. "Further ahead it could help us create more empathetic companion robots, which may sound very 'sci fi' but are becoming a realistic possibility within our lifetimes."

In the study, 27 participants viewed a range of three-minute stimuli on a computer, from fascinating games to tedious readings from EU banking regulations, while using a handheld trackball to minimise instrumental movements such as moving a mouse. Their movements over each three-minute period were quantified using video motion tracking. In two comparable reading tasks, the more engaging reading produced a significant reduction, of 42%, in non-instrumental movement.
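The quantification step described above can be illustrated with a minimal sketch. This is not the authors' pipeline; it assumes greyscale video frames as arrays and uses mean absolute frame-to-frame difference as a crude proxy for how much a seated participant is moving (the `motion_energy` function and the synthetic "still" vs "fidgety" clips are hypothetical):

```python
import numpy as np

def motion_energy(frames):
    """Mean absolute frame-to-frame pixel difference across a clip.

    A crude proxy for non-instrumental movement: a participant who is
    sitting nearly still produces smaller frame-to-frame changes than
    one who is fidgeting.
    """
    frames = np.asarray(frames, dtype=np.float64)
    diffs = np.abs(np.diff(frames, axis=0))  # differences between consecutive frames
    return diffs.mean()

# Synthetic illustration: 90 frames of 48x64 greyscale video.
rng = np.random.default_rng(0)
base = np.full((90, 48, 64), 128.0)
still = base + rng.normal(0.0, 0.5, base.shape)    # near-motionless ("rapt") clip
fidgety = base + rng.normal(0.0, 5.0, base.shape)  # restless ("bored") clip

assert motion_energy(fidgety) > motion_energy(still)
```

In the reported study the more engaging reading task reduced non-instrumental movement by 42%; in terms of this sketch, that would correspond to a lower motion-energy score during the engaging condition.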

The study team also included two members of Dr Witchel's group, Carlos Santos and Dr James Ackah; media expert Carina Westling from the University of Sussex; and the clinical biomechanics group at Staffordshire University, led by Professor Nachiappan Chockalingam.

BSMS is a partnership between the Universities of Sussex and Brighton together with NHS organisations throughout the south-east region.
