Will this artificial intelligence system keep your kindergarten children safe?
Chinese team say their algorithm can analyse a live video stream and detect abnormal behaviour such as punching or slapping
A Chinese research team is preparing to roll out a prototype artificial intelligence system designed to catch acts of child abuse in kindergartens in real time.
The research has been under way for several years, but has attracted new interest following recent allegations of abuse at the RYB Education New World kindergarten in Beijing, part of an upmarket chain owned by a US-listed group.
Most kindergartens in China have installed surveillance cameras in their classrooms, but many are not actively monitored. Researchers say their AI algorithm could analyse a live video stream to track the movements of every pupil and teacher.
When abnormal behaviour – such as punching or slapping – is detected, the computer could send an alert to the kindergarten’s manager, government regulator or parents with a video clip for assessment.
The researchers are conducting extensive testing of the technology and expect to complete the prototype next year.
Rao Yan, an associate professor of computer science at Guizhou Minzu University in Guiyang and lead scientist on the project, said her team had encountered technological hurdles and funding shortages since the research began in 2013.
“This is one of those ‘minor areas’ in research which attract little money or attention,” Rao said.
While China has invested an enormous amount in artificial intelligence in recent years, most video analysis systems are currently used in national security, including defence and anti-terrorism activities.
But as a mother of a young child, Rao said she felt obliged to find a technical solution to the threat of child abuse.
“This is not just for my small child. We’re seeing this problem all over the country,” she said.
In China, many kindergarten teachers are young and underpaid, and some lack adequate training.
The RYB investigation is just one of a number of kindergarten abuse cases in recent years – some of which have been caught on surveillance camera – that have sparked safety concerns among parents and prompted the State Council to launch a nationwide inspection of preschools.
Rao and her colleagues began their research thinking it would be an easy job. There were many algorithms to detect violent, abnormal or “insane” behaviour. But they soon discovered the existing methods worked only for adults.
“The teacher is big and the student small. The computer easily lost track of the child when processing the video,” Rao said.
The children’s behaviour also caused a lot of confusion for the machine. They might jump up and down on chairs, chase each other with toy guns in hand, or sit in a corner for a long time playing with sand.
All of these activities are unusual for adults but normal for children – monitoring them using AI tools developed for grown-ups would constantly trigger the alarm.
So Rao developed a new algorithm for machine learning in a kindergarten setting. The team collected a large amount of surveillance footage from preschools and fed the data into a computer. At the start, the researchers had to tell the machine which behaviour was right and which was wrong, but later the computer learned to correctly assess the footage, with little human intervention.
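The two-stage process the team describes — humans labelling footage first, then the machine assessing it on its own — can be sketched in miniature. The following is a hypothetical illustration, not the team's actual algorithm: it assumes each video frame has already been reduced to a small motion-feature vector (say, limb speeds), and uses a simple nearest-centroid classifier as a stand-in for whatever model the researchers trained.

```python
# Minimal supervised-learning sketch: humans label examples, then the
# machine classifies new ones with no further intervention. Feature
# vectors are hypothetical stand-ins for per-frame motion features.

def centroid(rows):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def train(labelled):
    """labelled: list of (features, label). Returns one centroid per label."""
    by_label = {}
    for feats, label in labelled:
        by_label.setdefault(label, []).append(feats)
    return {label: centroid(rows) for label, rows in by_label.items()}

def classify(model, feats):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(feats, c))
    return min(model, key=lambda label: dist(model[label]))

# Stage 1: humans tell the machine which behaviour is right and which is wrong.
training_data = [
    ([0.2, 0.1], "normal"),    # slow, gentle motion
    ([0.3, 0.2], "normal"),
    ([2.5, 3.0], "abnormal"),  # fast striking motion
    ([2.8, 2.6], "abnormal"),
]
model = train(training_data)

# Stage 2: the machine assesses fresh frames on its own.
print(classify(model, [0.25, 0.15]))  # -> normal
print(classify(model, [2.6, 2.9]))    # -> abnormal
```

The point of the sketch is the division of labour: labelling is expensive and human, while classification afterwards is cheap and automatic — which is why the early labelling effort Rao describes eventually gives way to "little human intervention".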
The researchers are now testing the system on fresh video feeds from kindergartens, and they hope to give a public demonstration of the technology next year.
But Rao said the system still had much room for improvement.
“It is far from perfect. I hope more researchers can join this effort to accelerate the use of AI technology in the fight against child abuse,” she said.
Zhang Li, a professor at the department of electronic engineering at Tsinghua University, said his lab and other top AI laboratories in China could soon allocate resources to tackling the problem of child abuse.
Many airports, train stations and public areas in the country were already using the technology to detect abnormal behaviour in conjunction with methods such as facial recognition, according to Zhang.
But to make the technology work for kindergartens, researchers would need access to a large amount of data, and that could be a problem, he said.
“Some kindergartens might not be willing to share this because these videos may contain sensitive information. Some parents might also have privacy concerns,” Zhang said.
Huang Kaiqi, a researcher with the National Laboratory of Pattern Recognition at the Chinese Academy of Sciences in Beijing, said a computer could track movement or gestures by following each pixel in a human body image – so it could detect subtle behaviour such as a child being punished by being pricked with a needle.
“Abuse usually involves a series of actions. These actions can form a pattern, and that can be learned and identified by a machine,” he said.
Huang said other detection methods such as sound and heat sensing could also be used to gather more information so that the behaviour tracking system could make a more accurate judgment.
For instance, if children cried repeatedly following a certain action by the teacher, the system could flag an alert.
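The cue Huang describes — crying that repeatedly follows a particular action by a teacher — amounts to a simple temporal rule over two event streams: one from the video tracker, one from audio sensing. A hypothetical sketch follows; the event labels, the five-second window and the repetition threshold are illustrative assumptions, not parameters from any deployed system.

```python
# Hypothetical sketch: correlate teacher actions (from video tracking)
# with crying detections (from audio) and flag a repeated pattern.

def flag_repeated_pattern(actions, cries, window=5.0, threshold=3):
    """actions: list of (timestamp, action_label) from the video tracker.
    cries: list of timestamps at which crying was detected by audio.
    Returns the set of action labels that were followed by crying within
    `window` seconds at least `threshold` times."""
    counts = {}
    for t_action, label in actions:
        if any(t_action < t_cry <= t_action + window for t_cry in cries):
            counts[label] = counts.get(label, 0) + 1
    return {label for label, n in counts.items() if n >= threshold}

# Toy timeline: "approach_A" is followed by crying three times; "handout" never is.
actions = [(10, "approach_A"), (40, "handout"), (70, "approach_A"),
           (100, "approach_A"), (130, "handout")]
cries = [12.0, 73.5, 104.0]

print(flag_repeated_pattern(actions, cries))  # -> {'approach_A'}
```

Fusing streams this way is what lets the system raise its confidence before alerting: a single cry proves nothing, but the same action reliably preceding crying is exactly the kind of pattern Huang says a machine can learn and identify.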
A senior police official with the Ministry of Public Security in Beijing, which oversees surveillance and alarm systems across the country, said the ministry would consider adding artificial intelligence to its security systems for schools, including kindergartens.
Beijing issued a regulation in 2012 requiring all schools to install security cameras, but many of the systems were not monitored by dedicated staff and the footage was generally saved just for the record, according to the official, who declined to be named.
“Artificial intelligence has a pair of eyes that never blinks and a head that never gets tired,” he said.
But he added that technology alone could not solve the problem – abuse could still take place out of the range of security cameras.
“No amount of technology can fully protect our children,” he said.