
Smart Toys and Their Cybersecurity Risks: Are Our Toys Becoming a Sci-Fi Nightmare? [updated 2021]

May 4, 2021 by Susan Morrow

In the “Living Doll” episode of “The Twilight Zone,” a lifelike doll named Talky Tina turned rogue and terrorized a family. This nightmare scenario of out-of-control toys is sometimes portrayed in sci-fi novels and films — but are we entering the Twilight Zone with modern-day smart toys? 

The Internet of Things (IoT) has entered our lives and work almost seamlessly. Although often associated with medicine and industry, consumer IoT (C-IoT) devices have captured the public’s imagination. From smartphones to heating controls, the IoT quietly permeates our lives. Sensors connect everything across an IoT ecosystem, which uses the data we generate to make the wheels of connectivity turn. Predictions of the number of connected IoT devices by the end of 2021 range from 28 billion to over 50 billion, each of them collecting and analyzing sensitive data.

The march of progress can be a wonderful thing. It’s exciting, and in the case of consumer IoT products, almost nothing is without its IoT version. One of the issues is that these devices can bleed from the home into the office, especially with remote working. In one survey, 41% of C-IoT devices used for corporate purposes outside the office were not secured using SSL.

One area of the C-IoT that is finding a “home in our homes” is the smart toy, adding more fun to a kid’s day. And this integration into our everyday lives looks set to continue, with the market for IoT-connected smart toys expected to be worth over $5.6 billion by 2027.

But is the promise of such clever toys too good to be true? Will the sweetness of that smart toy carry a sting caused by cybersecurity vulnerabilities?

When good toys go bad

Smart toys are a potential minefield in terms of security and privacy. Since the advent of the smart toy back in 2015, when Hello Barbie became one of the first AI-enabled toys, privacy fears have risen across the industry. Connected toys collect and share data and events using Bluetooth, Wi-Fi, the cloud and mobile apps. They often contain microphones and cameras, collecting visual and audio data. Later versions of smart toys may even contain facial recognition technology augmented using artificial intelligence. The data that feed the toys are collected, stored and shared across those connections, and security worries abound, with reports finding that up to 98% of IoT traffic is unencrypted.
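
To put that unencrypted-traffic figure in context, here is a minimal sketch (Python, using the widely available requests library; the endpoint and payload are hypothetical) contrasting the plain-HTTP upload a poorly designed toy app might make with an HTTPS call that verifies the server certificate and so keeps the data unreadable in transit:

    import requests

    # Hypothetical telemetry sent by a smart toy's companion app.
    telemetry = {"toy_id": "demo-1234", "event": "voice_clip_uploaded"}

    # Insecure pattern: plain HTTP is readable by anyone on the network path.
    # requests.post("http://toys.example.com/api/telemetry", json=telemetry)

    # Safer pattern: HTTPS with certificate verification (verify=True is the
    # default in requests); a failed check raises an error rather than
    # silently handing the data to an impostor server.
    response = requests.post(
        "https://toys.example.com/api/telemetry",
        json=telemetry,
        timeout=10,
        verify=True,
    )
    response.raise_for_status()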

In January 2018, the first smart toy privacy case was brought to court under the Federal Trade Commission (FTC) Children’s Online Privacy Protection Rule (COPPA). COPPA deals with the protection of the personal information of children under 13. VTech Electronics Ltd., which provides several smart toys, was held accountable on several counts. These included not providing a clear privacy policy on its website and being negligent in securing the personal data collected from children.

In 2019, the FTC put out a warning to parents on the dangers of smart toys. This warning pointed out several things to look for when buying a smart toy:

  • Does the toy come with a camera or microphone? What will it be recording, and will you know when the camera or microphone is on?
  • Does the toy let your child send emails or connect to social media accounts?
  • Can parents control the toy and be involved in its setup and management? What controls and options does it have? What are the default settings?

The FTC notice goes on to talk about the type of data collected and the storage and sharing practices of the toy. The key areas to focus on cover a wide range of potential security and privacy vulnerabilities. 

Here are some examples of good toys turning bad:

It’s all in the design: Nurserycam

Whilst not a toy, Nurserycam is a connected device designed with children in mind. This webcam is typically used in nursery schools to keep an eye on the kids. The device, however, proved to be less than secure, as it allowed anyone using the associated mobile app to access admin credentials. This poor attention to design detail creates gaping security holes. Default passwords and a lack of access control roles in IoT devices are not limited to devices meant for children: Bruce Schneier recently posted about a list of Telnet credentials for more than 515,000 servers, home routers and IoT devices published on a hacking forum.
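
To illustrate why factory-default credentials are such a soft target, here is a minimal sketch (Python; the credential list and device records are made up for illustration) of the kind of audit a vendor or nursery could run to flag devices still using their shipped defaults:

    # Minimal sketch: flag devices still configured with factory-default
    # credentials. The default list and device records are illustrative only.
    KNOWN_DEFAULTS = {("admin", "admin"), ("admin", "password"), ("root", "root")}

    devices = [
        {"id": "cam-001", "username": "admin", "password": "admin"},
        {"id": "cam-002", "username": "nursery", "password": "Xk3!vQ9z"},
    ]

    def uses_default_credentials(device: dict) -> bool:
        """Return True if the device's login matches a known default pair."""
        return (device["username"], device["password"]) in KNOWN_DEFAULTS

    for device in devices:
        if uses_default_credentials(device):
            print(f"{device['id']}: still using factory-default credentials")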

Seven toys go bad: An analysis by NCC Group

The UK’s NCC Group analyzed seven smart toys, looking for security and privacy flaws. Toys tested included the Boxer Robot (an AI-enabled robot), Bloxels, the Sphero Mini and the Singing Machine. All were found to have security issues. The focus of the tests included the confidentiality and integrity of any personal data captured and processed by the toys. Even though full security testing was outside the scope of the work, the researchers observed the following:

  • No encryption was used to protect data during account creation and login. This left the data open to man-in-the-middle (MitM) attacks
  • There was poor execution of account recovery that left the account open to brute-force attacks
  • Weak password policies were evident, including one allowing the use of “password” as a password (a minimal policy check is sketched after this list)
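
On that last point, here is a minimal sketch of the kind of password policy check the researchers found missing (Python; the length rule and blocklist are illustrative assumptions, not NCC Group’s test criteria):

    # Minimal password policy check; the 12-character rule and the blocklist
    # are illustrative, not taken from the NCC Group tests.
    COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

    def password_is_acceptable(candidate: str) -> bool:
        """Reject trivially guessable passwords before creating an account."""
        if len(candidate) < 12:
            return False
        if candidate.lower() in COMMON_PASSWORDS:
            return False
        # Require a mix of character classes to slow brute-force guessing.
        has_letter = any(c.isalpha() for c in candidate)
        has_digit = any(c.isdigit() for c in candidate)
        return has_letter and has_digit

    print(password_is_acceptable("password"))          # False
    print(password_is_acceptable("Tr1cky-toybox-42"))  # True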

Artificial intelligence and truly smart toys

AI-enabled toys are beginning to become popular. There are, of course, ethical issues around the use of AI in our children’s toys, raising questions about cognitive development and civil liberties. A recent example is a toy in China that teaches children how to pay bills using facial recognition. The overlap of children’s cognitive development and advanced technologies is likely to open up new challenges spanning privacy, security and children’s rights. A UNICEF publication on artificial intelligence and children’s rights raises concerns about privacy laws keeping up with the technology, stating that “existing privacy laws and common law tort duties fall short of providing directly relevant protection.”

The dovetailing of AI with security flaws compounds safeguarding issues for children. Manufacturers aim to use natural language processing and machine learning to make toys even more realistic. This ultra-realism could amplify the impact of any security flaws. Imagine a malicious entity hacking a poorly protected toy and talking to a child, with the child unable to discern between the realistic toy conversation and that of the hacker.

The design problem in smart toys

Security and privacy should always be by design. The IoT is not an isolated system; its connectivity means a smart toy is part of a massive, interwoven matrix. System design should take a 360-degree view of security and privacy, from the user-facing front end through the data lifecycle to the back end of the system. A defense-in-depth approach to security fits the IoT model well. And just because something is a toy does not mean that security should be an afterthought.

What is being done to protect our kids?

Fortunately, there are initiatives afoot that attempt to force manufacturers and toy designers to put security first. In the UK, the “Code of Practice for Consumer IoT Security” was published in 2018. It sets out 13 steps required to address security issues in internet-connected consumer products, including the removal of default passwords.

Organizations are beginning to offer certification around good privacy and security practices. The Me2B Alliance works in this area, creating standards for respectful technology and certifications that demonstrate an organization’s commitment to building privacy-respectful products.

A working group run by the U.S. Commerce Department’s National Telecommunications and Information Administration (NTIA), with input from the FTC, is developing guidance around securing IoT devices. In the EU, ENISA has produced “Baseline Security Recommendations for IoT,” guidance developed for IoT devices in critical infrastructure that also references smart toy security vulnerabilities.

Security is important to everyone and is a civil right, no matter what age you are. The design and development of internet-connected smart toys should be a priority to ensure the cyber safety of our children. Rushing out toys to take advantage of holidays like Christmas should not mean that security is an afterthought. We have a civic duty to ensure the safety and uphold the privacy of our children. 

 

Sources:

Snapshot: The rapid growth of the internet of things, Thales

Consumer IoT market — growth, trends, COVID-19 impact and forecasts (2021-2026), Mordor Intelligence

By 2023, the industrial IoT market will grow to $232.15 billion at a CAGR of 8.06%, Zion Market Research

Barbie wants to get to know your child, New York Times

What to look for when buying a smart toy, FTC 

Half a million IoT device passwords published, Bruce Schneier blog 

98% of IoT traffic unencrypted, IoTnow 

Kids, AI devices and intelligent toys, MIT

Code of practice for consumer IoT security, Department for Digital, Culture, Media and Sport

Me2B Alliance

Comment to National Telecommunications & Information Administration, ftc.gov

Artificial Intelligence and Children’s Rights, UNICEF 

Susan Morrow

Susan Morrow is a cybersecurity and digital identity expert with over 20 years of experience. Before moving into the tech sector, she was an analytical chemist working in environmental and pharmaceutical analysis. Currently, Susan is Head of R&D at UK-based Avoco Secure.

Susan’s expertise includes usability, accessibility and data privacy within a consumer digital transaction context. She was named one of Computer Weekly’s 2020 Most Influential Women in UK Tech and shortlisted by WeAreTechWomen as a Top 100 Woman in Tech. Susan is on the advisory board of Surfshark and Think Digital Partners, and regularly writes on identity and security for CSO Online and Infosec Resources. Her mantra is to ensure human beings control technology, not the other way around.