Introduction

Both the terms ‘Internet of Things’ (IoT) and ‘Privacy by Design’ (PbD) were coined back in the 1990s. The original idea behind PbD is to weave privacy into the very fabric of IT systems, networked infrastructure, business processes and design specifications. For that to happen successfully in the context of IoT, manufacturers of Internet-connected devices need to build privacy into their products from the ground up, at the outset of the development process. In essence, PbD is based on adherence to its 7 Foundational Principles.

Dr. Ann Cavoukian – the originator of the Privacy by Design concept – explained in a 2016 report that “by embedding or coding privacy preferences into the technology itself, in order to prevent the privacy harms from arising,” PbD achieves its goal of protecting personal data and privacy at every stage of a product’s development.

Nowadays, IoT is on the verge of becoming ubiquitous. San Jose, California, plans to create a smart city that will use transit vehicles and an infrastructure full of smart sensor appliances and technology with the ultimate goals of improving safety and mobility and optimizing the transit system. The creators of this project claim it will deliver the “smart city” experience in the safest and most user-friendly way. Do they plan, however, to achieve that through the PbD approach?

Cyberattacks against smart infrastructure no longer belong to the sphere of science fiction; on the contrary, there have already been cases of compromised cameras, printers, weighing scales, doorbells, home routers and even connected fish tanks. Two examples of IoT products with well-documented security issues – a lack of encryption and weak authentication mechanisms – are D-Link cameras and TP-Link Smart Plugs.

Due to the boom in smart technology, attack vectors continue to multiply at a rapid rate. This is a clear illustration of the old maxim: when everything is connected, the network is only as strong as its weakest link. For IoT devices to be secure, one must protect hardware, software and connectivity. Unfortunately, most smart objects are not designed with strong (or any) security features built in. The limitations of IoT devices – e.g., insufficient processing power, memory and battery capacity – constrain their information-processing capabilities.

Without serious consideration of the privacy and security of connected objects, there will be more botnets and more security breaches. Fortunately, security firms seem to understand the gravity of the problem: Gartner envisages worldwide spending on IoT security hardware, software and services reaching $3.1 billion in 2021 (for comparison, the 2018 figure is $1.5 billion).

Legal Structure of the Notion of PbD

With Article 25, titled “Data protection by design and by default,” the European Union’s General Data Protection Regulation (the GDPR) adopted this notion officially, transforming it from a recommended best practice into a mandatory rule. The EU was not alone in its eagerness to mandate PbD: California Senate Bill 327, introduced in the California Senate in April 2017, would require Web-connected devices to have built-in security features appropriate to the nature of the device and the information it collects, contains or transmits.

Unfortunately, as mentioned before, privacy and security are poorly implemented in IoT product and service development, and these matters are often handled as an afterthought. In 2017, the Cyber Shield Act was introduced in the U.S. to remedy this problem.

Let’s return to the law that actually applies this approach at the moment. Art. 25 of the GDPR mentions which methods data controllers/processors may use to apply the PbD approach: “pseudonymization,” “data minimization” and other “necessary safeguards [that] protect the rights of data subjects.” Nevertheless, as the phrase “Such measures could consist, inter alia, of…” (Recital 78) suggests, this list is by no means exhaustive.

Best Practices Related to Embedding PbD into IoT

Pseudonymization

It is important to note that anonymous data does not fall within the scope of the GDPR; hence, if you are able to completely remove all identifiers from personal data, the data will no longer be deemed personal within the meaning of EU data protection law.

Although anonymity is often preferable, it is not always practical, because it runs counter to the principle of accountability. Pseudonymization, however, can strike a balance between anonymity and accountability. Pseudonymization replaces a person’s identity with a random identifier that secretly links back to that person. A criminal cannot directly identify a data subject without linking the pseudonymized data to the mapping data, which is stored and protected separately. In essence, this technique gives organizations the freedom to continue to process personal data under certain circumstances, while protecting individuals’ right to privacy rather well.
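As a rough illustration of this separation, the sketch below (field names and values are invented for illustration) replaces a direct identifier with a random token and keeps the token-to-identity mapping in a separate structure, which would be stored and protected independently of the working data set:

```python
import secrets

def pseudonymize(records, id_field="email"):
    """Replace the direct identifier in each record with a random token.

    The returned lookup table (token -> real identifier) must be stored
    and protected separately from the pseudonymized records.
    """
    lookup = {}
    safe_records = []
    for rec in records:
        token = secrets.token_hex(8)  # random; not derived from the identity
        lookup[token] = rec[id_field]
        safe = dict(rec)
        safe[id_field] = token
        safe_records.append(safe)
    return safe_records, lookup

records = [{"email": "alice@example.com", "blood_type": "A+"}]
safe, lookup = pseudonymize(records)
# `safe` now carries only the token; re-identification requires `lookup`.
```

Because the token is randomly generated rather than derived from the identity, an attacker who obtains only the pseudonymized records has no mathematical route back to the data subject.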

The 7 De-Identification Techniques of WP29

  1. Noise Addition: identifiers are expressed imprecisely (e.g., weight is expressed inaccurately, +/- 10 kg).
  2. Substitution/Permutation: identifiers are shuffled within a table or replaced with random values (e.g., a specific blood type is replaced with “Magenta”).
  3. Differential Privacy: identifiers of one data set are compared against an anonymized data set held by a third party, with instructions on the noise function and the acceptable amount of data leakage.
  4. Aggregation/K-Anonymity: identifiers are generalized into a range or group (e.g., age 43 is generalized to the age group 40-55).
  5. L-Diversity: identifiers are first generalized, then each attribute within an equivalence class is made to occur at least “L” times (e.g., properties are assigned to personal identifiers, and each property is made to occur within a dataset, or partition, a minimum number of times).
  6. Pseudonymization – Hash Functions: identifiers of any size are replaced with artificial codes of a fixed size (e.g., blood type 0+ is replaced with “01”, blood type A- with “02”, blood type A+ with “03”, etc.).
  7. Pseudonymization – Tokenization: identifiers are replaced with a non-sensitive identifier that traces back to the original data but is not mathematically derived from the original data (e.g., a credit card number is exchanged in a token vault for a randomly generated token number).

Source: “Pseudonymization and De-identification Techniques” by Anna Pouliou
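Two of the techniques above, noise addition and aggregation, are simple enough to sketch in a few lines. The spread and bucket width below are arbitrary choices of mine, not values prescribed by WP29:

```python
import random

def add_noise(weight_kg, spread=10):
    """Noise addition: report a weight only imprecisely, to within +/- spread kg."""
    return weight_kg + random.uniform(-spread, spread)

def generalize_age(age, width=10):
    """Aggregation: generalize an exact age into an age band, e.g. 43 -> '40-49'."""
    low = age - (age % width)
    return f"{low}-{low + width - 1}"

noisy = add_noise(62)       # somewhere in [52, 72]
band = generalize_age(43)   # '40-49'
```

Note the trade-off both functions embody: the wider the noise spread or age band, the stronger the de-identification, but the less useful the data becomes for analysis.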

Perhaps the biggest obstacle to effective pseudonymization is the difference in standards and proprietary technology of smart apparatuses.

Data Minimization

Data minimization is an essential element of the PbD concept: it requires services/applications in the realm of IoT technology to process only the minimum amount of information necessary for the fulfilment of a particular service/function/transaction. The principle of data minimization can reduce both the amount of information IoT devices collect/process and the data retention period. Presumably, this also reduces the chances of data handling issues (such as any kind of data misuse) or information theft.

Privacy-enhancing technologies (PETs) that apply this principle can be designed not to collect or store any personal information/personal identifiers (e.g., search history, search terms, IP addresses). Ixquick (now StartPage), Unbubble, Disconnect and DuckDuckGo are excellent examples of such PETs. Tools that erase digital footprints – web browser cache, cookies, browsing history, address bar history, typed URLs, autocomplete form history, saved passwords, search history, recent documents, temporary files, the recycle bin and more – may be used to achieve the same effect in the context of Internet-enabled devices.

In summary, to apply the data minimization principle, IoT companies must:

  1. Collect only the fields of data necessary to the product or service being offered
  2. Collect as little sensitive data as possible
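Both points can be reduced to a simple allowlist filter applied before data is stored or transmitted. The field names below are invented for a hypothetical service:

```python
# Fields this (hypothetical) service genuinely needs; everything else is dropped.
REQUIRED_FIELDS = {"device_id", "firmware_version"}

def minimize(payload, required=REQUIRED_FIELDS):
    """Keep only the allowlisted fields of a device payload."""
    return {key: value for key, value in payload.items() if key in required}

raw = {
    "device_id": "plug-42",
    "firmware_version": "1.3.0",
    "owner_email": "alice@example.com",  # not needed for the service
    "location": "51.5,-0.1",             # not needed for the service
}
minimize(raw)  # -> {'device_id': 'plug-42', 'firmware_version': '1.3.0'}
```

An allowlist (rather than a blocklist) is the safer design choice here: any new field a firmware update starts emitting is excluded by default instead of leaking until someone notices it.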

Other Necessary Safeguards and Best Practices

The 2017 Verizon Data Breach Investigations Report attested that 81% of hacking-related breaches happened due to weak passwords and 43% involved phishing – both attacks that exploit the human factor. In addition, IT administrators often fail to maintain best practices with respect to the IT infrastructure they are in charge of.

Perhaps the only feasible solution to correct such negligent behavior is to embrace privacy right from the outset. Manufacturers of connected products need to consider privacy at all times if their products process personal data. Usually, it is difficult and expensive to add privacy to a product at a later stage or, even worse, reengineer it following a failure. Widespread vulnerabilities like Heartbleed and Shellshock continue to plague IoT products. For that reason, it is essential to plan for future upgrades to device software. Unfortunately, many smart products are unpatchable.


PbD embedded into connected objects also includes the presence of cyber-hygiene best practices, such as:

  • Secure transmission protocols and encryption techniques for data in transit and at rest. Protocols such as HTTPS and SSH were created to support encryption and strong authentication; unfortunately, the majority of IoT objects today cannot use these features due to various inherent technical constraints.
  • Proper authentication controls and limited permissions (assigned on a need-to-know basis). Having usernames and passwords for every device is simply not feasible in an industrial environment. Alternative mechanisms, such as blockchain, could solve the problem of trust and identity between smart objects. Whitelisting of IoT clients may also prove useful in these situations.

    For critical communications, especially those that convey sensitive data, authentication and encryption measures are imperative for optimal protection; in addition, providing a checksum or signature that allows the integrity of the data to be verified is a recommended best practice with additional value for privacy and security.

  • Options to change insecure default privacy/security settings, including the ability to disable hazardous services with a proven track record of creating vulnerable environments. For instance, many devices come from the factory with non-essential services enabled – Telnet or FTP, among others – that pose a high risk to users.
  • Training company staff in privacy and data security best practices.
  • Application containerization, where apps are installed in a contained environment (akin to virtual machines), could be beneficial privacy- and security-wise as well.
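The checksum/signature practice mentioned above can be sketched with an HMAC tag over each message. This is only a minimal illustration: in practice the key would be provisioned per device and kept in secure storage, not hard-coded as it is here:

```python
import hashlib
import hmac

KEY = b"per-device-secret"  # placeholder; provision securely in a real device

def sign(payload: bytes, key: bytes = KEY) -> bytes:
    """Compute an integrity tag over the payload."""
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes, key: bytes = KEY) -> bool:
    """Constant-time check that the payload was not tampered with in transit."""
    return hmac.compare_digest(sign(payload, key), tag)

message = b'{"temp_c": 21.5}'
tag = sign(message)
verify(message, tag)              # True: payload is intact
verify(b'{"temp_c": 99.9}', tag)  # False: tampered payload is rejected
```

Unlike a plain checksum such as CRC32, an HMAC also authenticates the sender: an attacker who alters the payload cannot forge a matching tag without the key.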

Some practical approaches that may turn the PbD idea into a real and workable privacy shield are backend isolation, data separation, segregation, redaction and data transformation techniques that remove personally identifiable information. It is also advisable for IoT products to have a button that switches off the “connectivity” function so that consumers can use them as regular products (e.g., turning a connected plug into a regular one).
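As an illustration of redaction, a pattern-based pass can scrub obvious identifiers from free text (such as device logs) before it leaves the device. The patterns below are deliberately simplistic and would miss many real-world PII formats:

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b\d{16}\b")  # naive: unformatted 16-digit numbers only

def redact(text):
    """Replace obvious PII patterns with neutral placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    text = CARD.sub("[CARD]", text)
    return text

redact("contact alice@example.com, card 4111111111111111")
# -> 'contact [EMAIL], card [CARD]'
```

Pattern-based redaction is best treated as a defense-in-depth layer on top of data minimization, not a substitute for it: data that is never collected needs no redacting.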

IoT products should undergo rigorous standard security testing, such as code analysis and ethical hacking, but also testing that specifically targets the effectiveness of the privacy-enhancing mechanisms. Data controllers need to vet data processors, vendors and other parties to know whether their cyber-hygiene practices live up to expectations. Probably the most famous case of a vendor-borne cyberattack was the one against Target, where the malicious actors gained access through the HVAC company supplying Target.

When developers consider a product at the earliest stages of its development, they should perform a thorough risk assessment and a full analysis of potential attack vectors. The key to implementing PbD is the Data Protection Impact Assessment (DPIA). A DPIA is a process that evaluates the risks associated with processing particular personal data when the processing “is likely to result in a high risk to the rights and freedoms of natural persons” (Art. 35 of the GDPR). Although a DPIA is required only for processing categorized as high-risk, it is an integral part of the PbD approach. It should be carried out “prior to the processing in order … taking into account the nature, scope, context and purposes of the processing and the sources of the risk.” (Recital 90)

There is a rough U.S. equivalent of the DPIA: the cybersecurity disclosures required by the Securities and Exchange Commission (SEC). In short, companies are obligated to discuss their information security risks and incidents.

Furthermore, the FTC advocates a risk-based approach, which should take root in the early stages by drafting a full inventory of the type and variety of personal information collected, and subsequently understanding the data flows throughout the entire life cycle of all data sets. On this point, the FTC noted:

“An evolving inventory serves triple duty: It offers a baseline as your staff and product line change over time. It can come in handy for regulatory compliance. And it can help you allocate your data security resources where they are needed most.”

To fully realize the potential of the PbD idea, enterprises should collect, map the flow, and analyze the data they handle.

PbD is, in fact, not only a responsibility of developers or similar people closely engaged with the manufacturing process. Everyone within the organization, from the software engineers to the marketing teams that make use of the applications, should be committed to the PbD.

You know the motto: “Security is everyone’s responsibility.” Privacy and security go hand in hand, so “security by design” could complement the value of the PbD. As Isabelle Noblanc wrote at PYMNTS.com: “Security isn’t the hot sauce you add on the side. It’s a key ingredient to any system, and it’s something IoT manufacturers need to think about from the very beginning.”


Transparency + Fairness + User Control = Trust


The notion of PbD alone may not be enough to promote privacy without a working regimen for how service providers are to obtain consumers’ meaningful consent. Terms of service should be designed to prevent service providers from using their customers’ personal data unless opt-in consent has been obtained in advance. As far as privacy is concerned, it is important for IoT companies to vest their users with the power to control their personal data, and that is usually done on the basis of the principle of “transparency.”



An ESET research team investigated privacy concerns associated with some popular IoT products. According to their findings, voice-activated intelligent assistants raised the most concerns: there is a greater probability of interception of digital traffic by cybercriminals, oversharing of data among service providers may not be uncommon, and the overall state of data protection is not up to par.

Smart technology brings convenience, but the price users pay is their personal data, which is mined, analyzed and sold. Unfortunately, many companies build their businesses around data mining and analysis and are therefore rather reluctant to adopt PbD practices. At least two benefits, however, are beginning to arise from the implementation of PbD: the user is assured strong privacy and control over their own information, and organizations gain a competitive advantage.

Trust is quickly becoming an important asset in the digital ecosystem. It has become a form of currency, as wary customers are now on the lookout for companies that have demonstrated a commitment to maintaining security and privacy.

Perhaps PbD can be the cornerstone on which IoT companies build their trust relationships with their clients.

Sources

Casino Gets Hacked Through Its Internet-Connected Fish Tank Thermometer, The Hacker News

D-Link vulnerability impacts 400,000 devices, CSO Online

Information disclosure vulnerability in TP-Link Easy Smart switches, Chris’s Security and Tech Blog

Mirai IoT Botnet Co-Authors Plead Guilty, Krebs on Security

Gartner Says Worldwide IoT Security Spending Will Reach $1.5 Billion in 2018, Gartner

California Bill Mandates Privacy by Design for IoT Devices, Lexology

Sticker shock? The Cyber Shield Act of 2017 attempts to make IoT manufacturers prioritize IoT security, Reed Smith

Heartbleed and Shellshock thriving in Docker community, ComputerWeekly

Target attack shows danger of remotely accessible HVAC systems, ComputerWorld

SEC Releases Updated Guidance for Cybersecurity Disclosure, Security Intelligence

Why Privacy Must Be Baked Into IoT Devices, PYMNTS

Privacy by Design: Can you create a safe smart home?, WeLiveSecurity

The internet of things and GDPR, GDPR:Report

Planning for privacy by design, Deloitte

Data privacy by design, Lexology

How GDPR Impacts US Cyber Security Policy, IT Security Central

Privacy by Default: A Privacy and Cyber-security imperative in the IoT and Big-Data Age, IP Osgoode

Privacy by Design Is Important For Every Area Of Your Business, Forbes

The Importance of Security by Design for IoT Devices, Red Alert Labs

Securing Data Through GDPR’s Privacy by Design, Trend Micro

Privacy Risk Summit Preview: Privacy by Design for IoT, TrustArc Blog

Privacy and Security by Design is a crucial step for privacy protection, Least Authority

Kingsmill, S. & Cavoukian, A. Privacy by Design: Setting a new standard for privacy certification

Maple, C., Security and privacy in the internet of things, Taylor and Francis Online