
GDPR for Software Designers and Developers

07 October 2017 By Frans Lytzen (NewOrbit), Simon Halberstam, Raoul Lumb & Anne Rose (Simons Muirhead & Burton)

Before you read this post, it is advisable to read the GDPR Summary.

This post covers subjects from the GDPR which should be of particular interest to organisations who design and build software, so it is focused primarily on software functionality. In practice, many of the decisions about GDPR will be made in the broader business context, but this post should help you to ask the product owner at least some of the right questions.

We have divided this into three areas:

  • Gathering Data
  • Rights of Individuals
  • Storing and Processing of Data

Gathering data

Consent to data processing is a key part of the GDPR and one that will require changes to a lot of software and websites. It is important to understand that processing, rather than just data, is at the heart of these regulations. You can't really ask for consent to store some data; you need to explain to the user what you are going to do with their data and get their consent to that processing.

Consent must be freely given and informed:

  • You need to make it very clear to the user what you are going to do with their data.
  • The user has to take a clear action to give consent; it mustn't be given by default, so you mustn't pre-tick the consent tick box.
  • You don't need to ask for separate consent to "perform your service under the contract". For example, if you have a shopping site and someone places an order, you probably don't need to ask for consent to store their address (i.e. so you can send them the bicycle they ordered from you).
  • You do need to ask for consent to do anything else. So, if you want to send the bicycle customer a special offer next month, you must generally ask for consent to do that.
  • You must not make the provision of your service dependent on consent to capture and store personal data not necessary for the execution of the transaction. For example, imagine you developed a free service to send people reminder emails for tasks. You do not need to ask for consent to store their email address so you can send them the reminder; that is core to the service (as long as the user has to take a clear action to ask for the reminder to be sent in the first place). You may also ask them for consent to send them other emails. However, if the user does not give you that consent, you are not allowed to then say you won't give them the free service. For some business models, this may be a significant challenge.

It must be as easy to withdraw consent as it was to give it in the first place. So, no more requiring the user to call a phone number between 9am and 9:15am on Tuesdays. If they gave consent by ticking a box on your site, they need to be able to withdraw consent by un-ticking a box or similarly easy process.

For some types of software, this is pretty easy: just add the consent boxes to the My Account screen. But, for other types of software this may cause more headaches:

  • Some systems have transient users. For example, you may have a shopping site where users don't need to create an account or you run a survey website where users fill in a survey after clicking a link in an email. In those cases it may be tricky to build a process flow where it is easy for the user to withdraw consent.
  • Many systems will have been built on the assumption that consent was given once and then all subsequent downstream processing can carry on in perpetuity. You now need to check that consent is still present each time you do the processing (see the sketch after this list).
  • If you pass the data on to anyone else, you need to tell them that the user has withdrawn consent. For example, you may have a consent option to let the user agree to be contacted to buy insurance for the bicycle you just sold them. You pass that on to an insurance company – but then the user changes their mind and withdraws the consent and you now have to tell the insurance company that it is no longer allowed to call the user to sell insurance.
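
To make this concrete, here is a minimal sketch of purpose-specific consent tracking in Python. It is illustrative only: the ConsentStore, the purpose names and the in-memory storage are assumptions for the example, not a prescribed design. The point is that consent is recorded per purpose only on an explicit action, checked at the time of each processing step, and that withdrawing it is a single call which also notifies anyone you passed the data on to.

```python
# Illustrative consent-tracking sketch; all names and the in-memory store are
# assumptions for the example, not a recommended production design.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, Dict, List, Optional

@dataclass
class ConsentRecord:
    purpose: str                              # e.g. "insurance_offers" (illustrative)
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

@dataclass
class ConsentStore:
    # user_id -> consent records for that user
    records: Dict[str, List[ConsentRecord]] = field(default_factory=dict)
    # callbacks used to tell downstream recipients (e.g. the insurer) about withdrawals
    downstream_listeners: List[Callable[[str, str], None]] = field(default_factory=list)

    def give(self, user_id: str, purpose: str) -> None:
        # Only call this on an explicit user action -- never pre-tick consent.
        self.records.setdefault(user_id, []).append(
            ConsentRecord(purpose, given_at=datetime.now(timezone.utc)))

    def has_consent(self, user_id: str, purpose: str) -> bool:
        # Check at the time of *each* processing step, not just at sign-up.
        return any(r.purpose == purpose and r.withdrawn_at is None
                   for r in self.records.get(user_id, []))

    def withdraw(self, user_id: str, purpose: str) -> None:
        # Withdrawing must be as easy as giving consent.
        for r in self.records.get(user_id, []):
            if r.purpose == purpose and r.withdrawn_at is None:
                r.withdrawn_at = datetime.now(timezone.utc)
        # Tell anyone the data was passed on to that consent has been withdrawn.
        for notify in self.downstream_listeners:
            notify(user_id, purpose)

# Usage
store = ConsentStore()
store.downstream_listeners.append(
    lambda user, purpose: print(f"notify insurer: {user} withdrew '{purpose}'"))
store.give("user-42", "insurance_offers")
if store.has_consent("user-42", "insurance_offers"):
    print("ok to pass details on to the insurer")
store.withdraw("user-42", "insurance_offers")
```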

See Articles 4, 6 and 7 and Recitals 32, 42 and 43 for more information.

One more thing… when you ask for consent, you need to inform the user not only about what you are going to do with their data, but also about their right to withdraw consent, to have their details erased, to request a copy of their data and so forth. See Article 13. Hopefully a standard phrasing will emerge that we can all use.

Collect and store the minimum amount of data

This isn't really new, but it is more important under the GDPR than ever.

You should only collect the minimum amount of data you need; you are not allowed to gather data speculatively, "in case you need it".

See also the notes about data retention further down.

See Article 5.1.c.

Children

In short, if you store or process data about anyone under the age of 16 (may be reduced to 13 in the UK), stop reading and call a lawyer now, unless you have already sought advice and implemented the recommendations.

Of particular impact for software design, consent must be given or authorised by the holder of parental responsibility over the child, as per Article 8. It is unclear how you can or should do that in software, though guidance may eventually emerge.

Rights of Individuals

The GDPR is centred on protecting the "fundamental rights and freedoms of individuals". In addition to all the obligations placed on how you process their data, they also have certain rights.

As mentioned above, a Data Subject has the right to withdraw their consent at any time and it must be as easy to withdraw consent as it was to give it.

Subject Access Request (SAR)

In summary:

  • SARs are now free, so expect many more of them.
  • You are required to provide the response in a common, electronic format.
  • The GDPR strongly encourages an online portal through which users can request their data, though it is not required.

People have always been able to approach you to ask for a copy of all the data you hold about them. Under the old rules, you were allowed to charge £10 per request, which is likely to have significantly reduced the number of requests you received. Under the GDPR, you are not allowed to charge (at least not for the first request), so you are likely to see many more requests, which you need to handle. This small change in the law could be one of the most disruptive for some types of software, as the volume of SARs has the potential to soar.

There is very little to stop this being used vexatiously, for example by a community group getting all its members to submit a SAR to a particular organisation at the same time. There are some protections in the regulation, but it is unclear how far they go, and the burden of proof is on you. It is doubtful you will get much sympathy from the regulator for not being able to handle a high volume of SARs.

In practice, you will want to automate SARs: build an online portal where users can request their data, and automate the process of providing it all back to them in an electronic format they will be able to read. As an example, see how Facebook currently do it: https://www.facebook.com/help/contact/180237885820953
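
As a rough illustration, a SAR export can be as simple as collecting every row that relates to the person and serialising it to a common, machine-readable format such as JSON. The sketch below assumes a SQLite database and made-up table names (users, orders, consents, support_tickets); the real work is making sure the list covers every place personal data actually lives.

```python
# Minimal SAR export sketch: gather everything held about one person and
# return it as JSON. Table names are hypothetical, and the table list is fixed
# in code, so no user input reaches the SQL string.
import json
import sqlite3

PERSONAL_DATA_TABLES = ("users", "orders", "consents", "support_tickets")

def export_subject_data(conn: sqlite3.Connection, user_id: int) -> str:
    conn.row_factory = sqlite3.Row
    export = {}
    for table in PERSONAL_DATA_TABLES:   # missing a table here is a compliance gap
        rows = conn.execute(
            f"SELECT * FROM {table} WHERE user_id = ?", (user_id,)).fetchall()
        export[table] = [dict(row) for row in rows]
    return json.dumps(export, indent=2, default=str)
```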

See Recital 59 and 63 as well as Article 12 and 15.

Erasure

A person has the right to ask you to delete all their data. There are very limited circumstances in which you can refuse such a request, including:

  • it is necessary for exercising the right of freedom of expression and information;
  • it is necessary for compliance with a legal obligation under Union or Member State law;
  • it is necessary for a task carried out in the public interest or by an official authority;
  • it is necessary for reasons of public interest in the area of public health;
  • it is necessary for archiving or research purposes; or
  • it is necessary for legal claims.

But, as a software designer you need to think about how you will handle it when you are asked to completely delete someone's data from your system.

One thing you should consider is that completely anonymised data is not considered personal data, so if you can completely anonymise someone's data then you may be able to call that erasure. But, this is more complicated than it sounds so you should definitely seek advice if you consider that approach. Do refer to the section below that looks at the distinction between anonymised and pseudonymised data.
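
Below is a hypothetical sketch of what an erasure routine might look like, assuming made-up users, marketing_profiles and orders tables: direct identifiers are wiped, purely marketing-related data is deleted outright, and records you have a legal reason to keep (such as orders) are stripped of anything that could identify the person. Whether this amounts to erasure or anonymisation in your case is a question for your lawyers, not for the code.

```python
# Illustrative erasure sketch against a SQLite database with hypothetical tables.
import sqlite3
import uuid

def erase_subject(conn: sqlite3.Connection, user_id: int) -> None:
    anon_ref = f"erased-{uuid.uuid4()}"      # meaningless replacement reference
    with conn:                               # run as a single transaction
        # Remove direct identifiers entirely.
        conn.execute(
            "UPDATE users SET name = NULL, email = NULL, address = NULL, "
            "external_ref = ? WHERE id = ?",
            (anon_ref, user_id))
        # Data held only to contact or profile the person goes completely.
        conn.execute("DELETE FROM marketing_profiles WHERE user_id = ?", (user_id,))
        # Orders are kept (e.g. for accounting), but free-text fields that may
        # contain personal details are cleared.
        conn.execute("UPDATE orders SET delivery_note = NULL WHERE user_id = ?",
                     (user_id,))
```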

See Recital 65 and Article 17.

Right to challenge automated decisions

It is increasingly common for software systems to "profile" people and to automate decision making, such as credit checks. The GDPR gives individuals the right to challenge any such automated decision and request that it be reviewed by a human being. This appears to be aimed primarily at machine learning approaches where decisions cannot be explained.

Please note that these rules are very similar to those in the Data Protection Directive. While earlier drafts of the Regulation promised a much broader clamp down on profiling, the final position is much more moderate. For example, in many cases, profiling for marketing purposes will fall outside this restriction as it is unlikely to have legal effects or significantly affect the individual. Individuals may however have a right to object to profiling for direct marketing.

Correction and restriction

In most systems, you will probably just deal with withdrawal of consent and erasure. However, some types of systems will need to consider another protection, which allows someone to demand that you restrict processing of their data whilst you resolve a dispute with them about the data you hold. This could, for example, apply to Credit Referencing Agencies where an individual is unhappy with the data held about them, wants to stay on file, but doesn't want any credit checks done whilst they argue with you about the accuracy of that data.
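
One way to support this, sketched below under the assumption of a simple in-memory flag, is to record a "restricted" marker against the person and make every downstream job check it before acting on their data.

```python
# Minimal restriction-of-processing sketch; names are illustrative only.
restricted_subjects: set = set()

def restrict_processing(subject_id: str) -> None:
    # Called when the person disputes the accuracy of their data.
    restricted_subjects.add(subject_id)

def run_credit_check(subject_id: str) -> None:
    if subject_id in restricted_subjects:
        # The data stays on file, but no further processing while the dispute is open.
        raise PermissionError(f"processing restricted for {subject_id}")
    # ... otherwise carry on with the credit check here
```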

Storing and Processing Data

Encryption

The only technology that is specifically mentioned in the GDPR is Encryption. You are strongly encouraged to encrypt all data, at rest and in transit. This includes databases, documents and so forth. Some of this will be an Operations issue, but some of it is likely to impact the software itself.

One incentive to encrypt is that, if data is stolen, you may be exempt from the need to notify the ICO if the data was encrypted. The detailed rules can be complex and certain sectors have specific rules, including provisions for infrastructure providers and obligations under the Network and Information Security Directive. The key point is that encryption is encouraged and should be a key part of your defence strategy.

But let's just think about what that means in practice. Your Operations team can encrypt the database and files on disk, which protects against someone stealing the hard drive. But, if someone has legitimate access to the database itself (say, your support team), they will be able to take a copy of the data in an unencrypted state. Depending on your setup and your risk profile, you may need to consider encryption at the application layer for some data. This is obviously a big and complicated area with trade-offs, so you need to assess your specific situation.
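
As a rough sketch of what application-layer encryption can look like, the example below uses the third-party Python "cryptography" package (an assumption for illustration, not something the GDPR or this post mandates) to encrypt a sensitive field before it is written to the database, so anyone reading the database directly only ever sees ciphertext. Real key management (key vaults, rotation, access control) is deliberately left out.

```python
# Application-layer encryption sketch using the 'cryptography' package
# (pip install cryptography). The key would normally come from a key store,
# never be generated ad hoc or kept next to the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # illustrative only: load from a key store in practice
fernet = Fernet(key)

def encrypt_field(plaintext: str) -> bytes:
    return fernet.encrypt(plaintext.encode("utf-8"))

def decrypt_field(ciphertext: bytes) -> str:
    return fernet.decrypt(ciphertext).decode("utf-8")

stored = encrypt_field("07700 900123")   # this is what goes into the database
print(decrypt_field(stored))             # only code holding the key can read it back
```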

If you use SQL Server, be aware of the Always Encrypted feature, which uses some very clever technology to do application-layer encryption in a nearly transparent manner, especially on Azure. If you are on Azure you may also want to look at Key Vault's encryption tools to encrypt data you store in blob storage or, indeed, implement your own.

Retention

The GDPR further tightens the rules that mandate that you should only keep data for as long as you need it. The burden is on you to show why you are keeping data and, if you haven't got a good reason, you need to delete it.

When designing software, you should think about how you are going to handle the deletion of data and how you are going to automate that process.
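
For example, a scheduled job along these lines (table names and retention periods are invented for illustration, not recommendations) can enforce an agreed retention policy automatically rather than relying on someone remembering to clean up.

```python
# Automated retention sketch: delete rows older than the agreed retention
# period for each kind of data. Assumes SQLite and ISO-8601 created_at values.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_PERIODS = {
    "audit_log": timedelta(days=365),
    "abandoned_baskets": timedelta(days=30),
}

def purge_expired(conn: sqlite3.Connection) -> None:
    now = datetime.now(timezone.utc)
    with conn:
        for table, period in RETENTION_PERIODS.items():
            cutoff = (now - period).isoformat()
            # Table names come from the fixed dictionary above, not user input.
            conn.execute(f"DELETE FROM {table} WHERE created_at < ?", (cutoff,))
```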

Pseudonymisation and anonymisation

The GDPR introduces the term "pseudonymisation", which seems to mean making the data harder to relate to an actual person by removing or obscuring some of their personal identifiers.

It is not the same as anonymisation, though. For something to be considered anonymised, a determined person should not reasonably be able to identify the real person the data concerns, even when it is combined with other data sources.

As an example, imagine you store data about employees and remove names and addresses but keep their job title, department or job grade. You may have some departments, jobs or grades with only one or two people in them, so a determined person would be able to find the real person, even if it means pulling in data from another source.

Similarly, if you store Twitter handles or even IP addresses, then a determined person might be able to identify the real person behind them, so the data is not anonymised.

Truly anonymised data is not personal data and is therefore not covered by the GDPR.

Pseudonymisation is a way to reduce the risk to the fundamental rights and freedoms of people, but it doesn't make the data exempt from the GDPR.

You should pseudonymise data as a matter of course to reduce the risk to the people about whom you store data. You should still delete that data or fully anonymise it as soon as you can.
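
As an illustration of the difference, the sketch below pseudonymises a record by replacing the email address with a keyed hash (HMAC), with the secret key held away from the data set. The field names and key are made up; note the result is still personal data, and fields such as department and grade may still single someone out, so this is not anonymisation.

```python
# Pseudonymisation sketch: replace a direct identifier with a keyed hash.
import hashlib
import hmac

SECRET_KEY = b"keep-this-away-from-the-data"   # illustrative; store separately in practice

def pseudonymise(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "department": "Legal", "grade": 7}
record["email"] = pseudonymise(record["email"])
# department + grade might still identify the person, so this is pseudonymised,
# not anonymised, and remains within the scope of the GDPR.
print(record)
```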

See Recitals 26 and 30.

Particularly sensitive data

Just as in the old Data Protection Act, the GDPR lists a number of specific categories of data that require significantly more justification and safeguards to store and process, and the new list is wider:

Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited.

(Article 9).

The article goes on to list the circumstances in which this is not prohibited. The important thing is that such processing is prohibited by default, so you must not collect or use this data unless you have an approved justification.

If you store or process any data falling under those categories, you should seek specialist advice. It is also worth noting that Recitals 51 and 75 broaden this somewhat, making it much more risk-based. The Recitals are not part of the law, but it is probably not a good idea to read the list in Article 9 as an exhaustive list of sensitive data and assume that everything else is automatically fine.