This week’s post is aimed at dealing with the unavoidable uncertainty of processing private data. No matter how careful you are, you can always have a problem. No matter how much time and money you spend on precautions, you never have certainty that everything is secure. If you can’t guarantee perfection, then what will be considered good enough?
Earlier this week I attended a lively GDPR conference organized by a group called Wisdom of Crowds (link link). One of the most common sentiments that I heard from attendees was surprise at the large number of organizations that, with GDPR enforcement only 7 months away, have still done little or nothing to move toward compliance.
While I do not claim comprehensive insight into this insouciance among large data controllers, I suspect that part of the reason is that, in many organizations, no individual will personally face significant penalties when a privacy-related problem occurs. This is clearly the case with public-sector data controllers, given that in most Western countries public employees are protected from dismissal as long as they show up for work. In theory, they can be dismissed, but it usually requires a court case, and in practice very few such employees are dismissed.
Given that the public sector accounts for a large proportion of personal-data processing, the prospect of a bad audit report, a fine, or even a data breach is not likely to frighten a decision-maker. It’s one thing to assess a fine, for example, but who pays it? If the organization is your local government (e.g., link, link, link, link), then presumably you pay the fine through either higher taxes or decreased public services. In other words, you pay the fine for having your own data breached.
Consider the case of Sweden, whose transport authority exposed the personal data of its entire vehicle registry, including police, soldiers, and protected witnesses (link). What would the Swedish data-protection authority do under the GDPR: fine the Swedish transport authority? (As it happened, the transport director, who had ordered the information to be sent to an outsourcing company, was fined 2 weeks’ pay.)
The situation is little different for private companies that cannot be allowed to fail for practical or political reasons, such as banks, utilities, transport operators, or even just companies with thousands of employees. No such organization will be hit with anything approaching a crippling fine; after all, the host government would be forced to intervene.
As a practical matter, such a fine would never be attempted in the first place, and everyone in a position to make decisions about the company’s GDPR readiness knows it. An executive has much more to fear from missing her revenue targets, or from a promised software application that is badly behind schedule or over budget, than from a privacy problem.
This effect was famously noted by John Maynard Keynes. In his book The General Theory of Employment, Interest, and Money, Keynes observes the following about portfolio managers, those who invest money on behalf of other people:
Worldly wisdom teaches that it is better for reputation to fail conventionally than to succeed unconventionally.
In other words, it is better to fail in a crowd than to succeed in a novel way. Better for whom, you might ask; clearly not for the investor whose funds are being managed, but for the portfolio manager, who wants to keep her job. Your incentives depend on your position in the system. This is the essence of the problem; in most situations, we have official concerns (personal privacy, protecting my company) and personal concerns (keeping my job, getting a raise). (The primary exception is the self-employed entrepreneur, whose company’s fate is also the fate of her wealth.)
So it is as well for a large class of data controllers, who will not, and often cannot, be penalized for decisions that make privacy problems more likely. Budgets are limited, headcount is limited, and current systems were built without privacy constraints in mind. There is a deep well of excuses to draw from, and these excuses will be plausible enough to provide an effective defense.
What about those data controllers who are not explicitly or implicitly protected? Here I would include medium-sized private companies within the EU, and large organizations outside the EU (especially social media, cloud vendors, or data-driven advertisers such as Google) that are likely to be the first targets of fines and court cases. Any significant privacy sanction against a major data controller or processor will be a political decision as well as a regulatory one.
I don’t know where this leaves us, but I hope that the EU has a plan, should next May come and go without much compliance in place.
My post for today revives the now-forgotten subject of software-process disciplines, in this case the Capability Maturity Model (CMM) (link), and proposes it as a sustainable, repeatable, and justifiable route to privacy compliance. The CMM’s method stands in sharp contrast to the most common current methods, which generally travel under the label ‘Agile’ (link) and which, I have found, more often than not serve as a nice-sounding synonym for ad-hoc development. Continue reading “A privacy maturity model, Part 1: Requirements”
It seems intuitively obvious that, without good security (encryption, access management, firewalls), private data is at risk. What is less obvious is that the reverse is also true: data leaks enable security breaches.
In spear-phishing, you are the phish
The Belgian DPA (hereafter referred to as the CPP, reflecting its official name, the Commission for the Protection of Privacy) has issued GDPR-preparation recommendations in the form of 13 guidelines for companies processing personal data. The English-language summary below is taken from a law-firm website (link) (the originals were issued in French (link) and Dutch (link) only).
The good news is that the CPP has now set some priorities, giving data controllers and processors an idea of what the auditors will be looking for in the early days after the GDPR’s effective date. The bad news, at least as I see it, is not only that the guidance is mostly vague, but also that it seems to embed an assumption that all of this is feasible within a short time frame. Continue reading “Belgian DPA recommends taking 13 steps now”
Today I want to draw attention to a possibility that I have not seen raised elsewhere, namely, that the Internet will turn out to be un-securable. If true, one might expect the un-secure Internet to bring an increase in damaging (even catastrophic) security and data breaches. Continue reading “What if the Internet cannot be secured?”
Suppose it’s your first day on the job as a new GDPR consultant at a company. So far most of the company hasn’t done any preparation, but you’ve met with the legal team and found that they’ve studied the legislation and guidance. How can they help you at this early stage?
For one thing, your legal team can help you define what kind of artefacts (IT slang for any tangible piece of evidence; it can be any sort of file, email, paper document, etc.) will be useful in demonstrating a good-faith effort to comply with the legislation. What can you create that will be useful for your legal team?
You might want to draw some hypothetical scenarios, such as: Continue reading “Building a history of compliance efforts”
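As one illustration of the kind of artefact a legal team might welcome, the sketch below keeps a hash-chained, timestamped log of compliance documents, so that a later auditor can check that the record of your efforts has not been quietly rewritten. This is purely a hypothetical example of my own; the function names, log format, and the idea of using a hash chain for this purpose are my assumptions, not anything prescribed by the GDPR or by any guidance.

```python
import hashlib
import json
from datetime import datetime, timezone

def record_artefact(log, description, content):
    """Append a tamper-evident entry describing a compliance artefact.

    Each entry stores the SHA-256 of the previous entry, so altering
    any earlier record invalidates every entry that follows it.
    """
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "description": description,
        "content_sha256": hashlib.sha256(content.encode()).hexdigest(),
        "prev_hash": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Recompute every hash; return True only if the chain is intact."""
    prev = "0" * 64
    for entry in log:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True

# Hypothetical artefacts: a data-protection impact assessment and an
# approval email, logged as they are produced.
log = []
record_artefact(log, "DPIA for CRM migration", "...document text...")
record_artefact(log, "Email approving retention policy", "...email text...")
assert verify(log)
```

The point is not the particular mechanism but the habit: artefacts become far more persuasive as evidence of good faith when they carry dates and integrity checks that a third party can verify.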
Privacy invasion is now one of our biggest knowledge industries. Marshall McLuhan (1911-1980)
I would update McLuhan’s observation to say that privacy invasion is our biggest knowledge industry, period. The struggle to preserve some core of privacy will likely continue through the lives of all people now living, and will result in changes to every aspect of human life.
We who work at the forefront of privacy regulation have a front-row seat for this epic drama. We will see the early results, whether it is the GDPR slowing the currently uncontrolled traffic in data and creating a walled privacy garden within the EU, or whether the rising tide of data, like rising sea levels, simply swamps the defenses. Continue reading “Who says data privacy is a dull subject?”
The fallout from the Equifax case has already begun. Those of you who may be affected by this breach should read this article to guard against some of the scams that are already being thrown at data subjects:
For those not affected, the article is still worth reading to get a small sample of the variety of attacks that follow leaked data.
In other news, we find that a lawsuit against the U.S. government over a leak of some 22 million employees’ records has been dismissed (link) on the grounds that the aggrieved data subjects (here represented by their labor unions) have not proven that they were harmed by the breach.
In a normal case of injury or damage, proof of cause and effect is a standard requirement. But how could the plaintiffs possibly prove that damaging effects were caused by this particular breach? After all, the government could point out that a given harm was possibly caused by the Equifax breach, or by some other breach entirely. In other words, the traditional rules of evidence and inference are of little use in the age of internet crime.
Stolen data is effectively anonymous. When you are attacked, you have no idea where the attacker obtained his data; even the attacker himself may not know the seller’s identity. The data in question might even have been legally obtained; many businesses (such as Equifax) exist precisely to collect and sell it.
Unlike money, data does not need to be laundered; it’s already untraceable. Unlike money, it can be copied and distributed without limit. Exposing one’s personal data on the internet is like exposing one’s body to radiation: the effects are cumulative, and for life.
At the moment the technical, legal, political, and other capabilities are not in place to stop this trend, nor even to slow it down by very much. Our technology is running ahead of our ability to deal with its consequences.
In this post I will discuss an Oracle presentation on how its product line provides the technical means for GDPR compliance (link). Although I am impressed with the presentation, my main reason for introducing it here is to be able to refer to it when discussing different stages of GDPR measures. Simply reading through this document will give you an idea of the range of technical measures necessary for good compliance at the database level.
In particular, I like the 3-page Appendix at the end of the document, which lists various GDPR articles and the Oracle feature that helps you to comply with each one. If you’re considering a packaged solution, your vendor should be able to present a similar mapping of GDPR requirements to product features. Continue reading “Considering an end-to-end GDPR solution: Oracle”