Step 9 of the Belgian Privacy Commission’s guide to getting started with GDPR compliance concerns detecting, analyzing, and dealing with the fall-out of a data breach. The recent recall and re-issue of Estonia’s smartcard IDs brought home to me that public relations (PR) planning is an essential part of breach preparation, not to protect the public’s privacy (PR can’t do anything to remedy a breach), but to mitigate the reputational damage to the firm. Continue reading “Data-breach preparation: don’t forget the PR”
This week’s post is aimed at dealing with the unavoidable uncertainty of processing private data. No matter how careful you are, you can always have a problem. No matter how much time and money you spend on precautions, you never have certainty that everything is secure. If you can’t guarantee perfection, then what will be considered good enough?
Earlier this week I attended a lively GDPR conference organized by a group called Wisdom of Crowds (link link). One of the most common sentiments that I heard from attendees was surprise at the large number of organizations that, with GDPR enforcement only 7 months away, have still done little or nothing to move toward compliance.
While I do not claim to have a comprehensive insight into this insouciance by large data controllers, I have the feeling that part of the reason is that, in many organizations, no individual will be significantly penalized personally in the case of a privacy-related problem. This is clearly the case with public-sector data controllers, given that in most Western countries public employees are protected from dismissal as long as they show up for work. In theory, they can be dismissed, but it usually requires a court case, and in fact very few such employees are dismissed.
Given that the public sector accounts for a large proportion of personal-data processing, the prospect of a bad audit report, a fine, or even a data breach is not likely to frighten a decision-maker. It’s one thing to assess a fine, for example, but who pays it? If the organization is your local government (e.g., link, link, link, link), then presumably you pay the fine through either higher taxes or decreased public services. In other words, you pay the fine for having your own data breached.
Consider the case of Sweden, whose transport authority exposed the personal data of its entire vehicle registry, including police, soldiers, and protected witnesses (link). What would the Swedish data-protection authority do under GDPR, fine the Swedish transport authority? (As it happened, the transport director, who had ordered the information to be sent to an outsource company, was fined 2 weeks’ pay.)
The situation is little different for private companies that cannot be allowed to fail for practical or political reasons: banks, utilities, transport operators, or even any company with thousands of employees. No such organization will be hit with anything approaching a crippling fine; after all, the host government would be forced to intervene.
As a practical matter, such a fine would never be attempted in the first place, and everyone who is in a position to make decisions about the company’s GDPR readiness knows it. An executive has much more to fear, for example, if she misses her revenue targets, or the promised software application is badly behind schedule or over budget than if there’s a privacy problem.
This effect was famously noted by John Maynard Keynes in his book The General Theory of Employment, Interest, and Money, where he observes the following about portfolio managers, those who invest money on behalf of other people:
Worldly wisdom teaches that it is better for reputation to fail conventionally than to succeed unconventionally.
In other words, it is better to fail in a crowd than to succeed in a novel way. Better for whom, you might ask; clearly not for the investor whose funds are being managed, but for the portfolio manager, who wants to keep her job. Your incentives depend on your position in the system. This is the essence of the problem; in most situations, we have official concerns (personal privacy, protecting my company) and personal concerns (keeping my job, getting a raise). (The primary exception is the self-employed entrepreneur, whose company’s fate is also the fate of her wealth.)
So it is as well for a large class of data controllers, who will not, often cannot, be penalized for decisions which make privacy problems more likely. Budgets are limited, headcount is limited, current systems were built without privacy constraints in mind. There is a deep well of excuses to draw from, and these excuses will be plausible enough to provide an effective defense.
What about those data controllers who are not explicitly or implicitly protected? Here I would include medium-sized private companies within the EU, and large organizations outside the EU (especially social media, cloud vendors, or data-driven advertisers such as Google) that are likely to be the first targets of fines and court cases. Any significant privacy sanction against a major data controller or processor will be a political decision as well as a regulatory one.
I don’t know where this leaves us, but I hope that the EU has a plan, should next May come and go without much compliance in place.
My post for today takes up the now-forgotten subject of software-process disciplines, in this case the Capability Maturity Model (CMM) (link), and proposes it as a sustainable, repeatable, and justifiable route to privacy compliance. The CMM’s method is a sharp contrast to the most common current methods, which generally come under the label of ‘Agile’ (link), which, I have found, more often than not serves as a nice-sounding synonym for ad-hoc development. Continue reading “A privacy maturity model, Part 1: Requirements”
It seems intuitively obvious that, without good security (encryption, access management, firewalls), private data is at risk. What is less obvious is that the reverse is also true: data leaks enable security breaches.
In spear-phishing, you are the phish
The Belgian DPA (hereafter referred to as the CPP, to reflect its official name, the Commission for the Protection of Privacy) has issued GDPR-preparation recommendations in the form of 13 guidelines for companies processing personal data. The English-language summary below is taken from a law-firm website (link) (the originals were issued in French (link) and Dutch (link) only).
The good news is that the CPP has now set some priorities, giving data controllers and processors an idea of what the auditors will be looking for in the early days after the GDPR’s effective date. The bad news, at least as I see it, is not only that the guidance is mostly vague, but also that there seems to be an embedded assumption that all of this is feasible within a short time frame. Continue reading “Belgian DPA recommends taking 13 steps now”
Today I want to draw attention to a possibility that I have not seen raised elsewhere, namely, that the Internet will turn out to be un-securable. If true, one might expect an insecure Internet to bring an increase in damaging (even catastrophic) security and data breaches. Continue reading “What if the Internet cannot be secured?”
Suppose it’s your first day on the job as a new GDPR consultant at a company. So far most of the company hasn’t done any preparation, but you’ve met with the legal team and found that they’ve studied the legislation and guidance. How can they help you at this early stage?
For one thing, your legal team can help you define what kind of artefacts (IT slang for any tangible piece of evidence; it can be any sort of file, email, paper document, etc.) will be useful in demonstrating a good-faith effort to comply with the legislation. What can you create that will be useful for your legal team?
You might want to draw some hypothetical scenarios, such as: Continue reading “Building a history of compliance efforts”
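One way to make such artefacts useful as evidence is to give each one a timestamped, tamper-evident entry in a compliance log. As a minimal sketch of the idea (the function, file names, and log format here are my own invention for illustration, not anything the GDPR or the legal team prescribes):

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_artifact(path: Path, log_path: Path, note: str) -> dict:
    """Append a tamper-evident entry for a compliance artefact to a JSON-lines log."""
    # Content hash proves the artefact hasn't changed since it was logged.
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    entry = {
        "file": str(path),
        "sha256": digest,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "note": note,
    }
    # One JSON object per line, append-only: a cheap running history.
    with log_path.open("a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

Each log line ties a document’s content hash to a date, which is an inexpensive way to show that, say, an impact assessment or design review existed in a given form at a given time.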
Privacy invasion is now one of our biggest knowledge industries. Marshall McLuhan (1911-1980)
I would update McLuhan’s observation to say that privacy invasion is our biggest knowledge industry, period. The struggle to preserve some core of privacy will likely continue through the lives of all people now living, and will result in changes to every aspect of human life.
We who work at the forefront of privacy regulation have a front-row seat on this epic drama. We will see the early results, whether it is GDPR slowing the currently-uncontrolled traffic in data and creating a walled privacy garden within the EU, or whether the rising tide of data, like rising sea levels, simply swamps the defenses. Continue reading “Who says data privacy is a dull subject?”
As I have mentioned in passing, it is my firm belief that an essential part of GDPR compliance is documentation. If your documentation is sketchy, out-of-date, or vague (consisting, say, of emails and slideshows), how will you show that you have privacy by design, and that you enforce and verify this requirement at every stage in the development process?
Once you’ve created all these documents, how will you find what you’re looking for? Have you ever wasted time trying to find information among hundreds of files in a shared folder or SharePoint? If so, you already know that these tools have limited ability to search inside documents. Continue reading “Searching documents with Oracle Text, part 1”
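For contrast, here is roughly what searching inside documents looks like without a text index: a brute-force scan over a folder. It works, but every query re-reads every file, which is exactly the limitation an indexing tool like Oracle Text is built to remove. (The function and folder layout here are illustrative, not part of any real system.)

```python
from pathlib import Path

def grep_folder(folder: Path, term: str) -> list[Path]:
    """Return files under `folder` whose text contains `term` (case-insensitive).

    Every call re-reads every file -- cost grows with total bytes stored,
    which is why full-text indexes exist.
    """
    term = term.lower()
    hits = []
    for path in sorted(folder.rglob("*.txt")):
        if term in path.read_text(encoding="utf-8", errors="ignore").lower():
            hits.append(path)
    return hits
```

With a few hundred documents this is tolerable; with a compliance archive accumulated over years, each search becomes a full scan of the archive.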