A pregnant 16-year-old girl, carefully hiding that fact from her parents, came home one night to an irate father: her pregnancy was no longer a secret. Who revealed the secret she had worked so hard to keep? It turns out that Target, the retailer, was the culprit. Some months earlier, Target's data scientist had been asked to identify which of the company's customers were in their second trimester of pregnancy. Applying modern data analytics to customer purchase records, he produced just such a list, and the company used it to launch a campaign to attract these women to its stores. Unfortunately for this girl, the marketing piece intended for her was discovered by her father.
When this story broke, it was not only embarrassing for Target but also did significant damage to the company's brand. It is one example of a growing number of reported breaches of public trust, and the increasing frequency and severity of these incidents have made privacy a hot topic for executive management and a new responsibility for the CIO.
The exponential growth of projects involving big data, customer profiling, and data monetization means collecting and processing ever-increasing amounts of customer data. This data is then copied between systems, mixed, analyzed, mined, chopped, and shaped by staff all across the organization. Through these processes, organizations are deriving increasing value from data and tying their future success ever more tightly to it.
At the same time, consumers are increasingly concerned with the protection and appropriate use of their personal data. You can see evidence of this in the steady drumbeat of news from major outlets covering the issue; you can see it in the public uproar over each new corporate breach of trust that comes to light; and you clearly saw it in the hypersensitive response to each Snowden revelation as that story unfolded. The level of attention to, and the intensity of public consciousness over, data protection issues is unprecedented.
The combination of the increasing value of data and the public's rising sensitivity to uses of that data is sharply raising the risk of a privacy incident. Such incidents invite regulatory action and undermine the organization's reputation. As a result, demand for accountability with respect to privacy protection is growing, and organizations are struggling to find someone to take ownership of the issue.
Our governance, risk, and compliance (GRC) functions, comprising legal, audit, compliance, risk, IT, and security, each handle their own responsibilities in symbiosis with their colleagues to ensure the organization stays on the straight and narrow.
However, when it comes to these new breaches of trust, our traditional GRC architecture identifies no single entity responsible for mitigating the risk.
Incidents like Target's outing of a 16-year-old girl's pregnancy to her parents are clear examples of breaches of trust that fall between traditional GRC functions. More recently, Facebook published the results of a mood experiment conducted by its data scientists. The scientists set out to prove that the tone of the news feed read by Facebook users (either predominantly positive or predominantly negative) would have a contagion effect on the reader's mood. They conducted an experiment manipulating the tone of the news feed for more than three-quarters of a million Facebook users, and found that, in fact, the editorial tone of the news feed affects users' emotional states.
Not surprisingly, people were upset when they learned about this experiment; most people do not like their web services manipulating their emotions. Like the Target incident, this was a significant public embarrassment.
In each of these cases, our traditional GRC functions have no clear point of accountability for mitigating the risk of such an incident, and so at most organizations executive management is looking for someone to accept this new responsibility.
Unwittingly, the CIO has raised their hand. For decades, the CIO has fought for the authority to be the center of all data flows in the organization, arguing that it is necessary to accomplish their mission. Now, in the wake of the rash of information security breaches and their associated costs at Target, Home Depot, Michaels, JP Morgan, Anthem, and many others, the argument is over and the CIO has been granted that authority. Today, at most organizations, executive management expects all data flows to go through the functions reporting to the CIO.
With this authority comes the responsibility to ensure that data is not only protected but also used appropriately and ethically, which is the heart of privacy risk mitigation. This requires the CIO to shore up efforts already under way as well as acquire a new set of skills and resources.
Though often neglected, proper privacy risk mitigation, like most compliance efforts, requires a comprehensive understanding of the organization's data landscape. Building that understanding is the first and fundamental step; only with a clear picture of the data landscape can the real privacy risk mitigation work begin.
That work starts with inculcating a culture of privacy throughout all levels of the organization. This is no small task, but an effort spearheaded by the CIO among the employees under her purview will have the greatest impact. Engineers, security professionals, IT administrators, and other staff under the CIO can all serve as effective issue spotters and flag raisers. These individuals know where data is collected, where and how it is stored, who has access to it, and for what purposes. For that reason, if properly trained and motivated, these same individuals can ask the questions that nip potential issues in the bud:
Did these pregnant women tell us they are pregnant? Did these users opt in to the experiment? Why are we collecting precise location data? This cultural shift takes time and effort, so it is critical that the CIO recognize and accept these new responsibilities sooner rather than later. To raise the necessary army of issue spotters and flag raisers, and to inculcate a culture of privacy, the modern CIO needs to lead training and education efforts, such as those offered by the IAPP, across their staff and the wider organization. CIOs who don't recognize this new responsibility are gambling that their organization won't be the next major headline for a privacy violation.