Unless you have been living under a rock, the recent woes of tech and social media giant Facebook have dominated news cycles on days when chemical weapons, nuclear proliferation, and North Korean summits might otherwise capture the American consciousness. The headline:
Facebook Doesn’t Care About Users’ Privacy
(and they probably never did).
In fact, the company misappropriated the personal data of more than 85 million users, and then took a couple of years to admit it.
The blunder has not gone unnoticed by other tech titans. At the China Development Forum in Beijing this March, Apple CEO Tim Cook offered his take on the situation at hand and on how data affects human lives:
“This certain situation is so dire and has become so large that probably some well-crafted regulation is necessary. The ability of anyone to know what you’ve been browsing about for years, who your contacts are, who their contacts are, things you like and dislike, and every intimate detail of your life; from my own point of view, it shouldn’t exist.”
Needless to say, Zuckerberg has taken a beating; so too have Facebook shares, suffering a precipitous drop of 4.4% since the Cambridge Analytica scandal came to light. While many Facebook users are now looking more closely at their app settings, others have left the platform entirely. Millennials were already less enthralled with Facebook than with newer social media platforms like Instagram (also owned by Facebook) and Snapchat, so a loss of active users over these privacy concerns is a black eye on, arguably, the world’s most visible social media platform. And it is costing the company money, not only in market value but also in ad sales.
Ransomware attacks are among the most serious and prevalent threats to data. Ransomware is best understood as a type of malicious software that either publishes or blocks access to information until a “ransom” is paid. Ransomware attacks have grown in complexity (and the difficulty of reversing them along with it), and encrypting files to make them inaccessible until the ransom is paid poses real problems for organizations that store massive amounts of personal data. One of the latest attacks hit Rochester, Minnesota-based Associates in Psychiatry and Psychology (APP) on March 31, 2018. The attack affected patient information for 6,546 individuals; thus far, it appears that the information was not in a “human-readable” format and that the protected health information was neither accessed nor copied by the attackers.
Ransomware attacks like this speak to the need for information governance and vital records programs. An exhaustive list of the information potentially accessed has not been released, but it likely included a range of patient data.
APP responded promptly to the attack, taking its systems offline. Doing so in a timely manner likely stopped the spread of the attack and limited the encryption of personal data and the data theft that would have completed the “ransom” aspect of the ransomware attack.
Elon Musk and Tesla were in the news once more. Recent tweets about taking the company private with funding secured sparked some head-scratching, along with questions about the implications of a tweet deemed a false statement made in pursuit of inflating Tesla’s worth.
Public tweets by principals of publicly traded companies about finances pose interesting compliance questions. Does sharing information about bankruptcy, or about the funds necessary to take a company private, constitute some kind of breach in the eyes of the SEC? That remains to be seen, as the agency has been mum on the topic thus far. However, if Musk or Tesla were unwilling, or unable, to provide regulatory filings outlining a deal to go private, that would pose a quandary for how compliance regulation is carried out in this ever-evolving business landscape.
Either way, we bet a lot of folks are wishing they had some Tesla stock right about now.
News of this privacy breach comes in the wake of the outing of Cambridge Analytica and the role they may have had in politically-oriented ads. To hear Facebook speak about it publicly, they have been trying to combat the rise of fake news (despite famously denying they played any role in the election).
The primary driver of the fake news that plagued Facebook and spilled into casual conversation was ungoverned, false information that lingered because of the clicks it generated, not because the stories had been appropriately sourced and verified. Fake news is best understood as giving oxygen to an idea regardless of its veracity: such stories were likely accelerated and allowed to fester because of ineffective third-party fact-checking efforts. Without access to the results of those third-party fact-checkers, we cannot understand the effect in its totality.
You might be wondering why we can’t apply more pressure on Facebook to turn over the data, or at least share the full results. The reason will irritate you: as a private corporation, Facebook is not obligated to release the data. That is why GDPR-like legislation is being proposed in the U.S. at the state and national levels. Facebook claims that by not revealing this internal data, it avoids the risk of revealing private user data, which is hypocritical at best given what we know now about the Cambridge Analytica scandal.
Can we genuinely believe that Facebook is really prioritizing privacy, as its new ads suggest?
In November 1999, as the rest of the United States dealt with the cultural and cult-prophetic aspects of Y2K, the Institute of Medicine published a report titled To Err is Human: Building a Safer Health System. The report began with these alarming words: “As many as 98,000 people die in hospitals each year as a result of medical errors that could have been prevented.” Reading that people die because of human error in the very places that should be curing us is jarring. The reality is that humans make errors. Even in life-and-death situations, humans cannot escape their “humanness.”
This basic human condition was an underlying point of the report. Standing on its own, To Err is Human was also a call to arms. It is probably no coincidence that building a safer American health system seemed visionary in 1999. After all, this was a time of uncertainty: the dot-com bubble was swelling toward its burst, and the “end of the world” dogma surrounding Y2K was at its peak. Many Americans felt that the end was near. However, at the close of the twentieth century, some computer scientists saw the benefits of electronic information to the healthcare industry. Although most of the now-famous To Err is Human report focused on human-caused errors, it also listed how and what computers could do to help humans make fewer mistakes.
Without actually saying it, To Err is Human can be considered an early attempt at conceptualizing potential IG principles that would reduce fatal medical errors. Donaldson, Corrigan & Kohn (2000) promoted a “system-oriented approach” that “involves a cycle of anticipating problems” and “tracking and analyzing data as errors and near misses occur.” The identified data could then be used inside the system to “modify processes to prevent further occurrences.” However, it was not the U.S. that first promoted IG in healthcare; instead, it was the U.K.’s National Health Service that introduced the NHS IG Toolkit to its professionals in 2002-2003.
Analytics is becoming a competitive edge for organizations. Once a “nice-to-have,” analytics is now mission-critical.
An August 6, 2009, New York Times article titled “For Today’s Graduate, Just One Word: Statistics”[1] reminds me of the famous advice given to Dustin Hoffman’s character in his career-breakthrough movie, The Graduate. A self-righteous Los Angeles businessman takes aside the baby-faced Benjamin Braddock, played by Hoffman, and declares, “I just want to say one word to you—just one word—‘plastics.’” Perhaps a remake of the movie will substitute the word “analytics” for plastics.
The use of analytics (which includes statistics) is a skill gaining mainstream value due to the increasingly thin margin for decision error. Organizations need to draw insights and inferences from the treasure chest of raw transactional data that so many of them have stored (and continue to store) in digital form. Organizations are drowning in data but starving for information.
“The application of analytics is becoming commonly accepted, but will senior executives realize it?”
How do executives and managers mature in applying accepted methods?
Managers today are maturing in applying progressive managerial methods. Consider this. Roughly 50 years ago, CEOs hired accountants to do the financial analysis of a company, because this was too complex for them to fully grasp. Today, all CEOs and mainstream businesspeople know what price-earnings (PE) ratios and cash flow statements are, and that they are essential to interpreting a business’ financial health. They would not survive or get the job without this knowledge.
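The price-earnings ratio mentioned above is simple arithmetic: share price divided by earnings per share. As a minimal sketch of that calculation (all figures below are hypothetical, chosen only for illustration):

```python
def price_earnings_ratio(share_price: float, earnings_per_share: float) -> float:
    """Price-earnings (P/E) ratio: what investors pay per dollar of earnings."""
    if earnings_per_share <= 0:
        # P/E is conventionally reported as undefined for unprofitable companies.
        raise ValueError("P/E is undefined for zero or negative earnings")
    return share_price / earnings_per_share

# Hypothetical company: $300 share price, $12 annual earnings per share.
pe = price_earnings_ratio(share_price=300.0, earnings_per_share=12.0)
print(f"P/E ratio: {pe:.1f}")  # P/E ratio: 25.0
```

A higher P/E means investors are paying more for each dollar of current earnings, which is why the ratio is a quick shorthand for how the market values a business relative to its profitability.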
Twenty years ago, CEOs of companies didn’t have computers on their desks. They didn’t have the time or skill to operate these complex machines and applications, so they had their secretaries and other staff do this for them. Today, you will become obsolete if you don’t at least personally possess multiple electronic devices (such as laptops, mobile phones, BlackBerrys, and PDAs) to have the information you need at your fingertips.
Copyright © 2019 Privacy Today & InfoGov World Media LLC- All Rights Reserved.