Implementing appropriate data privacy is critical for company operations and success. Learn about the challenges involved and the solutions recommended to do the job right.
Data privacy is more crucial than ever in this era of remote work and home offices. Companies must leverage multi-factor authentication (MFA), complex passwords, idle-session timeouts, and strict security controls, while employees should lock their screens when not in use, never permit unauthorized individuals to use or access company equipment, and be given only the minimum permissions needed to do their jobs.
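The idle-session timeout mentioned above can be illustrated with a few lines of code. This is a minimal sketch in Python, assuming a hypothetical in-memory session object; real deployments would rely on their web framework's or identity provider's session handling rather than rolling their own.

```python
import time

IDLE_TIMEOUT_SECONDS = 15 * 60  # example policy: 15 minutes of inactivity


class Session:
    """Hypothetical session record used to enforce an idle timeout."""

    def __init__(self, user_id):
        self.user_id = user_id
        self.last_activity = time.monotonic()

    def touch(self):
        # Call on every authenticated request to reset the idle clock.
        self.last_activity = time.monotonic()

    def is_expired(self):
        # The session becomes invalid once the user has been idle too long,
        # forcing re-authentication (e.g. via MFA) on the next request.
        return time.monotonic() - self.last_activity > IDLE_TIMEOUT_SECONDS
```

Using `time.monotonic()` rather than wall-clock time keeps the timeout immune to system clock adjustments.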
Good data privacy isn’t just a set of policies and procedures; it’s a philosophy in itself. I spoke to cybersecurity philosopher and implementer Eve Maler, interim CTO at identity platform provider ForgeRock.
Scott Matteson: What are the challenges involved in data privacy?
Eve Maler: The two most important challenges in my opinion are:
1. The use of private data for making decisions or drawing conclusions through large-scale data processing. We know that processing private data through advanced algorithms, such as deep learning, is not deterministic, is known to be biased, and is mostly not completely understood. For example, these algorithms can lead to people being incorrectly tagged as insurgents or becoming targets of influence campaigns through advertisements. Unfortunately, there are no immediate solutions beyond banning specific targeted advertising campaigns or inhibiting large-scale data processing, much as Amazon, Microsoft, and IBM have done by recently stepping away from promoting, selling, and supporting their facial recognition technologies.
2. Private data is highly prized on the black market, which leads to it being actively pursued by opportunistic attackers for economic gain. Organizations need to ensure a minimum level of care when collecting, storing, processing, and sharing private data, as well as in securing the environments that handle sensitive and private data. Regulations such as the General Data Protection Regulation (GDPR) are a step in the right direction given the penalties when abuse of private data is detected.
Data privacy today involves building a pyramid of solutions. Data protection is the first layer in the pyramid: This is where you work on the security of personal data. The second layer is data transparency: Here you need to inform people what you collected and want to collect about them and how you use it. The third layer is data control: Giving consumers choice and authority over what is collected about their own lives.
Regulations, such as the EU GDPR and the California Consumer Privacy Act (CCPA), have been enacted in order to hold companies more accountable than ever before for providing greater protection, transparency, and control to consumers over personal data.
Scott Matteson: What are the solutions available?
Eve Maler: There are four steps companies can take to strengthen consent management and earn trust when handling precious consumer data in 2020:
1. Identify where digital transformation opportunities and user trust risks intersect. Users are more skeptical these days, but organizations can analyze those trust gaps to discover new data privacy opportunities for their consumers as they look to digitally transform.
2. Consider personal data a joint asset. It's easy for the risk leads within a company to say that data subjects own their own personal data, but business leaders have incentives to leverage that data for the value it brings to their business model, which changes the equation. All the stakeholders within an organization need to come together and treat data as a joint asset in which all parties, including consumers themselves, have a stake.
3. Lean into consent. A business often has a choice to offer consent to end users rather than simply taking data. Seek to offer that option; there are benefits when building trust with skeptical consumers.
4. Take advantage of consumer identity and access management (CIAM) for building trust. Identity management platforms automate and provide visibility into the entire IAM lifecycle, all while allowing end users to retain control over their own profiles, passwords, privacy settings, and personal data.
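The "lean into consent" step above can be made concrete with a minimal consent-record model. This is a hypothetical sketch in Python, not the API of any specific CIAM product; the field names and the `may_process` gate are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    """Hypothetical record of one user's consent for one purpose."""
    user_id: str
    purpose: str  # e.g. "marketing_email", "analytics" (example purposes)
    granted: bool
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    revoked_at: Optional[datetime] = None

    def revoke(self):
        # Consumers retain control: consent can be withdrawn at any time.
        self.granted = False
        self.revoked_at = datetime.now(timezone.utc)


def may_process(records, user_id, purpose):
    """Allow processing only when an active, unrevoked consent exists."""
    return any(r.granted and r.user_id == user_id and r.purpose == purpose
               for r in records)
```

Gating every processing path on an explicit, timestamped, revocable record is what turns "transparency and control" from a policy statement into an enforceable check.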
Scott Matteson: How does data privacy differ for consumers versus businesses?
Eve Maler: Consumers love taking part in the connected world, but to do so, they must share personal data. Millions of people are unaware of how their personal information is being collected, used, or shared in our digital society. Regulations like GDPR and CCPA put a premium on gathering consent from individuals, empowering them to take control over their data.
Consumers should protect their own private data by not oversharing, or at least by pausing to consider whether sharing is worth it. There is no such thing as free beer! If something is free, you will pay for it in some other way, and that payment usually takes the form of sharing personal and private information about yourself.
For businesses, implementing data privacy regulations, such as GDPR or CCPA, should be viewed as an opportunity to build trust with consumers. Data transparency and data control enhance the relationship businesses have with consumers. Businesses should deploy comprehensive identity management and robust consent management systems, not only to provide a first line of defense for protecting consumer data but also to strengthen the bonds of digital trust for all service users.
Businesses should respect and take care of all information that is requested from private persons. Only ask and store what is really necessary, document the processing of that data, and openly communicate it. Businesses need to build a trust relationship with their customers and that will, in current times, only work if customers feel you are treating their data with respect. Again, regulations such as GDPR go a long way in requiring this from organizations that store and process data from EU citizens.
Scott Matteson: I myself have advocated for years that people not take those social media quizzes to see what kind of flower they are, or which reveal private information to others. I’ve also instructed my kids never to create online accounts with any personal data such as their date of birth. Speaking of social media, I see a growing trend among many to abandon social media entirely. Is it too drastic of a step? If so, what are some options people can take to be able to use social media with minimal risk?
Eve Maler: Social media can be toxic and can be abused for influence campaigns. Let's not forget the roots and the reason for the success of most social media platforms: the curious nature of humans. We always like to know everything and more, and coupled with our need to replace the lack of social contact in our busy and fast-paced lives, this has led to booming online communities. These platforms have enabled more global interactions and fast-paced news and communications, which consumers have become accustomed to. When they need information, that information is preferably personalized and consumed on demand in a very convenient way; hence the success of advertisements, influence campaigns, and news on social media platforms such as Facebook.
This year, governments are going to continue to put pressure on the tech giants, which will respond by trying to self-regulate to overcome increasing laws that threaten their business models. The privacy hits are going to continue for social and tech giants, and they are going to continue to prove that they don’t deserve consumers’ trust.
The big social networks have more to fear than privacy laws. Greater attention will be paid to dark patterns in 2020, which will encourage legislators and regulators alike to pay broader attention to antitrust and consumer protection threats. Consumers will not leave their social networks, but we'll see increased consumer protection laws as a result.
As consumers move toward a personalized experience while seeking a real measure of privacy, they should take advantage of the privacy options that their social network provides and be especially careful about connecting third-party applications.
Scott Matteson: How will privacy concerns/remedies be shaped in the future?
Eve Maler: The dynamics of this age-old debate are complex, but without opposition calling for greater regulation, we would be looking at a future where there is little to no privacy and all of our thinking is influenced by a select group of people. Even YouTube streamers nowadays call themselves influencers. This simple change in how people refer to themselves clearly reflects a greater trend for the future, and we should be aware of it and try to strike a balance: customized news without providing so much insight into ourselves that we invite manipulative information. I'm still hoping for a future where responsible people are leading the world and doing good for their citizens by protecting them from excessive privacy intrusion and social tracking.
Privacy remedies and concerns have already begun to shape how enterprises feel toward improving data inventories and data hygiene controls. The US lacks a digital single market around privacy laws. As a result, we are suffering, and we are under pressure to create better regulatory efficiencies. A unified federal-level push to regulate privacy is coming, essentially a US-wide version of the digital single market goal of GDPR, extending outward from CCPA.