“I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish. I mean with artificial intelligence we’re summoning the demon.” – Elon Musk
We do not know the full power of artificial intelligence (AI) yet, and I don’t think we fully understand what the future holds for us concerning AI. I think it is fair to say we have leading minds on both sides of the issue. For instance, people such as Stephen Hawking, Elon Musk, and others suggest we could be heading into a doomsday scenario.
Nick Bilton, a tech columnist for The New York Times, wrote that we could find ourselves in a situation where a medical AI that is programmed to eliminate cancer decides that the way to do it is by exterminating humans who are prone to the disease. There are others, such as Shawn Olds and Andrew Ng, who believe in the benefits of human and artificial intelligence partnership.
Candidly, I’m not an expert in artificial intelligence and what it could bring to humanity. I’m not sure even the global scientists, engineers, and futurists have a full sense of where we’re heading with AI. Nevertheless, all of us who live on this planet have to deal with whatever is to come. Because artificial intelligence is so powerful, and we know that it will outstrip the most intelligent human’s capacity to process information, we would be foolhardy not to try to understand the topics that affect us.
One of the issues artificial intelligence brings to the fore is privacy. Here I’ll explore the drive toward increased privacy after a period when it seemed that privacy was dead. Such was the sentiment in 2010 of tech CEOs such as Facebook’s Mark Zuckerberg, Google’s Eric Schmidt, and Sun Microsystems’ Scott McNealy. Today, however, we have Zuckerberg discussing a privacy-focused future for Facebook.
So, what changed? The answer is a lot, including bots and tech that stole personal information and drove misinformation campaigns. As a fundraiser, I understand that all of these matters impact nonprofits, and the trend is leaning—hard—toward privacy and donor data protection.
Technology Goes Mainstream
As I mentioned, there was a time when the top tech CEOs were claiming that privacy was finished. Remember, communication and sharing platforms such as Facebook have only been with us for a little over a decade and a half. Facebook, which is the dominant social networking platform globally with 2.4 billion users, was only founded in 2004. At the time, Mark Zuckerberg was at Harvard University, and he developed it with his fellow students and friends, Eduardo Saverin, Andrew McCollum, Dustin Moskovitz, and Chris Hughes.
Google, which is the most-used search engine globally, was founded in 1998 by Larry Page and Sergey Brin while both were Ph.D. students at Stanford University. The origins of what became Google began as a research project. At the time, search engines, which were not widely used, provided results based on how many times a search term appeared on a page.
However, Page and Brin believed they could create a better system by ranking the relationships between websites. They developed a system that offered results based not only on a site’s pages but also on the number of backlinks pointing to it, on the theory that a heavily linked-to site was an authority in its subject.
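To make the backlink intuition concrete, here is a minimal Python sketch of PageRank-style link analysis. This is an illustration of the idea, not Google’s actual production algorithm, and the tiny link graph at the end is invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline, then receives shares of rank
        # from the pages that link to it.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:  # a page with no outgoing links spreads its rank evenly
                for t in pages:
                    new_rank[t] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

# A page with many backlinks ("hub") outranks pages with few.
scores = pagerank({"a": ["hub"], "b": ["hub"], "hub": ["a"]})
print(max(scores, key=scores.get))  # "hub" has the most backlinks
```

The key design point is that a link is treated as a vote, and votes from highly ranked pages count for more, which is why the iteration is repeated until the scores settle.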
The new system was initially called BackRub because of the backlinking. Thankfully, they changed the name to Google; BackRubbing doesn’t work. In the early 2000s, Google gained broader traction and then took off to become what it is today as it became accessible to people beyond government and business. Simultaneously, personal computing grew from a few million machines in the early 1980s into a genuinely mass-market product as millions more families brought computers into their homes. During the 1980s and 1990s, that technology began to integrate into people’s lives. Then, in the early 2000s, with companies such as Google and Facebook, communication and sharing in the digital world took off, connecting people around the planet with each other.
Looking back at the early years of platforms such as Facebook, Google, and email communications, many people were excited. They had at the tips of their fingers vast amounts of information and knowledge, so they no longer had to travel to libraries to obtain research and have access to information. Everything from news to shopping to their friends and influencers was accessible to them online.
People and businesses could communicate quickly, efficiently, and at a much lower cost than in the past with travel, print, and mail. As exciting as those early years were, there were inevitably unforeseen events as the world became almost entirely digitized. It was only a matter of time before privacy would become a concept we would have to defend, and today, that’s where we are as a society.
21st Century Data Breaches
To understand where we are with donor data protection, including fundraisers and nonprofits, we have to look at what soured the public on the idea that privacy was dead. The 21st Century has been full of data breaches involving everything from banking information to sensitive passwords and social media accounts. This steady drip of violations of the personal information of tens of millions, hundreds of millions, or even billions of people has created the impetus for governments and companies (hello, Apple) to pay attention to data protection. The public demands it, and because of that, governments are responding.
Explaining what occurred in a few of the most massive data hacks of the 21st Century could be a book in and of itself. Therefore, we’ll explore only a few of the most notorious hacks, the ones I think convinced the public that their data and personal information were no longer theirs.
Again, understanding these events helps nonprofit leaders see why privacy is essential to the public and to donors. Furthermore, nonprofits that do not understand how a data breach could lead not only to angry donors but also to liability, and even the end of their charity, probably should not be in operation anyway.
1. Yahoo
The most significant data breach so far affected 3 billion users around the world and happened in 2013 and 2014. When it came to light, Yahoo was in the midst of negotiations to sell the company to Verizon. Instead, at a sensitive time, Yahoo found itself publicly announcing that it had been hacked—the largest breach in history—by a “state-sponsored actor.”
The stolen information included users’ names, birthdays, telephone numbers, emails, and passwords. The company’s initial admission that 500 million users were affected was followed by another raising the figure to 1 billion, and ultimately to 3 billion. Because of the breach, Yahoo sold for $350 million less than the original offer.
2. Equifax
As most adults in the United States know, Equifax is one of the three largest credit bureaus. It is a company that holds the data of millions of people and reports on their creditworthiness. Because personal background checks in the US often include credit checks, the information it holds is highly sensitive and affects millions of people’s job prospects.
So, it was disturbing for the public to learn in 2017 that the company had been hacked and their information compromised. Thieves stole personal data from Equifax, including birthdates, addresses, driver’s license numbers, and Social Security numbers, from 143 million people. In addition, roughly 209,000 people had their credit card information exposed.
3. Ashley Madison
In July 2015, the world experienced a different kind of hack, one that shocked people and caused a massive global debate about privacy. In this case, a group calling itself “The Impact Team” hacked and stole the data of users of the Ashley Madison site. Ashley Madison was a website used by people who purportedly wanted to have extramarital affairs.
What made this hack so shocking and electric in the public consciousness is that The Impact Team threatened to expose Ashley Madison users’ names. These people were presumably having extramarital affairs, and the hackers said if the company did not shut itself down, the clients would get exposed.
Ashley Madison had a policy of not deleting any user information, including real names, addresses, credit card information, and other information. People who were users of the platform and who presumably did not want their spouses, colleagues, friends, or families to find out about their affairs dreaded the exposure of their information and any public shaming that would follow. Unfortunately, some people killed themselves, including a pastor and professor at New Orleans Baptist Theological Seminary.
While this caught the public’s collective imagination, millions of people, users and non-users of Ashley Madison alike, argued that the site’s users had a right to privacy. The world debated the ethics, morality, and confidentiality issues around this particular hack. It was one of the hacks that demonstrated to the world that anyone could get hacked. No information was private, and essentially, this could happen to anyone concerning anything.
4. Facebook Cambridge Analytica
In early 2018, the news exploded with what became known as the Facebook Cambridge Analytica political scandal. Cambridge Analytica harvested the personal information and profiles of as many as 87 million people who used Facebook without their consent or knowledge. That information got used for political advertising.
Consequently, it was this harvesting of their data that genuinely frustrated and angered the public. The scandal is viewed as the catalyst moving the public toward the idea of a right to their data, its protection, and knowledge of how it is used.
When The Guardian exposed the scandal, Facebook did not comment on how millions of users had their information harvested for political purposes by a third party, Cambridge Analytica, without their knowledge. In the meantime, Facebook’s stock price fell (it lost over $100 billion in market capitalization), and people spoke of the need to regulate the tech industry. In the minds of many, tech companies had too much power and had, for the most part, become massive corporations with little, if any, regulation or government oversight.
Though Facebook declined to comment, in March 2018 a whistleblower who had worked at Cambridge Analytica, Christopher Wylie, came forward publicly. By the 1st of May 2018, Cambridge Analytica had closed its doors. By this point, however, the public understood how faceless technology companies practically owned everything about them.
Private companies had everything they needed to advertise or monetize through data mining: names, addresses, credit cards, Social Security numbers, emails, credit history, work history, names of friends and other relationships, and preferences for everything from movies to internet searches. Perhaps privacy is dead. Nevertheless, companies and governments now seek to contain some of the adverse effects of existing in a digital world.
General Data Protection Regulation
In May of 2018, the European Union law called the General Data Protection Regulation (GDPR) came into effect. It is the world’s most sweeping attempt yet at regulating and protecting personal data. It is important to say here that you would be incorrect to think your nonprofit is not affected by Europe’s law. The GDPR is intended to protect the digital information of anyone who is a European citizen or resident.
However, the reach of the GDPR is global. That means if you have information in your donor base about people who live in one of the European Union countries, you need to make sure you are not placing your group at significant risk. People who live in EU countries have the “right to be forgotten”: if your nonprofit, or any entity for that matter, has information about them, you have to wipe it out—completely—if they request it. Notably, if you do not take care in protecting their data, you can be held liable.
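To make the “right to be forgotten” concrete, here is a minimal Python sketch of honoring an erasure request against a simple in-memory donor store. The record fields, the `handle_erasure_request` helper, and the audit-log shape are all hypothetical; a real CRM would also need to purge backups and any copies held by third-party processors.

```python
# Hypothetical in-memory donor store; a real system would be a database.
donors = {
    "d-102": {"name": "Jane Doe", "email": "jane@example.org",
              "country": "FR", "gifts": [50, 100]},
}

erasure_log = []  # minimal audit trail: record THAT we erased, not WHAT

def handle_erasure_request(donor_id):
    """Remove all personal data for a donor; return True if found."""
    record = donors.pop(donor_id, None)
    if record is None:
        return False
    # Log only the pseudonymous ID, never the erased personal details.
    erasure_log.append({"donor_id": donor_id, "erased": True})
    return True

handle_erasure_request("d-102")
print("d-102" in donors)  # False: the record is gone
```

The design choice worth noting is that the audit log keeps no personal details, so proving compliance does not itself re-create the data the donor asked you to delete.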
Noncompliance with the GDPR, anywhere in the world, can mean a fine of as much as 4 percent of total annual global revenue (or €20 million, whichever is higher). In other words, safeguarding donor data is something all nonprofit leaders should do on moral and ethical principle. But failing to do it can also bring stiff penalties.
At this point, I think it’s important to remind you that you may have in your database the information of people who live in Europe, and you might not even realize it. As you know, we live in a globalized world. That means that people can see information about you, through your website or social media, from anywhere.
Don’t be surprised if you have one, two, or a few people who reside within the European Union. Donors and people interested in a particular cause are not limited to the country’s borders where they live because of the internet and social media. So, always, always protect the information you have in your database in response to far-reaching laws. It includes the GDPR and other laws inside the US, which now exist, with more to come in the years ahead.
The GDPR information you must protect includes names, addresses, telephone numbers, emails, Social Security numbers, messages back and forth, and even IP addresses.
Privacy and Data Protection Regulation in the USA
Lest you think that data protection and privacy regulations and laws are only happening in some other country or the European Union, you need to pay attention to the United States. California became the first state in the nation to have a law go into effect concerning privacy.
In January of 2020, the California Consumer Privacy Act (CCPA) became law, and it is considered “GDPR lite.” In other words, it looks a lot like the GDPR, but the penalties for violations are considerably lighter. Consumers, or donors, can sue the company or nonprofit for up to $750 per violation. Further, the State of California can penalize the organization up to $7,500 for each intentional violation. While this is not 4 percent of total annual revenue, if a nonprofit gets charged with violating this law, the penalties can still be significant.
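Some quick, purely illustrative arithmetic shows why even these “lighter” penalties add up. Assuming a hypothetical breach exposing 1,000 donor records and the statutory maximums mentioned above:

```python
# Illustrative worst-case CCPA exposure for a hypothetical breach.
records = 1_000                 # donor records exposed (assumed figure)
per_consumer_suit = 750         # statutory maximum per consumer suit
per_violation_penalty = 7_500   # state maximum per intentional violation

print(records * per_consumer_suit)      # 750000  -> $750,000 in suits
print(records * per_violation_penalty)  # 7500000 -> $7.5 million in fines
```

Either figure alone would be existential for most small nonprofits, which is the practical point: “GDPR lite” is lighter only in relative terms.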
In August 2018, I wrote an article, “Donor Data Is at Risk in the Nonprofit Sector: What Can Be Done?” in Nonprofit Pro that bears repeating in this post. Although the word “privacy” never appears in the US Constitution, the concept is woven into it. Privacy is not a new idea; it is something our Founding Fathers thought about judiciously as they drafted and then signed the Constitution and the Bill of Rights. As I noted in the article,
The reality is that we all have a right to privacy in the United States, even if it is not expressly stated in the Constitution. However, the Bill of Rights does protect privacy in the 1st Amendment concerning religion, the 3rd Amendment regarding the home, the 4th Amendment related to unreasonable searches, and the 5th Amendment concerning the right to protect oneself against self-incrimination. Today, our leaders and politicians continue to try to guide the nation in that spirit, especially when their constituents, to whom they are beholden, become more vocal about the issue. It’s fair to say that the public is tired of having their private information compromised, exposed, or shared, even if it is with regulators.
California may be the first state in the nation to create a “GDPR lite” law to protect privacy, but it certainly will not be the last. Other states, such as Nevada, Washington, and New York, have also moved to protect data privacy, and more are expected to follow. As an example of what states are considering, New York introduced a bill that would allow state residents to sue companies whose data protection violations cause them injury.
Privacy has become a fresh idea once again. Concurrently, companies such as Apple have positioned their brands around the concept. Apple states that privacy is a human right and has made privacy a competitive advantage in its products’ marketing. They’re betting that privacy makes good business sense, and good profits.
How Nonprofits Are Used for Fraud & How to Prevent It
Nonprofits hold highly sensitive information about their donors, and it would be a mistake to think they are exempt from protecting that data. Not doing so can be costly in many ways. MobileCause, which provides online fundraising and event software, has an excellent approach to safeguarding donors’ information.
On its nonprofit security page, MobileCause offers tips that nonprofits need to be aware of. It also notes—correctly—that smaller nonprofits are prime targets for hackers because they lack the resources to understand and execute comprehensive security efforts. Even the smallest nonprofits, however, cannot ignore protecting donor information. As a lawyer, I can tell you that those who don’t secure data run the risk of bad press, regulatory action, and even the loss of their nonprofit status.
As the MobileCause security page explains, “card testing” increased by 200 percent in 2017. Card testing is a fraudulent activity in which hackers test stolen card numbers to see if the cards remain valid. It is done with automation that attempts small donations to nonprofits. If a charge goes through, the hackers are alerted that the card is active, and they then move on to larger charges elsewhere. As a result, nonprofits have become a testing ground for credit card fraud. Hackers like testing through nonprofits because these organizations often lack the controls to pick up on fraudulent activity.
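As a sketch of how a donation flow might catch card testing, here is a simple velocity check in Python: flag any source (a hashed card number or an IP address) that makes several tiny donations in a short window. The thresholds and the `is_suspicious` helper are invented for illustration, not an industry standard.

```python
from collections import defaultdict

WINDOW_SECONDS = 600   # look back over the last 10 minutes
MAX_ATTEMPTS = 3       # more than this many small gifts is suspicious
SMALL_AMOUNT = 5.00    # card testers probe with tiny charges

attempts = defaultdict(list)  # source (card hash or IP) -> timestamps

def is_suspicious(source, amount, now):
    """Record a small donation attempt and decide whether to flag it."""
    if amount <= SMALL_AMOUNT:
        # Keep only attempts inside the window, then add this one.
        recent = [t for t in attempts[source] if now - t <= WINDOW_SECONDS]
        recent.append(now)
        attempts[source] = recent
        return len(recent) > MAX_ATTEMPTS
    return False

# Four $1 "donations" from one source inside ten minutes trips the flag.
flags = [is_suspicious("card-abc", 1.00, t) for t in (0, 60, 120, 180)]
print(flags)  # [False, False, False, True]
```

Real payment processors layer many more signals (CAPTCHAs, address verification, device fingerprinting), but the core idea is the same: card testing looks like a burst of small, automated charges.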
MobileCause’s security page notes that every nonprofit should ensure the following through its platforms:
- PCI DSS Level 1 certification, the highest level of the global payment card security standard
- Fraud protection against card testing and other criminal activities
- Banking and credit card security
- Two-factor authentication for passwords
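The last item, two-factor authentication, usually means time-based one-time passwords (TOTP, defined in RFC 6238): the six-digit codes from an authenticator app. As a hedged sketch of how those codes are derived, using only the Python standard library:

```python
import hmac
import hashlib
import struct
import time

def totp(secret, at=None, digits=6, step=30):
    """Derive a time-based one-time password from a shared secret."""
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(secret, struct.pack(">Q", counter),
                      hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and authenticator app share the secret, so both derive the same
# short-lived code; both timestamps below fall in the same 30-second step.
secret = b"example-shared-secret"  # throwaway value, not a real credential
assert totp(secret, at=31) == totp(secret, at=59)
```

An implementation like this can be checked against the published RFC 6238 test vectors, which is far safer than trusting hand-rolled crypto in production; nonprofits should rely on their platform’s built-in 2FA rather than writing their own.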
MobileCause is, of course, just one of the platforms providing nonprofits with privacy and security. There are others, and the National Council of Nonprofits is also an excellent source for more information on data and security. It has plenty of resources on nonprofit cybersecurity that I would suggest you look at.
Forthcoming Databox Revolution
Now that you have a high-level understanding of data and security, I think it’s important to talk about a revolutionary idea. Imagine a way for you and your donors to control all of your online data. It exists already, and I predict that it will be something that will become mainstream—including with donor data—in the not too distant future.
A databox holds a person’s personal information. It is one way technology companies are creating solutions for consumers who are demanding protection. When you place your data into a databox, you control it. You then provide third parties with access to the information you want them to have about you; only authorized third parties receive information from individuals through databox platforms.
As explained in a BBC article, the idea for the databox originated in a school of thought called Human Data Interaction (HDI). In this school of thought, personal information is treated as an object in its own right, rather than as a byproduct of technology; data is considered an asset, even a currency or commodity. The idea for a databox stems from the “crisis of trust” that exists in the current environment. In response, it allows individuals to take full control of their information and, through approved applications and platforms, share it with others when and how they want.
As the same article explains, the following are the three underlying principles for the idea of a databox:
- Legibility recognizes that data flows and data processes are often opaque to individuals, and is therefore concerned with making data and data processing, including algorithmic operations, transparent, comprehensible, and accountable to users.
- Agency recognizes the need to empower people to manage their data and third-party access to it. This includes the ability to opt in or out of data processing, and the broader ability to engage with data collection, storage, and use, and to understand and modify data and the inferences drawn from it.
- Negotiability recognizes that data processing is essentially a social act involving not only computers but also human actors, organizations as well as individuals. It requires that people be able to manage the social interactions implicated in data processing and derive value from data processing operations for themselves.
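A tiny Python sketch can make the agency principle concrete: the individual holds the data and grants, or revokes, each third party’s access to specific fields. The `Databox` class and its API are invented for illustration; real databox platforms operate at a very different scale with encryption and audited runtimes.

```python
class Databox:
    """Toy model of individually controlled, field-level data access."""

    def __init__(self, data):
        self._data = dict(data)
        self._grants = {}  # third party -> set of fields it may read

    def grant(self, party, fields):
        self._grants.setdefault(party, set()).update(fields)

    def revoke(self, party):
        self._grants.pop(party, None)  # opt-out: access ends immediately

    def read(self, party, field):
        if field in self._grants.get(party, set()):
            return self._data[field]
        raise PermissionError(f"{party} has no grant for {field!r}")

box = Databox({"name": "Jane Doe", "email": "jane@example.org"})
box.grant("charity", {"email"})
print(box.read("charity", "email"))  # allowed: jane@example.org
box.revoke("charity")
# box.read("charity", "email") would now raise PermissionError
```

The inversion is the point: the nonprofit asks the donor’s databox for a field and may be refused, rather than holding a full copy of the donor’s record forever.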
At present, we all operate in a cloud-based world. As a result, that means that all of our digital interactions go somewhere—up into the so-called “cloud.” We then trust that our information is kept safe and protected. It’s incredible to think about how much data we give away on any given day.
For instance, we input our Social Security numbers, names, emails, addresses, dates of birth, and a host of other information into websites and other platforms. Then we trust that a third party will be ethical and have the resources to protect our most sensitive information. We share pictures on social media and social networking sites: photos of our children, our homes, vacations, purchases we’ve made, or places where we’ve donated. All of this is a dream for marketers: the data is aggregated, parsed, and analyzed so that we receive more marketing and advertising, and then sold to yet other third parties for more commerce.
All the while, we have zero ability to control our data, except for not participating in a digital world, which is not practical for most of us. All we can do is trust that nothing happens that will hurt us in any way. The public understands that their information is vulnerable—even in the corporations that should have the resources to have the latest in security. So, the idea of a databox, which moves our personal information off third-party platforms back into our control to some extent, is appealing to the public and will become more so as people understand their security options.
Databoxes allow people to hold their private information in their own personal ecosystems. As the idea grows, people will gain greater control over what they share, while corporations, businesses, and, yes, nonprofits can obtain only the encrypted or limited information individuals choose to release. Again, this is a revolution in the making, and I believe millions of people think it is long overdue. At this point, you might be wondering: what does that mean for my nonprofit? Why should I care?
There are several reasons why the idea of databoxes will affect your organization. For starters, the notion of privacy will again ingrain itself in the collective consciousness. For many years, the public was conditioned to give away and share their most sensitive information freely. Transparency is a word you hear repeatedly, often from the very technology companies that are not transparent themselves. If they were, we might understand how their algorithms work and how they encourage us to keep surrendering information.
However, as consumers become used to more data protection with databoxes and other technology created to protect the information, they will expect it from nonprofits. Thus, organizations that do not adhere to data standards stand to lose fundraising dollars.
Let me ask you this. As databoxes and other privacy and security platforms become mainstream and ubiquitous, and as more companies and nonprofits sign on to retain business and fundraising dollars, do you think donors will keep giving to groups that don’t adhere to basic privacy and donor data protection standards? I would say the chances are high that donors and the public won’t do business with, or donate to, nonprofits that don’t care enough about their data to protect it.
Let’s take it one step further. Assume that databoxes and privacy technology become as well-incorporated into society as social media has over the past decade. Finally, assume that all of the data sold back and forth, including the data nonprofits buy for donor acquisition or profiling, gets harder to come by. What happens to fundraisers?
We know that artificial intelligence is a game-changer. Full stop. What happens when privacy and donor data protection become the norm and not just the competitive advantage for organizations? What happens with the information that fundraisers, especially major gift officers, seek to obtain so they know how and when to cultivate and make asks of donors?
Again, we’ll find ourselves in a situation—in short order—where tech-savvy donors and prospects push nonprofits to adhere to their standards. Wealthy donors may be the first to lock much of their personal information in databoxes, but so will members of Generation X, who were raised understanding and appreciating privacy.
Of course, younger generations may do the same; they already seek out private platforms, such as Telegram or Signal, for communication. Don’t assume that because they share information on social media, Millennials and Generation Z do not appreciate privacy and data security.
How things will unwind is not yet answered. Unfortunately, I don’t have all the answers to all of the questions around technology and how it will impact us. As I’ve mentioned, I’m a lawyer and fundraiser by training, not a fortuneteller with a crystal ball or a genius futurist. However, I’m already thinking about these things. I’m speaking to thought leaders in technology in the nonprofit and law sectors, trying to understand what is just over the horizon.
Things are moving fast, blazing fast in many cases, and it’s all because of technology. Everyone, especially marketers and fundraisers in the nonprofit sector, has to pay attention and look ahead. With the snap of a finger, it seems that tomorrow becomes today. Things we did not think possible only a few years ago are already a reality, and we know that the rate of growth of technology is exponential. All of this, as well as the hacks, is fueling the desire to return to some form of privacy.
For the time being, as a nonprofit leader and fundraiser, you should understand the current regulations, such as the GDPR and the new ones in California, New York, Nevada, and Washington. Look at the information and data you have, and think about how to secure and protect it. Ask a major donor to help you with that effort if you don’t have the financial means to ensure you have the highest protection standards for the digital information you keep. Also, keep an eye on the evolution of technology, such as with databoxes and other tools that will change the landscape of how organizations gather and process data they receive from their constituents.