
The Long Road to Consumer Privacy Begins on Jan. 1 in Calif.

A new state privacy law takes effect Jan. 1 that will attempt to set new standards for consumer data protection. But as technology becomes the norm for shopping, home security, and daily activities, data is hard to regulate.

(TNS) — Starting on Jan. 1, Californians can expect to find a new feature on retail and social media websites -- a message and button or link through which they can instruct site owners not to sell their personal information.

That may be the first hint many users receive of the state's landmark California Consumer Privacy Act, which goes into effect on New Year's Day. It's a vital law that sets a standard for the protection of consumer privacy that other states and the federal government would be wise to follow.

But it's only a start. New methods of incursions into personal privacy are being constantly cooked up by businesses seeking profits from snarfing up information about individuals and exploiting it themselves or selling it, piecemeal or in bulk, to others.

The field is "so complicated," says Alastair Mactaggart, the Northern California real estate developer who became the driving force behind the new law, that he has already drafted a follow-on ballot initiative for next November to close loopholes and strengthen enforcement. But even the new proposal may fail to address privacy violations coming at ordinary citizens from all directions.

The law going into effect Jan. 1, which was signed by then-Gov. Jerry Brown in June 2018, survived a year and a half of efforts by Big Business to carve out exemptions and open loopholes.

"We had a constant battle to stop efforts by industry to gut the bill," says Justin Brookman, a privacy expert at Consumers Union in Washington.

Among its major provisions, the law gives California consumers the right to know what personal information retailers, social media platforms, and other service providers are collecting, selling or sharing with others. Consumers can demand that their information be deleted, and can opt out of allowing it to be collected in the first place.

That's the point of the notice and link that must appear on websites that collect any data. The law applies to businesses with annual sales of at least $25 million or that buy or sell the personal information of at least 50,000 customers, households or devices.
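In practice, the required link points at some mechanism that records a consumer's opt-out and that data-sharing systems must consult before any sale. A minimal sketch of such a preference store follows; all names here (OptOutRegistry, recordOptOut, maySell) are hypothetical illustrations, not anything the law or any vendor specifies.

```typescript
// Hypothetical sketch of the preference store behind a
// "Do Not Sell My Personal Information" link.

type ConsumerId = string;

class OptOutRegistry {
  private optedOut = new Set<ConsumerId>();

  // Called when a consumer clicks the opt-out link.
  recordOptOut(id: ConsumerId): void {
    this.optedOut.add(id);
  }

  // Any code path that sells or shares data would check this first.
  maySell(id: ConsumerId): boolean {
    return !this.optedOut.has(id);
  }
}

const registry = new OptOutRegistry();
registry.recordOptOut("consumer-42");
console.log(registry.maySell("consumer-42")); // false: opt-out honored
console.log(registry.maySell("consumer-7"));  // true: no opt-out on file
```

The point of the sketch is only that the default is "may sell" until the consumer acts, which is exactly the opt-out (rather than opt-in) posture critics of the law have noted.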

Businesses can charge customers who opt out of data collection more for their services, but the price difference has to be related to the value of the information. That should place a tight limit on the price differences, for estimates of the cash value of individuals' data are low.

According to a study done for Attorney General Xavier Becerra by Berkeley Economic Advising and Research this summer, for example, "general information about a person such as their age and gender were found to be worth $0.0005 per person." Information that a woman was pregnant was pegged at about 11 cents per person.

The Berkeley figures were based on estimates compiled by the Financial Times, which concluded that the total value of 61 basic information nuggets often sought by data buyers was about $4.83 on average. Most individuals, of course, don't evaluate their personal information strictly in dollars and cents. Data-collecting companies, however, value the information in the aggregate, in which it's worth billions.

The law directs businesses to safeguard the personal information in their possession, and allows individual consumers to sue for up to $750 per incident, or for actual damages, in the case of a breach. The attorney general can also sue for up to $2,500 per incident or $7,500 per incident if the breach is intentional.

As advanced as the law may be, it became evident well in advance of its effective date that it left shortcomings businesses could exploit, Mactaggart told me; some were the result of the haste with which the law was written, in a legislative effort to fend off a ballot initiative Mactaggart had proposed. For example, a provision allows businesses to "cure" a data breach by implementing security provisions within 30 days after a breach has been discovered.

Mactaggart drafted a new initiative for next year's ballot closing loopholes and adding new protections. If the new measure passes, retrospective cures of security holes won't protect a business from liability. It would triple the maximum penalty, to $7,500, for breaches involving data about children.

The initiative also would establish a California Privacy Protection Agency headed by a five-member commission and given an initial $10-million budget for enforcement (indexed to inflation after the first year).

Perhaps its most important innovation is the addition of special protections for so-called sensitive personal information, such as Social Security numbers and information about a person's religious beliefs, race, sexual orientation, account logins and passwords, genetic data, health or geo-location. "We don't have a concept of sensitive personal information in the law right now," Mactaggart says, "but it's information so sensitive that businesses shouldn't use it unless it's absolutely essential."

Take geo-location. In a 2017 Massachusetts case, an advertising agency was caught identifying women going to abortion clinics by tracking their smartphones, and selling the information to an anti-abortion group that inundated them with anti-abortion messages. The ad agency's owner bragged that he could "tag all the smartphones entering and leaving the nearly 700 Planned Parenthood clinics in the U.S.," according to a complaint filed by Massachusetts authorities. The agency agreed to cease the practice.

Mactaggart's initiative would prohibit tracking devices more precisely than within a circle of about 250 acres. (A football field is about 1.3 acres.) "So you won't be able to track how long I've been at the gym, or when I arrived at work, or whether I went to a rehab clinic," Mactaggart says.
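To put the 250-acre figure in more familiar units, a quick back-of-the-envelope calculation (my arithmetic, not the initiative's text) converts that circle's area to a radius:

```typescript
// How coarse is location tracking limited to a 250-acre circle?
const SQ_FEET_PER_ACRE = 43_560;

function circleRadiusFeet(acres: number): number {
  const areaSqFt = acres * SQ_FEET_PER_ACRE;
  return Math.sqrt(areaSqFt / Math.PI); // from A = pi * r^2
}

const r = circleRadiusFeet(250);
console.log(Math.round(r));         // roughly 1,862 feet
console.log((r / 5280).toFixed(2)); // about 0.35 miles
```

In other words, under the proposal a tracker could place a phone no more precisely than somewhere within about a third of a mile, too coarse to distinguish the gym from the rehab clinic next door.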

The initiative was cleared by the attorney general on Tuesday to move into the signature-gathering stage. About 632,000 signatures will be needed to qualify the measure for November's ballot.

Even if it passes, however, gaps and loopholes will remain. In part, that's because consumers are casual about their personal data, too willing to give it up to merchants or social media platforms without asking how it will be used or objecting when it's exploited against their interests.

Breaches involving Social Security and bank account numbers, birthdates, and other information that could be exploited by identity thieves are so common an occurrence these days that few consumers seem to be stirred. The companies assembling these data and failing to adequately safeguard them are so big and rich that no penalties seem to matter.

In July, the Federal Trade Commission hit Facebook with a record fine of $5 billion for breaching an earlier order over its privacy loopholes -- in other words, Facebook was judged to be a privacy recidivist -- but the penalty was equivalent only to the revenue the company collects in an average month and less than a fourth of its annual profit. In other words, it could be shrugged off as the cost of doing business.

The problem is the domination of our public spaces by what privacy expert Shoshana Zuboff calls "surveillance capitalism." Companies collect data not only from our online habits and retail activities, but from an increasing variety of devices that we invite into our homes or use to secure our property -- internet-enabled thermostats, video-equipped doorbells, smartphones with geo-locating technologies, and more.

Amazon employees have reported that commands issued to voice-activated devices such as the company's Echo have been recorded and stored, and even casual conversations held within earshot of the devices can be swept up.

The company's Ring doorbells can view and record not only visitors to a home's front porch, but passersby on the street. Homeowners have the option to share these recordings with local law enforcement agencies -- though strangers who unwittingly come within range don't have the chance to opt out. Amazon even encourages neighbors to mutually share their Ring videos into a sort of neighborhood watch system through a downloadable app.

Facial recognition can be especially insidious, because it's invisible to the targets. In May, San Francisco banned local law enforcement agencies' use of facial recognition, along with other passive surveillance technologies such as automated license plate readers and camera-equipped drones.

And on Dec. 18, five Democratic senators including Kamala D. Harris of California questioned the Department of Housing and Urban Development about its installation of facial recognition security systems, which they said "could be used to enable invasive, unnecessary and harmful government surveillance of their residents."

"Unilateral incursion" into personal privacy by companies seeking surveillance profits has spread, Zuboff wrote in her 2019 book "The Age of Surveillance Capitalism," via the collection of personal data "through [Google's] Street View's Wi-Fi and camera capabilities, the capture of voice communications, ... the tracking of smartphone location data, ... [and] wearable technologies and facial-recognition capabilities."

In other words, ordinary citizens are swimming in a sea of technological eyes and ears, generally without knowing. "Rights to privacy...have been usurped by a bold market venture powered by unilateral claims to others' experience and the knowledge that flows from it," Zuboff wrote. Surveillance capitalism "claims human experience as free raw material."

In that context, laws such as California's, even if augmented with Mactaggart's new initiative, struggle to hold back the tide. Americans are only just beginning to sense the scope of surveillance capitalism -- some users of Fitbit, the exercise tracking device, said they'd scrap theirs after Google bought the company, presumably to gain access to users' personal data. But it's unclear how many would really follow through. (The acquisition is pending regulatory approval.)

The battle for consumer privacy may be a never-ending cat-and-mouse game between surveillance capitalists and lawmakers. If experience serves, the former will always be steps ahead of the latter.

"Policy," Brookman says, "always tends to be slower than technology."

©2019 the Los Angeles Times. Distributed by Tribune Content Agency, LLC.
