Why you should care about your loss of privacy
You enter a bakery to buy a roll. The clerk insists on collecting your telephone number, your email address, your five most recent texts, your photos, your contacts, your calendar, and other information from your phone. You'd find yourself another bakery, right? That scenario comes from a video produced by the Danish Consumer Council.
Many of your smartphone's apps get access to this information and more. The Guardian's Alex Hern reports in a September 10, 2014, article on a study conducted by the "pan-government" Global Privacy Enforcement Network that found 85 percent of the 1,200 apps examined did not disclose what personal information they collected about their users or how they used it. One-third of the apps requested an excessive number of permissions, according to the organization.
The providers of digital services collect, reuse, and sell the personal information they collect about us -- without our knowledge or informed consent. They consider the information a natural resource that's theirs for the taking. Consider the metaphors we use to describe how personal data is collected and used: data mining, data streams, the cloud.
But this isn't some inert material they're commercializing. This is us. You can't use a digital, Internet-connected device these days without being watched and recorded. Services are invisibly collecting information about everything we do and say online, and converting our personal information into profit. It's as if a guest we invited into our home were wearing a hidden camera to record and exploit what they learn about us.
We couldn't stop the collection if we tried. For example, Verizon has been criticized by privacy advocates for using a non-deletable supercookie to track its users, as Natasha Singer and Brian X. Chen report in a January 25, 2015, article in the New York Times. Verizon sells the information it collects about its paying customers to advertisers. At least one online advertising network, Turn, was using the Verizon supercookie to regenerate its own tracking cookies after users had deleted them. There's no question the advertisers are flouting consumers' explicit wish not to be tracked.
The metaphors we use for data make a big difference
In a January 20, 2015, article in The Atlantic, Tim Hwang and Karen Levy explain why the metaphors we use to describe data affect the laws and policies regarding data collection. When confronted with something new, something they don't understand, judges and regulators rely on analogies to known quantities to guide them: Is the Internet like a public utility that must serve everyone equally (net neutrality)? Is a website like a hotel or restaurant that must be accessible to people with disabilities? Is your cell phone an extension of your body and thus protected from unwanted and unwarranted intrusions?
Comparing our personal information to a natural resource like oil or iron ore depersonalizes it. All these individual data points are the details of our lives. We have a right to own them, control their dissemination, and deny them to anyone at any time. At present, the only way to prevent the collection, reuse, and resale of our personal information is to stop using any network-connected services. Of course, the only way you're able to read this is by using a network-connected service.
Maybe there's a middle ground. Maybe we can exert limited control over our personal data by being more discerning in the services we use, and how and when we use them. Here are some suggestions:
1) Stop using Facebook. My wife loves Facebook. For her, the Internet and Facebook are nearly synonymous. She has decided that any potential loss of privacy is a fair tradeoff for the benefits she realizes in following and communicating with her family and friends all over the world.
But recent court decisions have convinced me that it's time to give the social network a rest. Earlier this month a Florida state court of appeals ruled that the photos on a "private" Facebook profile had "little, if any, right to privacy." Casey Perkins and Robin Perkins of Snell & Wilmer write about the case in a January 23, 2015, article on JD Supra Business Advisor. The plaintiff in a personal injury suit against Target removed 30 photographs from her Facebook profile shortly after they were discussed during her deposition.
The appellate court upheld the district court's ruling that "photos posted on social networking sites are neither privileged nor protected by any privacy rights, despite the use of privacy settings," according to the article's authors. State courts in Michigan and New York, and the U.S. District Court for the Central District of California have made similar rulings about the lack of privacy protections for material posted to social networks, despite the use of the services' privacy settings.
That pretty much dispels any notion of privacy on a social network. It doesn't matter that your use of the service's privacy settings signals an intention to assert your right to privacy: if you use a social network, you are waiving any right to prevent the information you post from becoming public knowledge.
To its credit, Facebook is fighting in New York state courts for the right to assert Fourth Amendment protections against unwarranted searches and seizures on behalf of its users. As James C. McKinley Jr. writes in a September 25, 2014, article in the New York Times, Manhattan District Attorney Cyrus R. Vance Jr. has asked the court to find that Facebook is merely a "digital storage operation" that lacks standing to challenge search warrants on behalf of its users; only the users themselves may do so. The case involves charges of false disability claims by police officers, firefighters, and other civil servants. The DA sought Facebook photos showing the claimants involved in rigorous physical activities.
It appears to be impossible to use a social network privately. Maybe you're okay with this. For the time being, I'm going to hold out for a more-private alternative.
2) Sign out of services when you're not using them. I check out Google News regularly, and each time a small window appears instructing me to sign in to a Google account "to get news on topics you care about." The message is reassuring: it means Google isn't tying my online activities to a specific account, just to my IP address and other browser-header information. (I'm able to send and receive email via my Gmail account by accessing it through my Outlook.com mailbox; this CNET article from 2012 explains how.)
Google knows so much about the world that it can predict the stock market. Company executives decided back in 2010 that doing so would be wrong. Michael Nielsen describes Google's predictive capabilities in a long article on the BBVA Group's OpenMind site. The upshot of the article is that a database of all the world's knowledge may indeed be created, but when it is, it will be owned by a few very large Internet companies: Google, Facebook, Twitter, and Amazon top the list.
Many industry leaders are calling for the creation of a public, open, not-for-profit data infrastructure. Nielsen identifies two obstacles to such a public data platform: nonprofits are risk-averse, and they don't pivot (change strategies on the fly). Risk-taking and willingness to pivot are essential characteristics for any Internet organization.
Again, the benefits of having Google, Facebook, and other Internet services tracking your online activities may be worth the cost of sharing your personal information. I'd rather limit what the companies know about me as much as possible. It's largely a symbolic gesture, since the services still hold quite an extensive dossier on me, but at least it expresses my intention to retain as much control as possible over my personal data.
3) Encrypt. Data encryption requires a concerted effort at present, but it may soon be easier than ever to prevent anyone from accessing data you want to keep private. A new company called Ionic Security plans to offer a service that lets you encrypt and decrypt specific files and online information with a single click. Forbes' Thomas Fox-Brewster reports on the company's plans in a January 21, 2015, article.
As with any technology, Ionic's encryption on demand could be used to lock down information that open-systems advocates believe should remain public. That's why the company claims to maintain a blacklist of countries and firms it refuses to work with.
The Electronic Frontier Foundation is promoting the Let's Encrypt program that is designed to encrypt the entire web. (In a related matter, a group of Harvard professors has expressed their opposition to a plan by U.K. Prime Minister David Cameron to outlaw encryption, as the Financial Times' John Gapper describes in a January 22, 2015, article.)
It's probably too soon to consider user-side encryption a practical alternative for everyday computer use, but at least it's starting to appear on people's radar screens. In a year, maybe two, the encrypt/decrypt button may be as common as your browser's back and reload buttons are now.
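Ionic hasn't published the details of how its one-click service works, but the basic idea behind user-side symmetric encryption is simple enough to sketch. The toy example below (a one-time pad in Python, for illustration only; real tools rely on vetted ciphers such as AES, and the function names here are my own) shows the core principle: data is scrambled with a secret key before it ever leaves your device, and only someone holding the key can reverse the process.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte of the message with the corresponding key byte."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

# The identical XOR operation also decrypts, because (p ^ k) ^ k == p.
decrypt = encrypt

message = b"meet me at the bakery"
key = secrets.token_bytes(len(message))  # random key as long as the message

ciphertext = encrypt(message, key)       # this is what a service would see
assert decrypt(ciphertext, key) == message
```

The point of the sketch is where the key lives: on your machine, not the provider's. A service that never holds the key cannot read, reuse, or sell the content, no matter what its terms of service say.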
Judges and 'dark money' campaign contributions
In an October 2014 article, Mother Jones reported on the seven-fold increase in contributions by special-interest and partisan groups to state judicial campaigns between 2000 and 2011-2012. As the magazine points out, the $288 million spent on judicial races since 2000 is dwarfed by the $17 billion spent on Congressional campaigns over the same period. However, it's difficult to track contributions to judicial campaigns because of "weak" state disclosure laws, according to the publication.
In one instance, the CEO of an energy company in West Virginia contributed $3 million to the campaign of a state Supreme Court justice, who just happened to cast the deciding vote that overturned a $50 million judgment against the company. Not a bad investment!
The article quotes retired U.S. Supreme Court Justice Sandra Day O'Connor, who asserted that the mere perception that judicial decisions are biased is sufficient to destroy support for the courts. I'd go even further and say that all campaign contributions are a form of bribery -- directly or indirectly. I readily acknowledge that I hold a minority opinion on this matter. For now.
That Mustang roar is faked? Say it ain't so, Henry!
One of the benefits of more fuel-efficient automobile engines is less engine noise. For most car models, this is a good thing. For Ford Mustangs and other muscle cars, the lack of the familiar rumble could be a deal-breaker.
Drew Harwell writes in a January 21, 2015, article in the Washington Post that Ford, BMW, Volkswagen, and other automakers are using fake engine noise to make their vehicles sound more appealing to consumers.
For example, the 2015 Ford Mustang EcoBoost features an "Active Noise Control" system that amplifies the engine noise through the car's speakers. A similar system is reportedly used in the six-cylinder model of the F-150 pickup truck, although Ford isn't talking about the noise enhancements, according to Harwell.
Other audio enhancers include Volkswagen's Soundaktor speaker in the GTI and Beetle Turbo, and Porsche's "sound symposer" noise-boosting tubes. Since electric cars run nearly silently, adding the equivalent of engine noise becomes a safety issue. Federal agencies are expected to release regulations later this year requiring hybrid and electric cars to play fake engine noise to alert pedestrians, bicyclists, and other drivers.
Several car experts state that while the audio boosts don't affect performance and may enhance the driving experience for some customers, the carmakers should simply admit to the fakery. As Kelly Blue Book's Karl Brauer is quoted as saying, the companies "should stop the lies and get real with drivers."
Imagine if dealing honestly with customers caught on with other companies. "You may say I'm a dreamer...."