New working hypothesis. If correct, the entire concept of market incentives for privacy needs to be substantially altered. Not that creating such a market is impossible per se, but the belief that disclosure alone lets consumers "vote with their feet" is provably invalid, because the rational consumer -- like the rational car buyer -- must assume that the device is, or will become, a "privacy lemon" rather than a "privacy creampuff."
Not gonna bother to unpack that unless someone is actually interested. But I need to park it somewhere or two weeks from now I'm going to be looking for the stupid scrap of paper.
no subject
Date: 2018-02-08 07:40 pm (UTC)

no subject
Date: 2018-02-08 09:23 pm (UTC)

Unless you simply never had a credit card, bought a house, or did anything else that required a credit check, nothing you ever did or didn't do protected you from the Equifax hack. Unless you are one of the few people in the United States who does not own a mobile phone -- or whose mobile phone predates (I believe) 2005, when the GPS mandate went into effect -- you have exposed information about your location. If you have purchased a car within the last 5 years, you have shared all information on your driving habits with the automobile manufacturer -- which, it turns out, is absolutely awful at information security.
That you believe your activities genuinely shield your digital privacy (rather than simply making a violation somewhat harder or less likely) because of your technical expertise, that you believe anyone could achieve similar results if they would just put in the effort, and that you therefore count yourself among the elect while the rest are privacy sinners condemned by their own moral failing -- all of that is actually covered by a different Akerlof paper: "The Economic Consequences of Cognitive Dissonance." https://www.iei.liu.se/nek/730A17/artiklar/1.284974/Akerlof-Dickens.pdf
But it's about as useful for policy as the belief that cholera is caused by moral degeneracy -- as demonstrated by the fact that the victims are usually those yucky, sinful poor people who live in the worst and most unclean part of the city. See, cholera neatly corresponds with moral degeneracy. QED. Who needs to spend money on plumbing? Similarly, if you are reading this through any device, your privacy is, to some degree, at risk. Your electronic footprint is, at least to some degree, visible. But just as we can cure cholera with enough prayer and virtuous thought, I'm sure we can protect digital privacy by just trying hard enough and believing.
no subject
Date: 2018-02-08 11:05 pm (UTC)

I don't think that my privacy is protected because of what I do; I think that certain aspects of my privacy are improved because of what I do. My threat model does not encompass state actors or narrowly targeted attacks because, as you say, that is futile.
An attack against Facebook for identity theft purposes that scrapes the fields they provide for birthday and workplace and other vital statistics will not work against me; I've been lying. But an attack against *me* on Facebook will probably work.
When I give an electronic service an email address, it's one that only they have. An attack that compromises their users will be obvious to me.
When I give them passwords, I don't reuse the passwords. They can't leapfrog an attack from "email + password" on site A to site B in an automated way.
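The two practices above -- a unique email per service and a unique password per service -- can be sketched as a toy model. This is my own illustration, not the commenter's; the account data and alias names are hypothetical. It shows both why a breach at one site can be attributed via the alias, and why credential stuffing against a second site fails when passwords aren't reused:

```python
# Hypothetical per-service email aliases: spam or phishing arriving at
# one of these pinpoints which service leaked or sold the address.
aliases = {
    "shop-2018a@example.net": "SomeShop",
    "forum-x9@example.net": "SomeForum",
}

def leak_source(recipient):
    """Name the service that must have exposed this address."""
    return aliases.get(recipient, "unknown list")

# Hypothetical account database at a second site ("site B").
site_b_accounts = {"alice@example.net": "unique-siteB-pw"}

def stuffing_succeeds(leaked_email, leaked_password):
    """Credential stuffing: replay credentials leaked from site A
    against site B. Succeeds only if the password was reused."""
    return site_b_accounts.get(leaked_email) == leaked_password

print(leak_source("forum-x9@example.net"))                      # SomeForum
print(stuffing_succeeds("alice@example.net", "unique-siteB-pw"))  # True (reuse)
print(stuffing_succeeds("alice@example.net", "unique-siteA-pw"))  # False (no reuse)
```

The point of the sketch: the defense costs the attacker nothing to defeat per-target, but it breaks the *automated*, large-scale version of the attack, which is exactly the threat model the comment describes.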
My car doesn't have a net connection in it. Should the next car come with one, I shall disable it -- taking away some measure of convenience in exchange for a measure of privacy.
I can't do anything about cellphone companies turning over tower info to the government, but I can avoid installing apps that report locations to third parties, turn off location information except when I choose to turn it on, and otherwise forgo conveniences in exchange for being lost in the masses. It won't work if I am a person of interest, but it reduces my exposure to large-scale attacks.
There's nothing much I can do about Equifax.
And in the meantime, I donate to the EFF and encourage people to be aware of the octopus that has us all in its clutches.
no subject
Date: 2018-02-09 12:05 pm (UTC)

Well, to unpack my work here a bit. I have been migrating into the policy field over the last few years. In Policyland, we still have the raging debate over whether people care about privacy (and how much), based on the fact that people use the Internet anyway. So despite the fact that survey after survey demonstrates that people really hate the status quo, we get lots of argument that they don't care or are unwilling to make the effort.
My argument is one grounded in well-established economic theory: people are simply being rational actors. It is based on Akerlof's seminal economics paper "The Market for Lemons." To summarize: Akerlof examined the used car market. Although everyone would like to buy good used cars ("creampuffs") and avoid bad cars ("lemons"), the market failed to produce any reliable mechanism for distinguishing between the two. Why? As Akerlof explained, consumers faced the problem of being unable to verify any dealer claim that a used car was a creampuff rather than a lemon. Accordingly, consumers act as if all cars are lemons. Because consumers act as if all cars are lemons, there is no incentive for any used car dealer to try to develop a way to sell only creampuffs. No matter what the merchant does, it is not rational for the consumer to believe it, because the consumer cannot verify the claim and every contract has fine print that somehow allows merchants to wiggle out. No matter what I as an individual merchant may do to try to convince you that I am different, you rationally should not believe me, because that is *precisely* what a dishonest used car dealer would say or do.
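The unraveling logic above can be made concrete with a tiny numerical sketch. The parameters are my own illustrative assumptions, not Akerlof's exact figures: car quality q is uniform on [0, 1], a seller only sells a car worth less than the offered price, and buyers value a car at 1.5 × q but can observe only the *average* quality of cars actually offered. Each round, buyers lower their offer to match that average, which drives the better cars out of the market, which lowers the average again:

```python
def equilibrium_price(buyer_premium=1.5, rounds=60):
    """Iterate the buyer's best offer until the market unravels.

    With quality uniform on [0, 1], only sellers holding cars of
    quality q <= price will sell, so the average quality of cars
    on offer is price / 2, and the buyer's next rational offer is
    buyer_premium * (price / 2).
    """
    price = buyer_premium * 0.5  # first offer: premium * E[q] over ALL cars
    for _ in range(rounds):
        avg_quality_on_market = min(price, 1.0) / 2
        price = buyer_premium * avg_quality_on_market
    return price

print(equilibrium_price())  # collapses toward 0: no creampuffs trade at all
```

With any premium below 2, each round multiplies the price by premium/2 < 1, so the offer -- and with it the quality of cars on the market -- spirals to zero. That collapse, transposed to privacy policies the consumer cannot verify, is the claim being made here.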
Akerlof's solution was to find a way for the consumer to trust the vendor. He proposed a "Lemon Law" that would allow a consumer to return the used car within 30 days for any defect. The consumer no longer had to prove that the dealer knew about the defect, and there was no way for the dealer to force the consumer to waive the right to return the car if it proved defective. The result was to radically change the operation of the used car market.
Privacy policy has a similar problem. Market incentives are not a bad idea, but we are going about them all wrong. Because a privacy policy can always be unilaterally changed without notice, because there is no private right of action or private enforceability mechanism, and because the vendor itself may not know of the future privacy "defect," the rational consumer treats all privacy policies as an unenforceable joke. No market for privacy can therefore develop under the current legal regime.
Accordingly, if we want market incentives to enhance privacy, we need to adopt policy measures that address the factors preventing consumers from trusting strong privacy policies. The "privacy market" that industry keeps insisting would emerge if people genuinely cared can be proven to be not merely a practical impossibility, but a theoretical one.
no subject
Date: 2018-02-09 02:45 pm (UTC)

There's a big problem: the industry does not generally know how to build secure systems, where "secure" is minimally defined as "gives only the appropriate information to the right people, and does not give inappropriate information to anyone." There are special cases which may be secure, but most complex systems which make general security claims are not.
We can improve incentives: for instance, a bank could guarantee that no inadvertent disclosure of your information would put you at risk of more than $100 loss in the same way that unauthorized credit card use tops out at $50. But the fact of the matter is that the bank would not have any particular internal assurance that they were doing things correctly.
New salescritters occasionally contact me in my $work capacity, making assurances about how secure their cloud environments are. I ask them if they are willing to indemnify us for the complete value of loss of information, assuming that the loss is their fault. New salescritters are sure something can be worked out, and bring in their lawyers... who need about fifteen seconds to say no, not a chance. So we don't increase our attackable surface, and remain hunkered down trying to do the right thing.