662 Apple, you're pissing me off







And the more I think about this issue, the more pissed off I get (why? Just listen to Sam Harris at the end of this blog). Now you're taking your battle with the FBI - which is trying to make you give it access to the content of a terrorist's iPhone - to the Senate; and apparently Tim Cook is prepared to take the issue to the Supreme Court.


Update: Obama cautions against total encryption. Time. Fortune.



In another case this week a New York court also rejected a US government request to force Apple to help it extract data from a locked iPhone (The Intercept.com):


In a formulation extremely favorable to Apple, the judge wrote that the key question raised by the government’s request is whether the AWA allows a court “to compel Apple - a private party with no alleged involvement in … criminal activity - to perform work for the government against its will.”


The ruling from U.S. Magistrate Judge James Orenstein was issued as part of the criminal case against Jun Feng, who pleaded guilty in October to drug charges. It is a significant boost to Apple’s well-publicized campaign to resist the FBI’s similar efforts in the case of the San Bernardino killers.


This sux: the obsessive concern of privacy advocates with supporting tech companies (presumably not just Apple ... it's just that they are the very visible tip of the iceberg) in standing in the way of law enforcement agencies (not political organisations - and the difference is to be established in every single case by a court) doing their job. Methinks this new "Privacy At All Cost" cult (S. Harris) has gone off the rails.


Update: (SMH) ... also watch that snippet of an interview with Tim Cook [mate, for the first time I don't like you, I really don't like you], where he professes to be concerned about hundreds of millions of customers being made vulnerable and where he apparently reckons the idea is "bad for America" ... when he is patently just concerned about the fortunes of Apple. Anyway, this from the newspaper article:


"The FBI and other law enforcement agencies are increasingly concerned about how encryption affects their ability to access evidence that's stored on devices.


The bureau argues if Apple succeeds in its appeal, it will create a precedent for law enforcement where authorities may have a warrant to search a device but are unable to do so.


A survey from the Pew Research Centre found more Americans think Apple should comply with the court order than not. And a global survey from CIGI-Ipsos shows most people favour giving law enforcement access to private online conversations if they're conducting criminal investigations or protecting national security interests."


Let's nail the issue: the FBI requests "assistance with getting access to the content of a criminal's phone" because they think it will help solve a homicide ... apparently the FBI holds about one hundred smartphones that belonged to victims, where they have reason to assume a phone contains footage of the killer.


Judge Sheri Pym did not order Apple to break the encryption on the iPhone. Instead, she asked the company to develop a new version of the iPhone’s iOS operating system that would allow the FBI to use its computers to guess the passcode quickly, without getting locked out for making too many guesses. This approach, sometimes referred to as a “brute force attack,” circumvents the iPhone’s encryption without actually breaking it. (The Intercept.com)
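To see why removing the retry limits is the whole ballgame, here's a rough sketch (the `check_passcode` function is a hypothetical stand-in for the device's check, not Apple's actual code): once nothing slows down or stops repeated guesses, a short numeric passcode falls almost instantly.

```python
import itertools

def check_passcode(guess, secret="4351"):
    # Hypothetical stand-in for the device's passcode check.
    # A real iPhone enforces escalating delays between wrong
    # guesses and can wipe its data after ten failures - the
    # protections the court order asked Apple to disable.
    return guess == secret

def brute_force(length=4):
    # With no delay and no lockout, simply try every numeric
    # passcode of the given length in order.
    for digits in itertools.product("0123456789", repeat=length):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None
```

A four-digit passcode gives only 10,000 possibilities, so a computer running unthrottled guesses exhausts them in moments; the encryption itself is never attacked, only the guess-limiting defences around it.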


This is the thing: the FBI is not asking Apple to release any software it could use to open locked iPhones ... anyway, Apple says that sort of software does not exist; but it also says that writing such software and making it available to law enforcement agencies would put the privacy/security of all iPhone users at risk. Fair enough.


But the FBI apparently is not asking for that to happen at all ... all it wants is to be given the content of the iPhone in question. Now, why can't Apple (and Samsung, and Sony, and Nokia, and ...) create a super-safe division within their "campus", where they control every aspect of privacy & security? Only the information required to solve serious crimes would leave their premises, and only after a court has investigated the case ... I know, it'll be costly; who'll pay? The agency requesting the info, of course.


The way I see it, Apple is super-concerned to keep up the appearance that they are the epitome of a company hell-bent on keeping our data safe ... which they consider a big selling point. Tell you what, Apple ... to me, it would be a great selling point if you solved this problem by helping the FBI and keeping the software you have to create to override the password encryption safe within your walls; this must be possible ... I mean, who are you? The greatest tech company in the world, or something?


I can't help thinking: if a tech company like Apple, and others like Microsoft, are unable to comply for technical reasons - they say that if they wrote the software they could not guarantee it would not land in the hands of hackers - then this stance doesn't show a well-developed sense of cyber security but rather technical incompetence (and mind-bending recalcitrance). Frankly, Apple, your stance is scaremongering and irresponsible.


A note to iPhone users: If you become murderers or terrorists, or if you think of becoming a terrorist, you forfeit your right to privacy. Tough luck. Apple, get real.


I have blogged about this issue before (blog 654) and again I leave the finer points to Sam Harris (listen to it ... and be patient, it's just the first 7 minutes):


“If you think the authorities - like the FBI, the NSA, the Supreme Court, the state (the government in general) - are the enemy, then that attitude is toxic ... that is pure paranoia, dogmatism and a recipe for anarchy … (if you think) the authorities, our elected representatives, should never - no matter what - have access to our private information, you are a child …”


"... some people, in order to protect their sacred safe space inside their smart phone, are willing to extend perfect privacy to known members of Al-Qaeda and IS … if you’re one of them, you have made yourself irrelevant to the crucial project of maintaining the integrity of an open society like our own …"





Update ... a bit more feedback in another "house-keeping" item on Harris' next podcast; listen to the first six minutes; there a case is made in favour of end-to-end encryption ...


... where cyber security is the most important area of our vulnerability - and of the vulnerability of open society in general - and ultimately (supposedly) the only thing to make us safe from terrorism, enemy states and IS-like activity … even with the liability that you wouldn't get into the smart phone of a terrorist.