One thing I was wondering is how Apple is even able to create a backdoor. It is explained toward the end:
"The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer."
This is actually quite reassuring - it means that even Apple can't break into an iPhone that has a secure passphrase (10+ characters) and Touch ID disabled (Touch ID itself is hackable with a bit of effort to obtain your fingerprint).
As much as I value privacy, I really don't agree with Apple's stance here - if due legal process has been followed, why shouldn't they be able to read the contents of an iPhone?
And yes I get that third party encryption can be used, which isn't owned by Apple and that there's little the authorities could do about it - but that's not the case at hand here.
What Apple needs to do, then, instead of writing this letter, is release an update that closes this backdoor.
I bet hardware vendors are just salivating at the concept of having to produce thousands of iPhone cracking docking stations.
I am quite disappointed that the US courts are trying to force Apple to do this, and in my opinion, it's just to use this case to set a precedent.
I hope Apple can't get it to work, but I'd hate to see what the courts would do if that happened.
- lightning cable delivered iOS patch (probably won't work because iOS won't negotiate over USB until you tap a dialog box)
- OTA update (not connected to internet)
- Cracking open the device and accessing the storage directly (encrypted until boot time)
The most likely vector I can think of:
- Lightning cable delivered iOS patch from a trusted computer (i.e. one that the terrorists actually owned)
It's quite impressive that Apple is taking a stand like this, though perhaps unfortunate timing WRT the larger encryption debate.
"Now that the show is over, and we have jointly exercised our constitutional rights, we would like to leave you with one very important thought: Some time in the future, you may have the opportunity to serve as a juror in a censorship case or a so-called obscenity case. It would be wise to remember that the same people who would stop you from listening to Boards of Canada may be back next year to complain about a book, or even a TV program. If you can be told what you can see or read, then it follows that you can be told what to say or think. Defend your constitutionally protected rights - no one else will do it for you. Thank you."
They felt the need to state that, huh?
However, the iPhone of the attacker is an iPhone 5C, which does not have Touch ID or a Secure Enclave. This means that the time between passcode unlock attempts is not enforced by the cryptographic coprocessor. More generally, there's no software integrity protection, and the encryption key is relatively weak (since it is only based on the user's passcode).
The amount of work needed to turn security into good user experience is phenomenal: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
But that they have the capability is a bit scary.
I wish more companies could speak so clearly and courageously.
This is actually the result of a barter: the government gets some low-level TOP-SECRET access in trade for this easy-access code, and Apple gets to go public to keep the populace calm and pretend they are fighting this thing.
If they put a backdoor in the iPhone for the US government, they are effectively thrown out of the Chinese market.
Interestingly enough, what will Apple do if the Chinese government demands that they decrypt devices or add a backdoor in exchange for staying in the market?
Could someone answer a question I have though? The government wants Apple to create this backdoor and tailor it to the specific device, so presumably it will have a line that goes
if (!deviceID.equals("san_b_device_id"))  // hypothetical guard tying the build to the one device
    return;
To make the backdoor general purpose, this line would need to be removed. But doing so would invalidate the signature and it can't be resigned afterwards because the attacker won't have Apple's signing key. So is the open letter a matter of principle that they won't build any backdoor, now or in the future, rather than a specific concern about this backdoor?
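For anyone wondering why that re-signing step is the crux, here's a minimal sketch of how a boot chain might verify a firmware image, assuming a standard RSA signature over the whole image (the names and algorithm are illustrative, not Apple's actual scheme):

    import java.security.PublicKey;
    import java.security.Signature;

    public class FirmwareCheck {
        // The signature covers every byte of the image, so patching out the
        // device-ID guard yields an image the device will refuse to boot.
        static boolean verify(PublicKey applePublicKey, byte[] firmwareImage,
                              byte[] signatureBytes) throws Exception {
            Signature verifier = Signature.getInstance("SHA256withRSA");
            verifier.initVerify(applePublicKey);
            verifier.update(firmwareImage);         // includes the deviceID check
            return verifier.verify(signatureBytes); // any modified byte -> false
        }
    }

So as long as the device-ID check sits inside the signed image, repurposing the build requires Apple's key; the letter reads as a refusal on principle (and fear of precedent) rather than a claim that this particular build could be trivially generalized.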
They aren't talking about putting a back door into systems to be used in the future, they are saying it's indeed feasible to place a backdoor on a device already out there and then use the backdoor to access the device. That means the device is not actually secure.
(Edit: deleted part where I was wrong. Thanks robbiet480 for correcting me. It's 2am here and I was tired.)
Also, prediction: if Apple refuses to build a brute forcer, someone else will do it and sell it to the FBI. Just wait and watch.
"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."
Am I reading this right? Apple, if they chose to, can make a version of iOS that disables security features and encryption and load it onto an existing phone, even though the phone is locked and encrypted?
It's not about giving props: Apple is not doing this out of goodwill, or because they believe in protecting privacy. Apple has a competitive advantage against Google/Facebook in that its business model does not depend on violating its customers' privacy.
They are just exploiting that competitive advantage.
Cf. https://ar.al/notes/apple-vs-google-on-privacy-a-tale-of-abs...
It's entirely possible that the FBI can then use this precedent to simply have Apple remove all security from an iPhone in pursuit of an active investigation, which can be done with a straightforward firmware update - something iOS users tend to install without much thought.
> "RIM's carefully worded statements about BlackBerry security are designed to make their customers feel better, while giving the company ample room to screw them."
I have lost enough points on this thread to simply double down on this issue.
This is not a good sign at all. While Google can't compete with Apple on the principle of "not spying on their users", all Apple has to do is publicize it and then ask for forgiveness from its users later.
0) Find some errata. Apple presumably knows as much as anyone except NSA. Have plausible deniability/parallel construction.
1) OS level issues, glitching, etc. if the device is powered on (likely not the case). Power stuff seems like a particularly profitable attack on these devices.
2) Get Apple, using their special Apple key, to run a special ramdisk to run "decrypt" without the "10 tries" limit. Still limited by the ~80ms compute time in hardware for each try.
(vs. an iPhone 5S/6/6S with the Secure Enclave:)
3) Using fairly standard hardware QA/test things (at least chip-level shops; there are tens/hundreds in the world who can do this), extract the hardware key. Run a massively parallel cluster to brute force a bunch of passphrases and this hw key, in parallel. I'd bet the jihadizen is using a shortish weak passphrase, but we can do 8-10 character passphrases, too. They may have info about his other passphrases from other sources which could be useful.
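To put numbers on the brute-force step, here's a back-of-the-envelope sketch assuming the ~80ms per-try hardware cost mentioned above (the keyspace choices are just illustrative):

    public class BruteForceEstimate {
        public static void main(String[] args) {
            double msPerTry = 80.0;          // ~80 ms per-guess key derivation in hardware
            long[] keyspaces = {
                10_000L,                     // 4-digit PIN
                1_000_000L,                  // 6-digit PIN
                (long) Math.pow(26, 8)       // 8 random lowercase letters
            };
            for (long n : keyspaces) {
                double days = n * msPerTry / 1000.0 / 86_400.0;
                System.out.printf("%,d guesses -> %.1f days worst case%n", n, days);
            }
        }
    }

A 4-digit PIN falls in under 15 minutes at that rate; 8 random lowercase characters is already centuries on-device, which is why extracting the hardware key and going massively parallel off-device is the interesting (and expensive) option.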
While I'm morally against the existence of #3, I'm enough of a horrible person, as well as interested in the technical challenge of #3, that I'd be willing to do it for $25mm, as long as I got to do it openly and retained ownership. In secret one-off, $100mm. I'd then spend most of the profits on building a system which I couldn't break in this way.
[0] https://en.wikipedia.org/wiki/File:PRISM_Collection_Details....
I doubt the ones giving these orders would be comfortable with their own privacy being at risk.
That letter might be the truth, or it could be some kind of decoy. Maybe the backdoor will come, Apple already knows it, and they are trying to limit the damage to their brand.
Like "we tried to resist having a backdoor installed, but ultimately we couldn't prevent it".
Is it not possible for law enforcement to get what they want from that, if all they want is a custom build of iOS that can be hacked around? And why is it even possible for that to work if the data is supposed to be kept secure?
What I do find interesting is that Apple isn't the first manufacturer the government has ordered to crack a device. An "unnamed smartphone manufacturer" was ordered to crack the lock screen on October 31, 2014.[1] No one made a fuss then, so someone caved.
• Can Apple OTA upgrade iOS when the device is locked?
FBI: "You've built a device that makes it nation-state-difficult to install custom software without DRM keys. We'd like you to assist us in deploying software signed with your keys."
Apple: "That feels way too much like asking a CA to sign a cert for you, so fuck off."
I'm honestly not sure which side I'm on here.
They really need to put that paragraph closer to this one:
> The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The first paragraph without the second implies that iOS isn't actually secure at all.
In the UK, laws originally intended for surveilling terrorists were/are routinely used by local councils (similar to districts I think) to monitor whether citizens are putting the correct rubbish/recycling into the correct bin. [1]
This is a Pandora's box, and the correct answer is not to debate whether we should open it just this once; it's to encase it in lead and throw it into the nearest volcano. Good on Apple for "wasting" shareholders' money and standing up for this.
[1] http://www.telegraph.co.uk/news/uknews/3333366/Half-of-counc... - and lest the source be questioned, this is one of the more reactionary newspapers in the UK.
Someone who believes in conspiracy theories would make a statement that "now it is official" :)
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
This is just awful: they admit to helping the FBI. How can we trust them?
There can be no compromise because China, Syria and Turkey would also lean on Apple to break into phones of dissidents, and pretty soon, future whistleblowers here in US too in order to prevent leaks (iPhone 7 and iCar notwithstanding).
That's the tradeoff in not giving in to faint, vague "maybes" that there was "external coordination", when in all likelihood it was the ultraconservative, Saudi half leading this duo into the kookooland of violent extremism.
The security services will just have to buy exploits, develop malware, cultivate human intelligence sources and monitor everything the old-fashioned way... It's not like that kid in a YouTube video who finds a jailbreak exploit for an iPhone and doesn't release a tool is going to sit on it; he's going to auction it off to the shop or country with the most $$$.
https://assets.documentcloud.org/documents/2714005/SB-Shoote...
It is a PDF.
Even with the restriction of being plugged in, who outside of Apple needs to push iOS versions at tethered devices and would be hindered too badly by having to unlock them first?
Will they, or can they, do anything about data in iCloud as well? While you can turn iCloud off, I'd guess the majority of people are using it. Given that you can access much of it at iCloud.com, it would seem that, whether or not you can unlock an iPhone, most customers' data is available directly from Apple: mail, notes, messages, photos, etc. No idea about other apps' data that gets backed up.
Again, I'm applauding Apple for standing up for encryption. If they could somehow offer the same on iCloud, I'd switch from Google (Google is not encrypted either; my point is I'd switch to a service that offers it).
I do believe there is no backdoor for when a city court requests it, but I don't really believe that the FBI or CIA doesn't have access.
Considering that the iPhone has already existed for a long time, they must have some means to backdoor "iCloud"...
Oh wait, they already did, by providing their clients' data. Trying to stop the government now is like trying to stop a high-speed train. Still, good luck to them! Good to know they are not just pushed around without any resistance.
> While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
Tim Cook
Kudos to this guy for standing up for an idea. Now, on a practical note: this is about security, providing a digitally secure platform to both users and providers, preventing tampering, and keeping data secure.
Microsoft could take a cue.
The problem with this is that no such tool should be possible to build. It should not be a matter of yes or no; it should be simply impossible for Apple to build such a tool without the private key of the user, which Apple does not have.
If it is possible to write a piece of software which can circumvent the protections of the iPhone without the user's private key, then Apple wrote its security software incorrectly. Either they wrote it with an appalling lack of security understanding; or they left in important backdoors, either knowingly or through ignorance. But if they wrote the software correctly and did not create backdoors of which they're aware, then the government's request is actually impossible -- cannot be done.
So which is it, Apple? Is the point moot because you did this right? Or have you already placed backdoors in the product which the FBI is now asking you to exploit for their benefit?
Right now it all hinges on Apple's private key and that's a very thin wire to hang all this privacy off.
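For contrast, here's a minimal sketch of the "done right" design being described, assuming the data key is derived from the user's passcode plus a per-device secret (the algorithm and iteration count are illustrative, not Apple's actual KDF):

    import javax.crypto.SecretKeyFactory;
    import javax.crypto.spec.PBEKeySpec;

    public class DataKey {
        // Without the passcode there is no key stored anywhere on the device,
        // so no firmware update alone can decrypt the data.
        static byte[] derive(char[] passcode, byte[] deviceUid) throws Exception {
            PBEKeySpec spec = new PBEKeySpec(passcode, deviceUid, 100_000, 256);
            SecretKeyFactory kdf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
            return kdf.generateSecret(spec).getEncoded();
        }
    }

The wrinkle, per other comments in the thread, is that the 5C already does roughly this (the key is entangled with a hardware UID); what the order targets is not the key derivation but the software-enforced retry limits and delays wrapped around it.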
I'm just a government relations guy, not a security person, so please forgive me, but I'm not sure where I fall on this. I want the FBI to be able to decrypt the San Bernardino attackers' phone. At the same time, I don't want the government to be able to decrypt my phone. This is one hell of a damned-if-you-do, damned-if-you-don't situation, and I'm really stuck.
Probably they will try to fight this request by arguing that the government is actually asking them to effectively remove security from all the phones (of this model at least). They would be happy to help break this one phone as long as it doesn't affect any other phone.
In that case, Apple should just break the phone and give it back to the FBI after removing the backdoor.
People hyperventilating that the tool could be used to crack other phones can relax, given the last clause in the quoted text (from the actual order).
What's at stake for Apple is not only their principles but also one of their marketing pillars: "you, the user, can trust us with your data/privacy." By asking Apple to give that up, and quietly, you are actually asking them to undermine their business model. Shareholders would not appreciate it if they didn't get a chance to hear about it first. The Apple brand would lose some of its value, and that would be reflected in the AAPL share price.
My point is that the whole thing needs legal backup. And Apple is asking for exactly that: give me a law to use, and not something from the 1700s.
edit:
Ah, I've found a couple of sources claiming that the secure enclave wipes its keys if its firmware is updated. Makes sense.
My guess is that this is more about pushing back the law and people's rights than it is about getting access to this device.
But then I'm highly cynical about what the government claim they can do with technology for obvious reasons.
Real data security has to be a mix of services that are friendly to reliable key exchange and strong, unbreakable encryption, together with verifiably secure endpoint software that implements the encryption, which in practice means open source software where the user controls installation.
It remains to be seen, though, what Apple will actually do, in legal terms. Will they flat-out refuse to cooperate, even if this means that they will be fined or Mr. Cook will be imprisoned for contempt or something like that? Will they actually send their lawyers to challenge the court decision? That would be very interesting to watch, and if they succeeded, it would create a precedent for a lot of other companies. But so would their failure.
But that would take more balls than anyone left here in this "Land of the free and home of the brave" seems to have left anymore.
Apple is selling devices on the whole planet, not just in the USA. So what the FBI (an American agency) is requesting is dangerous not only for American citizens, but also for iPhone owners in Europe, Asia, Africa, and Oceania. Hell, these people are not even part of the debate, because they don't belong in the "American democracy".
If I'm going to be affected by someone else's policies, I would like to be at least allowed in the discussion.
They may already have this in place now, but what we are seeing is a show. They are testing how people/consumers are going to react to this situation. Our government probably figures that nobody will care in the end.
In the USA, we have lost our liberty. It's time to wake up and see what is happening. It's getting worse & the people within our government are working hard to enslave us even more.
The philosophy of corruption and oppression still echoes throughout the FBI. Even today, there are FBI agents that work for private interests. You can't reform a mafia, you must abolish it and start over.
It would be relatively easy for the chip to offer a challenge and accept, say, a $100,000 proof of work to unlock the phone. This way, we prevent bulk surveillance but still allow the government to access high value targets' devices.
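One way to read that proposal (my sketch of the idea, not any existing mechanism): the device issues a random challenge and unlocks only when shown a nonce whose hash clears a difficulty target tuned so that finding it costs on the order of $100,000 of compute:

    import java.security.MessageDigest;

    public class UnlockProofOfWork {
        // True if sha256(challenge || nonce) starts with `zeroBits` zero bits.
        // The difficulty (zeroBits) would be calibrated to the target cost.
        static boolean validProof(byte[] challenge, long nonce, int zeroBits) throws Exception {
            MessageDigest sha = MessageDigest.getInstance("SHA-256");
            sha.update(challenge);
            sha.update(java.nio.ByteBuffer.allocate(8).putLong(nonce).array());
            byte[] h = sha.digest();
            for (int i = 0; i < zeroBits; i++) {
                if (((h[i / 8] >> (7 - (i % 8))) & 1) != 0) return false;
            }
            return true;
        }
    }

A per-device price tag like that rules out dragnet use by construction, though whether anyone would accept a dollar-denominated gate on a warrant is a separate question.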
This particular hill that Tim Cook has decided to defend is as important as anything Steve Jobs ever did at Apple.
I don't get it: the shooters are dead. How is what is on their phone a matter of national security? We probably have 99% of the information we'll ever have on them. There is no larger plot. I can't imagine that not having what's on this device puts anyone at risk.
Win/Win
No software backdoor is created, the FBI gets its data and we all go on with our lives. Why are we spending so much time gnashing teeth over something that has a very simple solution to it?
Effectively, the government is forcing Apple to take receipt of a device that it does not own or possess, then perform potentially destructive services on that device, services that could in turn require Apple to testify at a trial under the Confrontation Clause of the Sixth Amendment.
I really think that Apple's in the clear here, and the AUSAs in the case are pulling out all the stops to get Apple ordered to break the encryption.
"Today we celebrate the first glorious anniversary of the Information Purification Directives.
[Apple's hammer-thrower enters, pursued by storm troopers.]
We have created for the first time in all history a garden of pure ideology, where each worker may bloom, secure from the pests of any contradictory true thoughts.
Our Unification of Thoughts is more powerful a weapon than any fleet or army on earth.
We are one people, with one will, one resolve, one cause.
Our enemies shall talk themselves to death and we will bury them with their own confusion.
[Hammer is thrown at the screen]
We shall prevail!
[Boom!]
On January 24th Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984.'"
------------------------------------------------------------
Apple Superbowl AD "1984"
Transcription courtesy of George Gollin, 1997
Edit: Removed the link to the video. My goal wasn't to draw traffic anywhere; it was just to point out that some of the Big Brother sentences in an ad aired 30 years ago still have strong resonance today.
"Our enemies shall talk themselves to death" Hum... just read yesterday that NSA is believed to use machine learning over cell big-data to determine drone target...
https://hn.algolia.com/?q=&query=&sort=byPopularity&prefix&p...
Can a private, for-profit company deny the will of an elected government working to solve a heinous crime, based not on what the government says it will do, but on the fact that it cannot give a 100% guarantee that this is the only time/way the tool will be used? Apple acknowledges that the government says its use is limited to this case, but because there's no guarantee (100% certainty), they feel they can refuse?
If yes, what does that mean as a broader precedent? Are we comfortable with private companies denying an elected government based not on what it agrees to, but on the chance that it'll be used in other ways?
As terribly flawed as one might feel government is, very few would think it has less accountability than a private company.
I can understand someone outside of tech not understanding how those are comparable statements, but if anything the latter is more important.
I've always wondered why large tech companies/corporations abide by such orders instead of speaking out. Even if Apple was under a gag order, they've created a PR nightmare for the alphabet agencies; Apple could be pursued in court, but that pursuit would now likely be done in the face of negative public opinion.
Apple built hardware which was not particularly secure. The software defaults to a four-digit PIN. They attempt to mitigate this by adding an escalating interval between entries, and by optionally wiping the phone after too many failed tries, but this is not set in stone and those limits can be removed with a software update.
The government is coming to Apple and saying, "You can remove these limits. Do that for us on this phone." Coming as a legitimate court order, I see no problem with this request. The government isn't even asking them to crack the phone, they just want Apple to remove the limits so the government can try to brute force it. They're even paying Apple for their trouble.
If Apple didn't want to be put in a position where the government can ask them to compromise their users' privacy, they should have built hardware which even they couldn't crack. And of course they did; starting with the A7 CPUs, the "secure enclave" system prevents even Apple from bypassing these limits. The phone in question just happens to predate that change.
If the government was demanding that Apple do the impossible, I'd be on their side. If the government was demanding that Apple stop manufacturing secure phones, I'd be on their side. But here, all they're asking is for a bit of help to crack an insecure system. They're doing this in the open, with a court order. What's the problem?
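To make the "insecure system" part concrete, here is a rough sketch of the kind of software-enforced policy at issue (my illustration, not Apple's code). On a 5C all of this lives in updatable, Apple-signed firmware, so a new signed build could simply omit the delay and the wipe:

    public class PasscodePolicy {
        private int failedAttempts = 0;
        private boolean wipeEnabled = true;    // the optional 10-strikes wipe

        public boolean tryPasscode(String guess) throws InterruptedException {
            if (wipeEnabled && failedAttempts >= 10) {
                wipeDataKeys();                           // destroy the keys, game over
                return false;
            }
            Thread.sleep(delayMillisFor(failedAttempts)); // escalating delay
            if (hardwareCheck(guess)) return true;        // ~80 ms key derivation per try
            failedAttempts++;
            return false;
        }

        private long delayMillisFor(int n) {
            // no delay for the first few tries, then escalating minutes (illustrative values)
            long[] minutes = {0, 0, 0, 0, 1, 5, 15, 15, 60};
            return (n < minutes.length ? minutes[n] : 60) * 60_000L;
        }

        // Stubs standing in for the hardware-backed parts an update could not change:
        private boolean hardwareCheck(String guess) { return false; }
        private void wipeDataKeys() { /* erase the effaceable key storage */ }
    }

The government's ask boils down to a build where tryPasscode skips straight to hardwareCheck, plus an electronic way to feed in guesses.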
Massive fines? (we know they have the cash to cover it)
Jail time for execs (whoa!)
?
And Edward Snowden just tweeted this a few minutes ago in response to another tweet proposing Google back up Tim Cook: "This is the most important tech case in a decade. Silence means @google picked a side, but it's not the public's."
That way for future phones at least, the issue would become moot: there would be no way for Apple to build and/or install a custom software image that allows brute-force password cracking.
The government would never have access to a phone with a compromised version of an OS that they could use to repeat this trick. Rather, the government would have to obtain court orders and have forensics done under supervision.
This isn't a backdoor and doesn't affect consumers, and sets a really high bar to trying to scale this for the government because it requires Apple as the gatekeeper every time to agree to do the one-off hack.
The cynic in me thinks that this letter is more about brand image. Apple wants to claim they can't hack their own phones, even if the government asks, but clearly in the case of the iPhone 5C it IS possible for them to do it, and this creates a contradiction with their public marketing and privacy position. If they didn't release this open letter, then simply complying with the judge's order would make them look bad.
If the device were truly locked down, there would be no aftermarket solution to unlock it.
My understanding is that Apple was asked to supply software that would prevent their software from destroying evidence on a particular device. They should comply with this order, especially given the device in question.
I see this as just another "it's for the children" ploy, which I'm completely sick of.
In that sense, I fully support Apple et al. for finally growing a backbone. If more people stood up, then I wouldn't have to be naked-body-scanned at the airport, or endure the dozens of other privacy invasions the government performs on a daily basis simply to give itself something to do. Rather than admit they won't ever be able to predict or protect the population in any meaningful way from random people willing to give their lives to make a statement, they waste our time and money coming up with ever more invasive ways to peek into everyone's most private possessions.
By publicly committing Apple to this cause, Cook makes it more likely that internal teams at Apple as well as future versions of the company will adhere to this position. By defining a set of actions which, if made public, would ruin the company's brand, Cook makes it less likely Apple will take those actions.
Is that true? What if it's locked with a secure 128-bit (e.g. 10-word diceware) passphrase?
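(Worked out, under the standard assumption of a 7776-word diceware list: 10 words gives 10 × log2(7776) ≈ 129 bits of entropy. Even with the hardware key extracted and the guessing moved off-device, a keyspace of 2^129 is out of reach, so in that scenario brute force wouldn't help.)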
Good one, Tim! I mean, how long did law enforcement think they could abuse the constitution, put spy devices on people's cars without a warrant, use stingrays, and do all sorts of other crazy stuff, including planning and executing false-flag attacks, without any consequences whatsoever? At some point we the people will, for good reason, lose any and all trust we have in them! And that's what Tim is saying in this one sentence, which, given the overwhelming evidence, the US Gov would have a hard time arguing against!
What parts of the government is a different matter.
This is a perfect setup. Get all the bad guys to run out and buy iPhones (good for Apple) believing that they are safe from the US surveillance machine.
Then the appropriate agency can slurp up whatever it wants.
The FBI doesn't need the modified iOS code, and whether or not Apple writes it doesn't change anything in the end, since someone else could just as well write the software with some reverse engineering.
[edit: if you downvote because I'm wrong, please explain because I'd love to know why]
A) Apple has created unbreakable security. The FBI cannot access the data and needs Apple's help.
B) iPhone security, like all other security, is breakable. iPhones are a very high-value target (all data on all iPhones); therefore some national security organizations, probably many of them in many countries, have developed exploits. The FBI, following normal practice, does not want to reveal the exploits or capability and therefore must go through this charade.
This stance-cum-poetry against the government reaffirms my faith in the genuineness of Apple's encryption efforts and in Tim Cook specifically.
Well, as far as I can see, it is not consistent with the use and principle of law to force a company (or a person, since corporations are people) to spend money and waste resources compromising its own security systems, which happens to be something they morally object to.
So there is already a backdoor. Apple are refusing to let the FBI use the backdoor.
The backdoor is the fact that Apple can push a firmware update that disables security features, without the device's password being entered at any point.
The scary part here is that iPhone data is really not that secure. If Apple can overwrite the OS and get access to the data, that means the keys are stored on the phone somewhere, and are not password protected or "fingerprint" protected.
I would agree with Apple if they wanted the FBI to pre-submit all the passcodes to be guessed, with Apple alone running the brute force, so that using this "backdoor" (which really is nothing more than a door handle) would be as hard as getting Apple's private keys, and governments would never hold the backdoor in their own hands. I would also agree if Apple claimed they don't want to be able to crack devices at a judge's order (although that would be against the law, so they can't claim that).
But this is NOT what Apple said. This whole letter is just one big PR bullshit. They CAN brute force a passcode. They failed to enforce a significant delay on failed passcode attempts - even though this issue had already been known for YEARS (I will give citations if needed) when Apple designed the iPhone 5C in question - and they also failed to require the passcode in order to update the device. They already have their convenient backdoor in place in the form of their private keys.
The Supreme Court has ruled in separate cases: 1. that software is speech, and 2. that a person (corporations are people, according to them) cannot be compelled to speak.
It would seem to me that the FBI could perhaps subpoena technical documentation from Apple but it should be required to hire their own developers to write this software.
http://techcrunch.com/2014/02/27/apple-explains-exactly-how-... (Link to Apple's paper is in the article)
(Yes, Apple could add this key for everybody at the beginning, but if their intention is security then it is a brilliant system.)
But it only works on iOS 8.1 or earlier; it was patched in iOS 8.1.1.
Sounds just like gun control :)
This is a very clearly political refusal. Apple is saying that about as explicitly as they can in this message. Whether or not they can do it, Apple doesn't want to be caught in the game of being a government surrogate or having to determine for themselves if government requests are legitimate (imagine, say, if the Chinese government asked for data from a dissident's phone - would Apple want to risk that market by denying a request that they have complied with in the US?). It's unfortunate for them that the FBI is making this request while people still own phones like the 5c for which they could theoretically disable security features, as opposed to the newer phones which it is possible they are completely unable to defeat.
If they cannot co-exist, I'd rather have more security and less privacy. But ideally, I shouldn't have to choose between them.
Mr. Cook expressed concern that "the government could intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge".
As I read this I wondered, "what harm would actually happen if that occurred"? If the government did read my messages and get my health records & financial data and track my whereabouts, I can't think of anything bad that would actually happen as a result of that.
Is there anything specific that I should be worried about in that scenario?
I wonder if this is a grammar mistake, or whether Apple actually considers the private conversations, notes, and photos to be theirs?
* Does Apple claim the FBI cannot access its devices?
* Can the FBI access its devices?
The only thing we learn here is the answer to the first question. We know nothing more about the second.
So am I missing something that makes the iPhone's internal security architecture relevant here?
To honour Tim, and his advocacy for our industry, I'm going to spend the rest of my week developing privacy/security projects. I encourage everyone else to do likewise.
The court order gives Apple an out: "To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief".
Now, imagine if this was court ordering a company to engage in unethical medical procedures, rather than unethical software development. The professional medical community would sanction doctors that cooperated and support those that stood by their ethical principles and refused to cooperate. If there was a similar professional organization for software development, Apple could reasonably rebut that telling their engineers to work on this would be unreasonably expensive (since they'd expect to fire people or have them resign over it).
This is another avenue for fighting the order - have a good chunk of Apple's engineering department sign an open letter saying that they'd resign before working on that project. The incentives seem like they'd work for making it a thing.
Apple could propose to secure the FBI's access using the same level of security that it uses to protect access to the phone's content for the owner of the phone himself. Tim Cook only talks about one solution, a "tool" that Apple could install.
If the same level (and method) of security were used, then saying that there is a risk of the backdoor being hacked would be equivalent to saying that there is a similar risk of the user's own access being hacked.
(guess answer: iOS needs to be signed. So what they are really asking of Apple is to sign a lobotomized iOS image...)
I am now, officially, an Apple fanboy. That's right: I'm gloating to family and friends about how Apple is standing up to the man, doing the right thing, and refusing to compromise their security.
Keep up the good fight.
https://www.whitehouse.gov/contact
Here is my letter to them:
Dear President Obama,
I've voted for you in both elections and have been a firm supporter of all your causes (the Affordable Care Act, and more). However, your FBI has clearly overstepped its authority by demanding that Apple spend engineering resources building a software product that can break the encryption of a terrorist's iPhone.
Seriously, you need to stop this. You are the head of the executive branch of the government, and the FBI falls directly under your jurisdiction. Director James Comey is directly within your chain of command.
What the FBI is asking for is a master key to be created that can decrypt any iPhone. This makes all Americans with Apple devices insecure in the face of threats to our personal security and privacy. I hope you can understand that this is clearly unacceptable, and needs to be stopped.
I want to register my complete opposition to the FBI in this circumstance. Please stop this.
Thanks, xxxx
If Apple cared about customers' privacy and security so much, how could they sell non-free software that is hard to audit, computers with baseband processors, and services that rely on a central server, creating a single point of failure?
My understanding is that Apple customers don't care much about their own privacy and security, but they do have a weakness for marketing.
1)http://timesofindia.indiatimes.com/tech/tech-news/telecom/Go...
2) http://www.reuters.com/article/us-blackberry-saudi-idUSTRE67...
3) http://www.thestar.com/business/2010/08/16/threats_of_blackb...
Thank you!