Apple vs. the FBI

Discussion in 'Off-Topic' started by PurpleTop, Mar 18, 2016.

  1. PurpleTop

    PurpleTop I need me some PIE!

    Anyone else following this? I'm not a fan of Apple, at all... I really don't like Apple. But I am 200% on their side in this debacle with the FBI. The FBI wants Apple to make a new version of iOS without the security feature (erases the phone memory after too many failed passcode attempts), so that they can install it on the San Bernardino terrorists' phone, thus giving them access. On its merits it sounds good, they wanna stop the terrorists, but all it really does is set a precedent for the same actions to be taken on any phone in question by authorities, thus tearing down the entire reputation Apple has with its valued customers. The FBI claims it will only be used once... But their handling of the case and response to Apple's reluctance only showed their desperation, and it is kinda scary. Now the FBI is resorting to soft demands. Posting from my phone while on the toilet, so I don't have links yet, but I will work on that.

    Basically, the FBI can demand Apple turn over its source code and digital signature... something they have spent YEARS constructing and protecting. At the same time, we aren't allowed to look at the source code of our own voting machines because it's considered a "trade secret".

    Ahh, America
     
    Last edited: Mar 18, 2016
  2. BurnPyro

    BurnPyro Forum Royalty

    following it

    outcome will have huge consequences going forward
     
  3. PurpleTop

    PurpleTop I need me some PIE!

    Absolutely, and not just in the USA. One of the most terrifying cases of our lifetime.
     
  4. Dagda

    Dagda Forum Royalty

    is apple not able to set up the operation/procedure so that there's just one copy, which they administer and maintain control over, so on and so forth?

    or is the worry that doing this once might open the way for it to be done again, regardless of circumstance
     
  5. StormChasee

    StormChasee The King of Potatoes

    There is a solution that should be acceptable to the FBI and Apple. Clearly there is probable cause to investigate what's on the phone. Since this is a specific investigation of a particular terrorist attack, Apple should break the security code on that phone and provide the FBI with the info that's on the phone. That way the security code itself is protected while the FBI gets the info it needs to do the investigation. This shouldn't be as big a deal as it is. The FBI doesn't need the security code. They only need the info. There's no excuse for not doing that provided it's technically possible.
     
  6. Sokolov

    Sokolov The One True Cactuar Octopi

    The problem is the following:
    • breaking the encryption on this phone basically requires breaking the encryption scheme they are currently using (it's not a per-phone thing)
    • there is no good way to ensure that the tool made to breach it can be secured
    • compliance sets a precedent, and it's been implied by numerous agencies (for example, the New York PD has over 100 phones they want cracked) that they want to use this in other cases
      • statements made by the FBI have implied that this isn't meant to be a one-time thing
    Think of this like the Allies cracking the German Enigma code. Once you've figured it out, you can use the same method to decode all other messages encrypted in the same manner. It's not really possible to just do it to one phone, because what keeps the information inaccessible isn't anything specific to that one phone.
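    The Enigma point above can be illustrated with a toy shared-key cipher (a repeating-key XOR standing in for a real scheme; this is purely illustrative, not Apple's design): once an attacker recovers the key/method from a single message, every message protected the same way falls.

```python
# Toy illustration: recovering the key from ONE ciphertext breaks EVERY
# message encrypted under the same scheme. XOR with a repeating key is a
# stand-in for a real cipher; the structural point is what matters.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XOR-ing data against a repeating key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"secret"
msgs = [b"attack at dawn", b"meet at bridge"]
ciphertexts = [xor_cipher(m, key) for m in msgs]

# An attacker who learns one plaintext/ciphertext pair derives the key...
recovered_key = xor_cipher(ciphertexts[0], msgs[0])[:len(key)]

# ...and can now read ALL other traffic under the same key:
assert all(xor_cipher(c, recovered_key) == m
           for c, m in zip(ciphertexts, msgs))
```

    The same logic is why "just this one phone" is hard to promise: the weakness lives in the scheme, not the device.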
     
    Last edited: Mar 18, 2016
    SPiEkY, BurnPyro and PurpleTop like this.
  7. Dagda

    Dagda Forum Royalty

    i guess my question is "why is this bad?"

    i realize that people want their right to privacy, etc etc, but let's say that instead of a phone it was suspected that important data was inside a heavily secured safe (in the aftermath of some large crime or another)

    do we expect the safemakers to not help at all (provided there's a warrant that includes breaking of the safe)? do we expect the safe to be enough by itself to say "no, this is too important"?

    i dunno. if laws can't keep up with digital era tech, i don't really see how this whole "civilization" thing is gonna work out.


    obviously i'm speaking from ignorance on a few points, but i don't think this is a bad thing so long as a proper bar is met before breaking privacy
     
  8. Ohmin

    Ohmin Forum Royalty

    It's the difference between the safe-maker going over and opening that single safe; versus giving them a key to open all the safes that person ever made (and also being unable to ensure that key never got leaked, lost, duplicated, etc.)

    Chain of custody issues are possible here. The FBI needs to ensure (to the courts) that the data wasn't somehow modified by Apple, and Apple needs them to not know how to break into their phones (so observation is tricky).

    However, I do have the feeling that the FBI is using this as an excuse to gain a new tool, and it's likely that something could be worked out to satisfy their needs without giving them a chance at a "key."
     
    Last edited: Mar 18, 2016
  9. PurpleTop

    PurpleTop I need me some PIE!

    Exactly. The safemakers wouldn't be able to go make a key that works for just that safe; they'd just have a master key that works for all safes. And even if they ask for it back, who's to say the authorities didn't make a copy of the key first?

    Sok laid it out pretty well. The main issue is this sets a precedent for any government to use the same procedure with no limitations necessarily set.
     
    Geressen likes this.
  10. Dagda

    Dagda Forum Royalty

    that's what drives my stance on this- i feel like it's highly possible apple can do a one-off here if they're given the power over the situation. if the fbi is just not so subtly trying to gain more tools for everyday use, that's a different thing.
     
  11. Ohmin

    Ohmin Forum Royalty

    Sure, but that's also based on the assumption that the tech can do it without there being an unreasonable risk. I don't know enough about Apple's encryption scheme (or encryption in general) to be certain that assumption is safe (I just think it's likely possible because technology is generally pretty nifty like that).

    Provided of course also that the FBI does have a proper warrant to search the phone and they didn't skip some important steps there (like getting a court order).

    Which is my way of saying I'm aware of the situation, but haven't followed the specific details.
     
  12. Geressen

    Geressen Forum Royalty

    okay but I think you are a terrorist... your phone please?

    you too @Dagda , hand it over.

    purplehat and socks get it.
     
  13. Sokolov

    Sokolov The One True Cactuar Octopi

    So, for more details, the FBI isn't specifically asking for a master key, per se, or to break the encryption protocols.

    Here's the court order: https://assets.documentcloud.org/documents/2714001/SB-Shooter-Order-Compelling-Apple-Asst-iPhone.pdf

    iOS currently has safeguards against someone trying to access your phone without knowing your passcode.

    The pertinent part of that document more or less reads:
    1. [Apple] will bypass or disable the auto-erase function whether or not it has been enabled;
      • iOS can delete the data on the phone permanently if you enter the passcode wrong too many times, so this would allow more wrong guesses
    2. [Apple] will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and
      • this would allow the FBI to hook it up to a computer and send passcodes that way (instead of having to enter them manually)
    3. [Apple] will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.
      • this will allow the FBI not to be functionally locked out (even if the data isn't erased), as the phone otherwise introduces greater and greater delays between attempts for each wrong attempt
    So what the FBI wants to do is to guess the passcode as many times as they want, and as quickly as they want - i.e. so they can brute force the phone.
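    To see why removing the limits matters so much, consider the raw numbers (a back-of-the-envelope sketch; the ~80 ms figure is Apple's published per-attempt key-derivation cost from its iOS security documentation of that era, and is an assumption here):

```python
# Why unthrottled electronic guessing breaks short passcodes: a 4-digit
# passcode has only 10,000 combinations, and each guess costs roughly
# 80 ms of hardware key derivation (assumed figure from Apple's docs).

KEY_DERIVATION_SECONDS = 0.080  # assumed ~80 ms per passcode attempt

def worst_case_hours(digits: int) -> float:
    """Hours to try every numeric passcode of the given length, no delays."""
    attempts = 10 ** digits
    return attempts * KEY_DERIVATION_SECONDS / 3600

for digits in (4, 6, 8):
    print(f"{digits}-digit passcode: "
          f"{worst_case_hours(digits):.1f} hours worst case")
```

    A 4-digit passcode falls in well under an hour once the erase and delay safeguards are gone, which is exactly the brute-force scenario described above.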

    The FBI has requested the above as an iOS update that they will allow Apple to do on their campus with secure keys (so that the FBI cannot, in theory, take that same iOS to other devices).

    It seems likely, from a technological standpoint, that such a version of the OS can, in theory, be kept safe. But it's also scary for something like this to exist, even if it remains "contained" within Apple, particularly for future requests made by the government.

    However, the real problem is that an iOS update alone cannot accomplish all of this: it can deal with points 1 and 2, but the 3rd point is controlled by a hardware piece on the phones that isn't tied to iOS. It basically locks you out over time. Whether or not Apple can break this part safely, and only on this phone, is highly questionable.
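    The escalating-delay safeguard in point 3 can be sketched like this (delay values follow Apple's published iOS security documentation of that era; treat them as illustrative, not a spec of the hardware in question):

```python
# Sketch of iOS-style escalating delays after consecutive wrong passcode
# attempts (illustrative values: none for attempts 1-4, then 1 min, 5 min,
# 15 min, 15 min, and 1 hour for every attempt from the 9th onward).

DELAYS_MINUTES = {5: 1, 6: 5, 7: 15, 8: 15}

def delay_after_attempt(n: int) -> int:
    """Minutes the device waits after the n-th consecutive wrong guess."""
    if n < 5:
        return 0
    return DELAYS_MINUTES.get(n, 60)  # 9th attempt onward: 1 hour each

# Total enforced waiting just to make 20 wrong guesses:
total = sum(delay_after_attempt(n) for n in range(1, 21))
print(f"{total} minutes ({total / 60:.1f} hours) for 20 attempts")
```

    With delays like these, even 20 guesses costs over half a day, so a 10,000-combination brute force is hopeless unless this mechanism is also defeated; that is why the hardware-enforced part is the crux.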

    ~

    Note that while this isn't a master key, it is, in theory, somewhat worse as a precedent and is functionally the same thing, because it means that safeguards against hacking (in this case brute force) can be ordered to be removed - and the best lock in the world doesn't help you if you give someone an infinite number of keys to use on it.
     
    Last edited: Mar 19, 2016
    SPiEkY, Geressen and Ohmin like this.
  14. StormChasee

    StormChasee The King of Potatoes

    Valid points are being made. The technology WRT the phone itself is beyond me.

    The basic issue here comes down to the point that no one has the right to perfectly hide evidence of a crime, any more than the government has the right to search you anytime they damn well feel like it. The Constitution is an attempt to protect individual rights while at the same time allowing the government to investigate when appropriate.

    Based on Sokolov's post, this appears to be a reasonable approach, and Apple should comply.
     
  15. Gaverion

    Gaverion I need me some PIE!

    I actually changed my opinion after watching this

    (embedded video)
    TL;DR: making a "backdoor" makes it easier for scammers/those who would like to access your financial information.
     

Share This Page