One must understand that E2EE is used when you don't trust your service provider to handle your data. In other words, the adversary in your threat model is the service provider - and in this case, Apple. And what good is that encryption, if Apple obviously can do almost anything with your device?
They can remotely wipe apps. They can force-install apps and force updates. It is not too far-fetched to think that they can just remotely copy anything stored on your device to their servers. So, with an adversary that capable, I'm not sure encrypted backups provide a meaningful improvement to security and privacy.
> In other words, the adversary in your threat model is the service provider - and in this case, Apple. And what good is that encryption, if Apple obviously can do almost anything with your device?
The adversary in this threat model isn't the service provider. The adversary is someone attacking the service provider, like a hacker or a government with a warrant, and getting access to Apple's storage of your data.
Now of course it's not impossible for such an adversary to also defeat other systems at Apple and get your data another way, for example by controlling Apple's ability to send over-the-air updates to Apple devices. But I think that is a sufficiently distinct threat that it's not worth dismissing solutions to the first threat. That would be like dismissing the importance of a web server storing passwords salted and hashed, since attackers could just use a totally different attack to bypass the web server's database access control. Another way to illustrate this might be to point out that attackers can physically coerce you to hand over data regardless of any security measures any service provider could possibly make, but that doesn't mean we should dismiss all such security measures.
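The salted-and-hashed scheme mentioned above is worth spelling out, since it illustrates why a defense is valuable even when other attacks exist: stealing the database alone no longer yields passwords. A minimal sketch using only Python's standard library (the iteration count and parameters here are illustrative, not a production recommendation):

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; tune the cost to be slow for attackers


def hash_password(password: str, salt: bytes = None):
    # A fresh random salt per user means two identical passwords
    # hash differently, defeating precomputed rainbow tables.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)
```

An attacker who dumps this table must brute-force each password individually; an attacker who instead compromises the login path bypasses it entirely, which is exactly the "distinct threat" distinction.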
Remember Lavabit [0]? Will Apple choose to shut down rather than comply [1]? If the government comes with a warrant, it will come with a gag order, and Apple will be compelled to silently update your phone to extract whatever the government needs over the course of a few months.
What is your actual point here? It feels like we’re just playing a game of hypotheticals that are no longer based in reality.
Sure, Apple could update your device to send all your photos to them unencrypted. They could also remotely turn on the mic and spy on all of us. They could also add keyword detection to iMessage and alert law enforcement if you text the wrong words.
I think everyone here understands what Apple could do. Which is why it’s a good thing that the signs point to Apple not wanting their customers’ data. And why Apple refusing government orders that they feel violate their customers’ interests is unequivocally a good thing (even if they’re doing it for selfish reasons).
My point is that E2E encryption by a third party does not give you privacy from the US government if that third party can remotely control or update your device and is subject to US law. It is a direct reply to the assertion made in the GP: "The adversary is someone attacking the service provider, like a hacker or a government with a warrant, and getting access to Apple's storage of your data."
> will Apple choose to shut down rather than to comply
Apple will probably comply, just like I would probably comply rather than go to jail or suffer injury to myself or my loved ones. But I think it's fair to treat that as a distinct threat.
I disagree - the service provider should be considered an adversary and their service - and your tooling - should make it possible to obfuscate every single bit of data and metadata that you store there.
rsync.net is great and I've always appreciated the exposed ZFS capability, even if at this point 3x the cost per GB for small-scale users vs B2 is a lot more painful. Having encryption, including for transfers, be part of the filesystem (which is open source) is great. It's a pity that, but for a small turn of history, ZFS didn't become Apple's native filesystem. And I think backups in particular are one of the completely unambiguous areas where Apple really has behaved in textbook anticompetitive fashion; they should be required to allow people to point their iOS devices at any 3rd party service (including their own!) that implements the right API (which Apple should have to document and follow themselves).
Still with all that said:
>I disagree - the service provider should be considered an adversary and their service - and your tooling - should make it possible to obfuscate every single bit of data and metadata that you store there.
If you're using Apple devices at this point then I think they do unavoidably form some part of your core trust foundation. With current hardware Apple is everywhere in the stack right down to the CPU level, heck arguably below that since they have a special license with ARM and can implement their own custom extensions. If you really think they're an adversary to the point of doing custom backdoors explicitly going after you, then the hardware just can't be trusted.
It's not unreasonable though to look at both Apple's incentives and the state of American law at least and see distinctions between Apple being compelled (or hacked) to provide something they have passive access to on their side anyway vs being compelled to engage in non-consensual active work and feature development (or having that slipped in and make it into general deployment) on things that necessarily must go out to end user devices. The former is both bog standard warrant/subpoena territory and not inherently detectable outside of Apple and the government, since it doesn't directly involve the user as a party at all. The latter is very arguably illegal and provokes far more public response, and involves deploying in ways that make it far harder to keep concealed (and open up other avenues of challenge).
I don't get it. If you don't trust Apple, then you don't take photos with an iPhone. There is no possible service they could offer that assures you every bit of data and metadata is obfuscated end to end in any sense of before Apple software has a chance to see it. At bare minimum, the camera app has to put together a file before there is anything to encrypt. A malicious Apple could just keep a second copy of that file, and even if you used a different backup service, they'd still have it.
However, as with all things here, you can just email and discuss with a real person and we'll set you up the way you need to be set up wrt billing and pricing, etc.
I think that's a separate issue. I'm not saying that Apple or any other service provider should not be considered a potential adversary. I'm saying it's still a good thing for service providers to implement solutions to threats.
I think the right way to advocate for this really is to focus on the warrant aspect. It’s not about preventing law enforcement but keeping it above board where there’s at least the possibility of oversight and targets can exercise their rights to things like legal representation.
I think it mostly matters in the context of US case law, specifically the third party doctrine.
> The third-party doctrine is a United States legal doctrine that holds that people who voluntarily give information to third parties—such as banks, phone companies, internet service providers (ISPs), and e-mail servers—have "no reasonable expectation of privacy" in that information. A lack of privacy protection allows the United States government to obtain information from third parties without a legal warrant and without otherwise complying with the Fourth Amendment prohibition against search and seizure without probable cause and a judicial search warrant.
There are multiple meanings of trust in this scenario: belief in honesty, and confidence in ability. E.g. I can trust you to tell me the truth but not trust you to protect me from a missile.
I trust Apple’s honesty. I don’t trust many attack vectors. Someone could gain access to their data centers; E2EE protects against that. A government could legally compel them to provide data; I trust them when they say they’ve engineered it in such a way that they can’t currently do it, and that they would publicly cause a scene and a legal battle if it were attempted, as they have before. Accidental data leaks also happen. In all these scenarios I trust Apple’s intentions but know that nothing is perfect. E2EE adds a lot for me.
Also, companies like Apple are huge, with thousands of staff.
These protections aren't there to protect you from "Apple", but Apple staff.
So for example if someone at Apple has been compromised by a foreign state, they can't copy sensitive customer data just willy nilly. They'd have to jump through a lot of hoops that would be prohibitively difficult.
Google had issues like this in the past where some employees were sending data to the Chinese government. E.g.: information about dissidents, political opponents in Taiwan, etc...
This is one of the reasons Google encrypts even internal server-to-server traffic, because the threat is on the inside of the firewall!
In theory it adds a speed bump. Apple, as the cloud service provider, can respond to the legal order by saying they don't have the key. Then the police can ask for a booby-trapped update for just your phone, which may or may not happen. Or they can lobby the legislature for an encryption backdoor for all devices, which will force them to show their hand in terms of "lawful intercept" capability.
If you want maximum security use an air gapped computer. But that won't let you send messages on the go.
> If you want maximum security use an air gapped computer. But that won't let you send messages on the go.
You can, with some inconvenience, use optical diodes to transmit data from a trusted input device to an untrusted network device for transport over tor, and then push the received messages over a second diode to a display device that decrypts the messages, so that even if you receive an exploit/malware, there is no physical connection that allows unencrypted data to be exfiltrated.
If you want maximum security then just obviously don't use Apple services, or any other provider that has a capability to fetch your data under any circumstances.
Starting in May next year, the Digital Markets Act [1] requires Apple to "allow the installation of third-party software applications [...] by means other than the relevant core platform services of that gatekeeper."
I'm still on the fence about whether this will end up being a net good or not, but people don't seem to consider the potential knock-on effects of this. Apple puts some nice pro-consumer, along with some less nice anti-developer, requirements on apps in the App Store: easy subscription management, privacy disclosure, parental controls, etc. If the developers of an app decide to only make it available outside the App Store, you as a consumer may be forced to choose between using that app and getting those benefits.
> If the developers of an app decide to only make it available outside the AppStore you as a consumer may be forced to choose between using that app and getting those benefits.
And Apple already chooses the reverse for you, by not allowing apps you may want and by charging a 30% tax for the privilege. There is a vast disparity between the two behaviors!
It won't help to download apps onto an iPhone which, I must say, isn't even yours: you don't get to decide which apps you can install on your phone; Apple does. Factually speaking, you're merely renting the iPhone from Apple, which, being the device's owner, decides the terms under which you can use it.
In practice this distinction is meaningless. In fact I trust Apple more than my own government. To take your argument to an absurd logical conclusion, I don’t own ANYTHING because my government can take it.
It is known that Apple will do quite a lot of what governments ask of it. It removes apps from national App Stores on a simple request from countries like China or Russia. (Well, now Apple might ignore Russian takedown requests, but prior to the war with Ukraine they were very receptive to their demands.)
This is why side-loading and the option for alternative app stores is so crucial. If Apple bans Signal or other E2EE messenger apps from your national app store, you can't get them. Full stop.
If people in China and other privacy-hostile countries can side-load from alternative app stores (like F-Droid for Android), the government and Apple don't control user access to particular undesirable apps.
There are obviously reverse concerns to this side of the coin, but the overall concept has arguably always existed with jailbreaking (Cydia, AltStore), and I haven't heard any stories about people becoming massively compromised in the way all the naysayers and Apple would have us believe.
Yes, I have heard of the GDPR and in my opinion it has improved/consolidated my digital privacy rights and not affected the "web browsing experience" in any negative way. I believe you are referring to the ePrivacy Directive (aka cookie law). As you may know, it's only mandatory to inform the user when the website is collecting information from the user beyond what is necessary for technical purposes - and in that case I do want the option to refuse that.
They don't have to lobby anyone for this. Apple has operations in Australia. We have laws here under which the government can force you to put a backdoor in software or hardware, and you are not allowed to tell anyone, even your employer, that you have been asked to do so.
To be honest, in theory Apple isn't allowed to tell you whether they have done it or not. So their phones have probably been backdoored for a few years now at the request of the Australian government.
I would not be surprised if there is a backdoor already. Either explicitly ordered or secretly inserted like Dual_EC_DRBG. They’re not burning a zero day vulnerability or certificate authority just to convict one defendant. They’re saving them for something like Stuxnet.
Nothing is secure. Once we remember that, we'll stop nitpicking improvements.
Use your own server? Great, it's secure software-wise, but if someone broke into your house, it's all of a sudden the worst liability ever. The next thing you know, your entire identity, your photos, everything is stolen. You have excellent technical security, but perhaps the weakest physical security.
So new plan, you use a self-hosted NextCloud instance on a VPS somewhere. That's actually not much smarter than using iCloud - VPSs handle data warrants all the time. They also move your data around as they upgrade hardware, relocate servers, and so forth.
So new plan, you use iCloud E2E encryption. You have to trust that Apple does as they say, and trust that their algorithms are correctly functioning. Maybe you don't want to do that, so new plan:
You use a phone running GrapheneOS, with data stored on a VPS, with your own E2E setup. Great - except you need to trust your software, and all the dependencies it relies on. Are you sure GrapheneOS isn't a CIA plant like ArcaneOS was? Are you sure your VPN isn't a plant, like Crypto AG? And even if the VPN is legitimate, how do you know the NSA doesn't have wiretaps on data going in and out, allowing for greatly reducing the pool of suspects? Are you sure that even if the GrapheneOS developers are legitimate, the CIA hasn't stolen the signing key long ago? Apple's signing key might be buried in an HSM in Apple Park requiring a raid, but with the GrapheneOS developer being publicly known, perhaps a stealth hotel visit would do the trick.
So new plan: you build GrapheneOS yourself, from source code. Except, can you really read it all? Are you sure it is safe? After all, Linux was nearly backdoored with only two inconspicuous lines hidden deep in the kernel (the 2003 CVS incident). So... if you read it all, and verify that it is perfect, can you trust your compiler? Your compiler could have a backdoor (remember Ken Thompson's "login" demo from "Reflections on Trusting Trust"?), so you've got to check that too.
At this point, you realize that maybe your code, and compiler, is clean - but it's all written in C, so maybe there are memory overflows that haven't been detected yet, so the CIA could get in that way (kind of like with Pegasus). In which case, you might as well carefully rewrite everything in Rust and Go, just to be sure. But at that point, you realize that your GrapheneOS phone relies on Google's proprietary bootloader, which is always signed by Google and not changeable. Can you trust it?
You can't, and then you realize that the chip could have countless backdoors that no software can fix (say, with Intel ME, or even just a secret register bit), so new plan. You immediately design and build your own CPU, your own GPU, and your own silicon for your own device. Now it's your own chip, with your own software. Surely that's safe.
But then you realize there's no way to verify, even after delidding the chip, to verify that the fabrication plant didn't tweak your design. In which case, you might need your own fabrication plant... but then you realize that there's the risk of insider attacks... and how do you even know those chip-making machines are fully safe? How do you know the CIA didn't come knocking and make a few minor changes to your design, and then gag the factory with a National Security Letter from giving you any whiffs about it?
But even if you managed to get that far, great, you've got a secure device - how do you know that you can securely talk to literally anyone else? Fake HTTPS certificates from shady vendors are a thing (TrustCor?). You've got the most secure device, but it's terrified to talk to anybody or anything. You might as well start your own Certificate Authority now and have everyone trust you. Except... aren't those people... in the same boat now... as yourself... And also, how do you know the NSA hasn't broken RSA and the entire encryption ecosystem with those supercomputers and mathematicians of theirs? How do you know that we aren't using a whole new Dual_EC_DRBG and that Curve25519 isn't rigged?
The rabbit hole will never end. This doesn't mean that we should just give up - but it does mean we shouldn't be so ready to nitpick the flaws in every step forward, as there will be no perfect solution.
Oh, did I mention your cell service provider always knows where you are, and your identity, at all times, regardless of how secure your device is?
Edit @INeedMoreRAM:
For NextCloud, from a technical perspective it's fantastic, but your data is basically always going to be vulnerable to either a technical breach of Linode, an insider threat within Linode, or a warrant served (either a real warrant, or a fraudulent warrant, which can happen).
You could E2E encrypt it with Nextcloud (https://nextcloud.com/endtoend/), which would solve the Linode side of the problem, but there are limitations you need to look into. Also, if a warrant was served (most likely authentic if police physically show up; at least more likely than one under which your data was quietly handed over), you could always have your home raided, your recovery keys found, and your data accessed that way. Of course, you could destroy the keys and rely only on your memory - but what a thing to do to your family if you die unexpectedly. Ultimately, there's no perfect silver bullet.
Personally... It's old school, I use encrypted Blu-rays. They take forever to burn, but they come in sizes up to 100GB (and 128GB in rare Japanese versions), they are physically stored in my home offline, and I replace them every 5 years. This is coupled with a NAS. It's not warrant-proof but I'm not doing anything illegal - but it is fake-warrant-resistant and threats-within-tech resistant, and I live in an area where I feel relatively safe (even though this is, certainly, not break-in-proof). Could also use encrypted tape.
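One thing worth adding to a cold-archive routine like this: optical media degrade silently, so a checksum manifest built at burn time and re-checked at each refresh catches bit rot while a good copy still exists. A rough sketch in Python (the helper names are mine; the 5-year cadence and disc/NAS split are from the setup above):

```python
import hashlib
from pathlib import Path


def build_manifest(root: Path) -> dict:
    # Map each file's path (relative to the archive root) to its SHA-256 digest.
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            rel = str(path.relative_to(root))
            manifest[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return manifest


def find_corrupted(root: Path, old_manifest: dict) -> list:
    # Files whose digest changed (or which disappeared) since the last refresh.
    current = build_manifest(root)
    return [p for p, digest in old_manifest.items() if current.get(p) != digest]
```

Store the manifest with the NAS copy rather than only on the disc it describes, so a failing disc can't take its own evidence with it.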
I run Nextcloud on a RPI at home with fail2ban, brute force protection, MFA, and E2EE which is backed up remotely using encrypted Borg backup. The 4TB SSD drive safely serves my friends and family too. My laptop and Graphene phone's files, apps and settings are backed up automatically to it daily. I have too many apps installed on Nextcloud to list, but it is basically an all in one solution to your cloud needs.
Both Nextcloud and GrapheneOS are FOSS which addresses your concern about it being a government trap.
My partner is able to access my Bitwarden account if I were ever to be indisposed.
Sure nothing is perfect, but tell me how this is not a better solution than trusting the closed source ecosystem of the biggest corporation in the world.
“Both Nextcloud and GrapheneOS are FOSS which addresses your concern about it being a government trap.”
I was merely referring to the fact that unless you build the code yourself, you have no certainty that a government has not stolen a FOSS signing key and shipped a custom hacked build to your device. Unlikely? Yes. Possible? Yes. Also, backdoors, as seen in the 2003 Linux incident, can be as hidden as a deliberately missing equals sign in one line of code - so a sneaky government commit with the smallest backdoor could go undetected even in FOSS. I still think it’s better than proprietary - don’t get me wrong - but it’s not invincible, which was my main point about how security does not end.
Right, but nobody can write all the code they need for every service. I agree nothing is invincible. We put varying degrees of trust in people and processes of communities who maintain the SW. FOSS requires much less trust than proprietary SW developed by megatech.
> Use your own server? Great, it's secure software-wise, but if someone broke into your house, it's all of the sudden the worst liability ever.
this doesn't invalidate the rest of your point, but if your data isn't encrypted at rest on your own hardware, that one particular point? that's your own fault.
You will need some kind of remote unlocking mechanism. Imagine you are abroad and the power at home goes out for a short period. How do you boot remotely and mount the encrypted filesystem?
Not an easy task. You will need some kind of Dropbear SSH server in the initramfs that you dial into to enter your encryption key. Many moving parts. Don't get me started on keeping the packages updated for security fixes.
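For the curious, on Debian-based systems this is usually done with the dropbear-initramfs package alongside LUKS. A sketch of the moving parts (the host name and key path are placeholders, and the authorized_keys location varies between Debian releases, so treat this as an outline rather than a recipe):

```sh
# On the home server: put a tiny SSH server into the initramfs
apt install dropbear-initramfs
cat ~/.ssh/id_ed25519.pub >> /etc/dropbear/initramfs/authorized_keys
update-initramfs -u

# From abroad, once the machine powers up and halts at the LUKS prompt:
ssh root@my-home-host        # drops you into the initramfs BusyBox shell
cryptroot-unlock             # asks for the passphrase, then normal boot resumes
```

Dropbear only listens during early boot; once the real system is up, the regular OpenSSH daemon takes over, which is part of why there are so many moving parts to keep patched.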
I've been running my own Nextcloud instance on a Linode with 2FA and your response made me question how secure it is.
Even though I get an A+ on the Nextcloud Security Scan (https://scan.nextcloud.com/), have 2FA, and custom IP blocking set up in my .htaccess file, it's disheartening to know that I'm not as secure as I thought I was.
I removed all my photos/files from iCloud for privacy reasons, and now I feel helpless contemplating how Linode may just hand my data over if served a warrant.
Any other Nextcloud hardening tips besides Fail2ban and reverse proxying you'd recommend? May I ask what your workflow looks like for preserving files throughout time?
Nextcloud has three recommended add-ons that you can install in a few clicks:
-Brute force protection
-End to end encryption
-Multi-factor Authentication
> And what good is that encryption, if Apple obviously can do almost anything with your device?
Because Apple isn’t in control of Apple’s data at rest, and that’s the specific risk.
You have to trust control of the device, sure, but you cannot trust cloud data - almost at all - between subpoenas from overeager LEOs and break-ins from criminal and state hackers.
> Because Apple isn’t in control of Apple’s data at rest
That's not really true if Apple also holds copies of your iCloud decryption keys. If they want to access your data, they already have all the necessary components.
Now we're going in full circle, so I'll just point you to the parent thread:
> One must understand that E2EE is used when you don't trust your service provider to handle your data. In other words, the adversary in your threat model is the service provider - and in this case, Apple. And what good is that encryption, if Apple obviously can do almost anything with your device?
Ironic, since if you follow the thread you'll learn that since Apple still has complete control of your device, it essentially still has access to the keys.
Let me rephrase: by giving Apple control over the keys, you give control over the data to whoever controls Apple (a non-empty set, e.g. law enforcement) and to whoever may gain control (e.g. via a security vulnerability).
Apple isn't a monolithic entity. For example, a rogue engineer might be able to access your iCloud data, but it's orders of magnitude more complicated to push a specifically manufactured app to your device.
There's a similar variance of complexities for hacking and law enforcement overreach scenarios.
E2EE isn't a solution for all attack vectors, but it's a significant mitigation in itself.
Technically no. I still have Fortnite on my iPhone, it just can't be opened. Apple can't wipe apps from your phone, but if they're App Store installed (as opposed to Ent MDM/Sideloaded), they can render them inoperable by revoking the certificate attached to the bundle.
It's all a closed source jumble though. Even if they can't do it right now, they have the power to install an update that allows them to add that power, if they had to.
What's the functional difference between "remotely deleting" and "remotely rendering inoperable"?
Remotely deleting probably just exposes them to all kinds of legal issues, since it would wipe user data too (which you can otherwise possibly still extract, e.g. through the "Files" app).
What’s missing is context: Fortnite’s account is in breach of the agreement and can’t deliver updates to address issues with the latest version of iOS.
This is identical to any developer that doesn’t deliver updates or suspends their developer account.
Those who have downloaded Fortnite at least once can still download and use the game on earlier versions of iOS, and even on iOS 16 by following certain mitigations.
Contrary to some online posts, Apple hasn’t done anything unique to the Fortnite account.
One must also understand that you're wrong. My threat model isn't Apple. My threat model is
a) Overreaching law enforcement that wants to take a look at what I'm up to.
b) A data breach at Apple exposing all my data.
c) Errors where my pictures end up in another user's photo album, as happened on Google Photos once.