You have to install their MDM device management software on your computer. Basically that computer is theirs now. So don't plan on just handing over your laptop temporarily unless you don't mind some company completely owning your box. It might still be a valid use for people with slightly old laptops lying around, but beware of sharing this computer with your daily activities, e.g. if you regularly use a bank in a browser on it. MDM means "they can swap out your SSL certs" level of computer access, please correct me if I'm wrong.
MDMs on macOS are permissioned via AccessRights, and you can verify that their permission set is fairly minimal and does not allow what you've described here (bits 0, 4, 10).
That said, the privacy posture at the cornerstone of their claims is snake oil with gaping holes in it, so I still wouldn't trust it, but it's worth being accurate about exactly how they're messing up.
You are right - the "nonce binding" the paper uses doesn't seem convincing. The missing link is that Apple's attestation doesn't bind app-generated keys to a designated requirement, which would be required to create a full remote attestation.
> If you can prove a public key is generated by the SEP of a machine running with all Apple's security systems enabled, then you can trivially extend that to confidential computing because the macOS security architecture allows apps to block external inspection even by the root user.
It only effectively allows this for applications that are in the set of things covered by SIP, not for any third-party application. There's nothing that will allow you to attest that arbitrary third-party code is running a specific version without being tampered with; you can only attest that the base OS/kernel has not been tampered with. In their specific case, they attempt to patch over that by taking the hash of the binary, but you can simply patch it before it starts.
To do this properly requires a TEE to be available to third-party code for attestation. That's not a thing on macOS today.
I wiped my post because you are right. I don't think it needs a full SGX-style TEE. What's missing is a link to designated requirements. Abusing a nonce field doesn't seem to work, or if it does I can't figure out how. The MDM/MDA infrastructure would need to be able to include:
public key from SEP -> designated requirement of owning app binary
The macOS KeyStore infrastructure does track this which is why I thought it'd work. But the paper doesn't mention being able to get this data server side anywhere. Instead there's this nonce hack.
It's odd that the paper considers so many angles including things like RDMA over Thunderbolt, but not the binding between platform key and app key.
Reading the paper again carefully I get the feeling the author knows or believes something that isn't fully elaborated in the text. He recognizes that this linkage problem exists, proposes a solution and offers a security argument for it. I just can't understand the argument. It appears APNS plays a role (apple push notification service) and maybe this is where app binding happens but the author seems to assume a fluency in Apple infrastructure that I currently lack.
I can buy the idea that if you can have the MDM infrastructure attest the code signing identity through the designated requirements, that you can probably come pretty close, but I'm still not quite sure you get there with root on macOS (and I suspect that this is part of why DCAppAttest hasn't made it to macOS yet).
Certainly, it still doesn't get you there with their current implementation, as the attempts at blocking the debugger like PT_DENY_ATTACH are runtime syscalls, so you've still got a race window where you can attach. Maybe it gets you there with hardened runtime? I'd have to think a bit harder on that.
Yeah I didn't quite understand the need for PT_DENY_ATTACH. Hardened runtime apps that don't include get-task-allow are already protected from debugger attach from the start of the process, unless I misunderstood something.
I'm not quite sure why Apple haven't enabled DCAppAttest on macOS. From my understanding of the architecture, they have every piece needed. It's possible that they just don't trust the Mac platform enough to sign off on assertions about it, because it's a lot more open so it's harder to defend. And perhaps they feel the reputational risk isn't worth it, as people would generalize from a break of App Attest on macOS to App Attest on iOS where the money is. Hard to say.
They can never have supervised permissions on your device unless your Mac was wiped and registered in Apple Business Manager. As a BYOD device, it cannot be locked.
Whether it makes sense to enroll a Mac as BYOD is another question.
There are many people who do not have ready access to a million dollars to purchase said Mac minis, much less the operating capital to rack & operate them.
Very smart play to build a platform, get scale, and prove out the software. Then either add a small network fee (this could be on money movement on/off platform), add a higher tier of service for money, and/or just use the proof points to go get access to capital and become an operator in your own pool.
If those numbers are true, they could start with one Mac and double every few months. But I guess there are also many people who do not have ready access to whatever a Mac mini costs either...
Those tools aren't floating in the ether: someone has to go download them and run them in some way, automated or otherwise. I think the suggestion is to make that a step before publication, as the post suggests.
IMHO, this is one of the coolest aspects of Clojure: the data structures were designed to be immutable as efficiently as possible. This enables some of the most critical aspects of the language. If your whole program passes around immutable values, you never have to worry about which part of the code owns what.

Rust is such a different language, but the borrow checker is solving a nearly identical problem: who owns this memory and who is allowed to change it? In Rust, only one owner ever gets to write to a piece of memory, and you can move that ownership around. It catches lots of bugs and is extremely efficient, but it is definitely a difficult aspect of the language to master. Clojure just says everyone gets to read and share everything, because every mutation is a copy. One piece of your codebase can never stomp on or free memory that another part is using. Big tradeoffs with garbage collection, but geeze does it make reasoning about Clojure programs SO much easier.

And the cornerstone of that idea is fast, immutable data structures by default. Maps are implemented similarly, in that they are fast to make mutated copies of, with structural sharing to save memory.
It's surprising to me how well these go together: the transient concept in Clojure is essentially a &mut; couple that with a reference-count check and you get fast transient mutations with a cheap persistent clone.
All of Rust's persistent immutable data structure libraries, like im, make use of this for drastically more efficient operations without loss of capability or programming style.
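The same refcount trick is available in plain std, no im required. Here's a minimal sketch (my own toy example, not from any of these libraries) using Arc::make_mut, which mutates in place when the reference count is 1 and clones first when the value is shared:

```rust
use std::sync::Arc;

fn main() {
    // A "persistent" value: cloning the Arc only copies a pointer.
    let original: Arc<Vec<i32>> = Arc::new(vec![1, 2, 3]);
    let snapshot = Arc::clone(&original);

    // Transient-style mutation: make_mut checks the reference count.
    // The data is shared with `original` and `snapshot`, so it clones first...
    let mut working = Arc::clone(&original);
    Arc::make_mut(&mut working).push(4);

    // ...and the old snapshot is untouched, like a persistent structure.
    assert_eq!(*snapshot, vec![1, 2, 3]);
    assert_eq!(*working, vec![1, 2, 3, 4]);

    // Once `working` is the unique owner, make_mut mutates in place:
    // no copy at all, which is exactly the transient fast path.
    drop(original);
    drop(snapshot);
    Arc::make_mut(&mut working).push(5);
    assert_eq!(*working, vec![1, 2, 3, 4, 5]);
}
```

Libraries like im do this per trie node rather than for the whole collection, so a mutation only copies the handful of nodes on the path to the change.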
I used the same principle for my Rust re-imagining of datascript [1], and the borrow checker together with some merkle dag magic allows for some really interesting optimisations, like set operations with almost no additional overhead.
Which allows me to do stuff like not have insert be the primary way you get data into a database, you simply create database fragments and union them:
let herbert = ufoid();
let dune = ufoid();

let mut library = TribleSet::new();

library += entity! { &herbert @
    literature::firstname: "Frank",
    literature::lastname: "Herbert",
};

library += entity! { &dune @
    literature::title: "Dune",
    literature::author: &herbert,
    literature::quote: ws.put(
        "I must not fear. Fear is the mind-killer."
    ),
};

ws.commit(library, "import dune");
The entity! macro itself just creates a TribleSet and += is just union.
In Clojure this would have been too expensive, because you would have to copy the tries every time; without the borrow checker's proof of unique ownership, you can't safely reuse the trie nodes.
Most languages have some sort of persistent/immutable data library available. But that's not the same as having it built in! Otherwise you have to weave third-party types through the entire application and convert on behalf of libraries that aren't compatible. There is an enormous benefit to having these data structures as first-class citizens.
And thank you for highlighting the conceptual link to Rust's borrow checker. Clojure and Rust are far and away my two favorite languages and this largely articulates why: they both have language-level constructs to completely solve the data ownership problem. Not just "here's some tools to optionally solve it, good luck" but truly, fully solve it. Full stop.
Almost every other language leaves ownership enforcement up to the developer. Clojure and Rust take entirely different approaches to virtually everything, but they converge on this one point and it's a crucial one. Types, syntax, compilation models, garbage collection, etc. are all secondary concerns to managed data ownership IMO.
Rich Hickey, Alex Miller, Stu Holloway, Nathan Marz, and the other people in this video are all very impressive, intelligent people. But never discount how much of their success is due to consistent effort. They worked hard at something they cared about, with depth and longevity. Anyone can learn Clojure with a bit of skill and tenacity. Anyone can contribute to the community in a meaningful way. It takes effort though. Please join us and contribute back to the community: find something that scratches an itch for you, give it some polish, and then share it. Or join the Clojurians forums on Zulip. Come to Clojure/Conj, it was fantastic last year.
The only reason the deep state or anyone has any power is because most people don't care. If people cared, we could change. Modern politics is all about distracting everyone with some crazy as often as possible to keep shifting attention and basically disabling any progress.
The deep state has power because they will literally kill you if you don't comply, and that's not the worst option. The deep state will honeytrap, hack, blackmail, or otherwise destroy your life to get what "it" wants. People caring more isn't going to do anything if the Congressman doesn't want it known that he likes easy access to money and other illegal things.
I think the effect is actually backwards: there may only be 2 windows instead of 4, but the total amount of open-window time per year should theoretically go up significantly. Removing 2 of the reports should make both of those quarters less subject to insider trading and therefore more tradeable.
In companies I've been in, insider trading windows close because there's been a certain amount of time since the last report. So less frequent reports = more time for insider to know things that aren't public yet = more time unable to trade, not less.
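To make that concrete, here's a toy model with entirely made-up numbers (assume the window opens for a fixed number of days after each report and is closed the rest of the time, which matches the policy described above):

```rust
// Hypothetical blackout model: the trading window is open for `open_days`
// after each report, then closed until the next report lands.
// All numbers here are illustrative, not from any real policy.
fn open_days_per_year(reports_per_year: u32, open_days: u32) -> u32 {
    reports_per_year * open_days
}

fn main() {
    let quarterly = open_days_per_year(4, 30); // 4 reports, 30 open days each
    let semiannual = open_days_per_year(2, 30); // 2 reports, same window length

    // Under this model, fewer reports means less total open time,
    // unless each remaining window is explicitly made longer.
    assert_eq!(quarterly, 120);
    assert_eq!(semiannual, 60);
    println!("quarterly: {quarterly} open days/year, semiannual: {semiannual}");
}
```

So the two views only agree if companies respond to less frequent reporting by lengthening each window, which this policy, as described, wouldn't do.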