I wish I had the expertise to do such in-depth reverse engineering of firmware blobs.
The DCP is actually the thing that's stopping me from providing native brightness control on the HDMI port of the newer Macs inside Lunar (https://lunar.fyi). Users have to either switch to a Thunderbolt port to get native brightness control for their monitor, or use a software dimming solution like Gamma Table alteration.
It's not clear what's going on, but it seems the HDMI port of the 2018+ Macs uses an MCDP29xx converter chip that sits between the internal DisplayPort signal and the external HDMI connector, so that Apple only has to handle DP video signals natively instead of both HDMI and DP. (That is also why even the newest MacBook and Mac Studio have only HDMI 2.0: it's the most the converter chip supports. [0])
When sending DDC commands through the IOAVServiceWriteI2C call, monitors connected to the HDMI port lose the video signal, flicker, or crash completely and need a power cycle to come back.
The Thunderbolt ports, however, send the DDC command as expected when IOAVServiceWriteI2C is called.
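For reference, the write path looks roughly like this. It's only a sketch: IOAVServiceCreate and IOAVServiceWriteI2C are reverse-engineered private prototypes (the same ones Lunar and MonitorControl declare themselves, living in a private framework you have to link against), and the packet below is just the standard DDC/CI "Set VCP Feature" layout, so treat all of it as assumption rather than documented behaviour:

    // Sketch: DDC/CI "Set VCP Feature" (brightness) over I2C via the private
    // IOAVService API on Apple Silicon. Prototypes are reverse engineered,
    // not public SDK, and can break with any macOS update.
    #include <CoreFoundation/CoreFoundation.h>
    #include <IOKit/IOKitLib.h>

    typedef CFTypeRef IOAVServiceRef;
    extern IOAVServiceRef IOAVServiceCreate(CFAllocatorRef allocator);
    extern IOReturn IOAVServiceWriteI2C(IOAVServiceRef service, uint32_t chipAddress,
                                        uint32_t dataAddress, void *inputBuffer,
                                        uint32_t inputBufferSize);

    int main(void) {
        // Grabs the default external display's AV service (assumes one display).
        IOAVServiceRef avService = IOAVServiceCreate(kCFAllocatorDefault);
        if (!avService) return 1;

        // Standard DDC/CI Set VCP Feature packet: VCP code 0x10 = brightness.
        uint8_t brightness = 50;
        uint8_t packet[6] = { 0x84, 0x03, 0x10, 0x00, brightness, 0x00 };
        packet[5] = 0x6E ^ 0x51;                 // checksum seed: dest ^ source address
        for (int i = 0; i < 5; i++) packet[5] ^= packet[i];

        // 0x37 = DDC chip address, 0x51 = DDC host sub-address.
        IOReturn err = IOAVServiceWriteI2C(avService, 0x37, 0x51, packet, sizeof(packet));
        CFRelease(avService);
        return err == kIOReturnSuccess ? 0 : 1;
    }

On a display connected over Thunderbolt this works fine; the exact same call on the HDMI port is what triggers the flicker or signal loss.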
After @marcan42 from Asahi Linux pointed out [1] the DCPAVFamilyProxy kexts, I looked into them and found a few different writei2c methods and some MCDP29xx-specific code, but no clue on how to call them from userspace.
I guess I'll have to look into how the analysed exploit is using the RPC, and also check the methods' assembly inside the firmware blob itself. I was not aware that most userspace methods are now shims that remotely call the embedded code.
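In the meantime, a quick way to poke at what's there from userspace is just matching the proxy objects in the IORegistry. Rough sketch below; the "DCPAVServiceProxy" class name and the "Location" property are simply what shows up in ioreg on my machine, so take both as assumptions:

    // Sketch: list the DCP AV proxy services registered in the IORegistry.
    // Class/property names are assumptions based on ioreg output, not
    // documented interfaces. (Use kIOMasterPortDefault on pre-macOS 12 SDKs.)
    #include <stdio.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <IOKit/IOKitLib.h>

    int main(void) {
        io_iterator_t iter = IO_OBJECT_NULL;
        if (IOServiceGetMatchingServices(kIOMainPortDefault,
                                         IOServiceMatching("DCPAVServiceProxy"),
                                         &iter) != KERN_SUCCESS)
            return 1;

        io_service_t service;
        while ((service = IOIteratorNext(iter)) != IO_OBJECT_NULL) {
            io_name_t name;
            IORegistryEntryGetName(service, name);

            // "Location" seems to be "External" for the HDMI/USB-C outputs
            // and "Embedded" for the built-in panel.
            CFStringRef location = (CFStringRef)IORegistryEntryCreateCFProperty(
                service, CFSTR("Location"), kCFAllocatorDefault, 0);
            char loc[32] = "unknown";
            if (location) {
                CFStringGetCString(location, loc, sizeof(loc), kCFStringEncodingUTF8);
                CFRelease(location);
            }
            printf("%s (Location: %s)\n", name, loc);
            IOObjectRelease(service);
        }
        IOObjectRelease(iter);
        return 0;
    }

That still only shows what's registered, though; it doesn't get you an entry point to the writei2c methods themselves.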
Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.
In reality it costs the bad guys $299 to bypass this limitation, while your average user is locked out of this feature.
Installing an enterprise app requires the user to trust the cert (with a scary warning shown). Also this makes a better case for not allowing sideloading since the sandboxing isn't perfect but the app store review process makes it harder to sneak one by.
> In reality it costs the bad guys $299 to bypass this limitation
And enterprise certs aren't so easy as "just give Apple $299", try and get one then get back to me.
> Also this makes a better case for not allowing sideloading since the sandboxing isn't perfect but the app store review process makes it harder to sneak one by.
The point I'm trying to make is that Apple isn't consistent here. If they actually believed this to be true there would be no way to get a sideloading cert that worked on all devices.
"Sideloading is dangerous because users could install malware" and "We'll let any iPhone sideload your app if you jump through some hoops and pay us" are incompatible statements, but Apple makes both of them.
> And enterprise certs aren't so easy as "just give Apple $299", try and get one then get back to me.
Obviously not, but the linked post demonstrates attackers are more than capable of getting one.
But surely you see the difference between "revokable enterprise cert that requires a verification process to obtain" and "anyone can sideload"?
I don't see this as Apple making incompatible statements.
> I'm sure the fact this setup forces all apps to go through Apple's app store and pay them 30% of their revenue is just an unfortunate accident.
Does a normal enterprise charge its users to install internal apps? If not, this seems like an odd complaint.
> My point is that if Apple truly believed sideloading was a risk to user security there wouldn't be certificates that let you sideload on all devices. They would force companies to provide a list of devices or only allow devices enrolled in their MDM or some similar reasonable restriction.
That last part is effectively what they're doing: you have to go through the same level of scary prompt and authentication to install an enterprise certificate as you do to enroll for MDM. Yes, users can still be socially engineered into compromising their device, but that's an order of magnitude harder than just convincing someone to run some random binary.
Something can be simultaneously effective, and also fail at least once.
The thing is that when the system is effective at stopping the malware, you don't get a signal about it. You only see the ones that somehow made it through. So it's not easy to gauge its effectiveness.
To a layman? No difference at all. Remember that Apple claim to be the saviors of the ignorant.
> "Sideloading is dangerous because users could install malware" and "We'll let any iPhone sideload your app if you jump through some hoops and pay us" are incompatible statements, but Apple makes both of them.
I'm missing how these are incompatible statements. You might as well say "if Apple thought certain apps could be dangerous they wouldn't have an App Store or even allow their own first-party apps on iPhones." Of course the position of Apple is that they should be able to approve every app/vendor before it's allowed to be installed on iPhones. The fact that they do approve some apps and vendors isn't incompatible with their position.
> The point I'm trying to make is that Apple isn't consistent here. If they actually believed this to be true there would be no way to get a sideloading cert that worked on all devices.
Indeed. I'm (maybe) willing to sideload an app from my employer but not from some other company. Perhaps only managed devices should have this capability, and only for apps signed by the managing authority.
I won't allow my employer to manage my personal phone; they have to issue me a phone if they want that kind of control. And in that case it's their device; they are welcome to manage it as they see fit.
Also: this is different from TestFlight.
The warning looks like this: https://support.apple.com/library/content/dam/edam/applecare...
EDIT: Originally I asked: what does the scary warning look like? Is there a screenshot of an example? I would like to show this to my employees and family and tell them never to trust such a cert.
Oops, just saw your edit. Well I just spent the last 10 minutes or so documenting the flow so I'll post it anyway: https://imgur.com/a/ofvfty8
Thank you for documenting it!
At least it doesn't have an "Accept Anyway" type of button. The only option is to cancel whatever was being attempted.
Here is the full flow: https://imgur.com/a/ofvfty8
> but the app store review process makes it harder to sneak one by.
Imo the key question is: If you can find an exploit in the iOS sandbox, will the app store review really stop you? Compared to the expertise required to find such an exploit, it should be pretty trivial to obfuscate it or load the payload remotely after install.
> Imo the key question is: If you can find an exploit in the iOS sandbox, will the app store review really stop you?
The App Store review is two-pronged: it includes some human QA testing, plus automated screening of your executable, which checks for suspicious library imports or strings. (This sometimes trips up applications whose method names happen to match Apple's internal APIs.) While it isn't entirely impossible for a malicious application to slip past this examination, it's significantly harder than for an Enterprise application, which doesn't go through this process at all.
App Store Review's automated scanning is quite trivial to get around; most iOS developers can either attest to doing so themselves or know someone who has.
> And enterprise certs aren't so easy as "just give Apple $299", try and get one then get back to me.
I believe you, but... so how does a "commercial spyware" company get one, and has it been cancelled by Apple after they see what they are up to?
Or do their not-so-easy requirements... allow "commercial spyware" companies to fraudulently impersonate other apps with the cert?
You can buy enterprise certs from other companies; that's how stores like jailbreaks.fun (safe) or AppValley or AppCake (very questionable) get them.
There's malware on the AppStore. The review process is completely incapable of catching it.
Well, people are being trained to install this kind of thing due to MDM.
I couldn't agree with you more. I certainly see some very valid arguments for why Apple should allow sideloading on iPhones, but I am baffled by this extremely common argument that goes "here's an example of Apple not being restrictive enough to protect people, therefore Apple shouldn't be restrictive at all."
You're observing 'motivated reasoning', where someone started with the result (Apple is wrong to disallow side-loading), and came up with a way to support it.
It's next to impossible to get the clearance for an Apple Developer Enterprise account unless you know someone at Apple. It's necessary to have an Enterprise account to sign MDM certificates, so I've had an application open for over six months without hearing from them, and the first application was rejected after 10 months without any dialogue.
With this article shining yet more negative light on the program, after the Facebook/Google debacle of spying on kids' internet access effectively shut the Enterprise program down, the MDM space will be even harder to innovate in, considering no startup will ever meet the bar required to sign up for an Enterprise account.
> Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.
While I wouldn't say that's exactly wrong, if this type of thing happened often I would think that we wouldn't see Google writing this up as interesting. Doesn't seeing this mean it is a rare and noteworthy event and more evidence that iOS's limitations on sideloading do improve security? I'm not sure how often this happens, so I could be way off.
> Linking this next time somebody tries to tell me iOS's limitations on sideloading improve security.
This IS a case of using an enterprise certificate to sideload an app.
If anything, this proves that sideloading, even with scary warnings to the users, enables malware.
> In reality it costs the bad guys $299 to bypass this limitation
They also had to get verified as a mid-sized or greater corporation (Apple does use real verification partners), and further, Apple can pull the rug on the certificate in an instant, immediately invalidating that $299 certificate and the considerable effort that went into getting it.
In reality this group likely had to hack an existing Apple Enterprise-approved business first, then use that as a springboard to the next step.
Casually dismissing that enormous gate is pretty tenuous.
I don't have any inside details on this case, but I highly suspect the signing certificate was stolen from a legitimate user.
Organizations sophisticated enough to build something like this already target companies like device manufacturers to get kernel driver signing certificates on Windows.
Considering how much Apple hardware costs, I find it difficult to believe that the average Apple user wouldn't be able to scrape together $299. That's about half the price of a budget iPhone in my area.
The current-gen iPhone SE (i.e. what you'd likely buy if you wanted a budget iPhone) costs $429 here in the US. Additionally, you often can get an even better deal if you're willing to sign a contract with a carrier.
Even if I take your prices at face value, I'm not sure "Apple charges 1/2 the price of the phone to unlock sideloading" to be a killer argument.
We aren't talking about the average user - this is the licence for a medium-sized enterprise to write/develop their own apps.
You need to be a verified business to get this.
The exploit is quite complicated to pull together. Is there any chance that someone created it based on iOS sources? I assume NSO and similar actors would already have bought stolen source code.
Be a competent reverse engineer and you have the sources to everything, without needing to steal them.
The people that build these things are just that good.
You don't need stolen source, just knowledge.
Grayshift, a company that makes devices that unlock iPhones for law enforcement, was started by two ex-Apple security engineers.
> This sideloading works because the app is signed with an enterprise certificate, which can be purchased for $299 via the Apple Enterprise developer program.
From the linked post, the actor is identified as a "commercial spyware" company.
So... I'd like to assume that Apple has cancelled their enterprise cert and will refuse to sell them another after this abuse, right? Surely there are terms of service that forbid using an enterprise cert maliciously, to fraudulently pretend to be another app and trick users that are not part of your "enterprise"?
Right? (cue Anakin and Padme meme).
But seriously... will they?
They probably will cancel it. Seeing as they cancelled Google's certificate at one point.
if that's so "easy" for enterprise to get side loading to work, why eg. epic games won't go that route to provide apps outside app store? am i missing something?
Apple has a number of requirements that enterprises must meet [1] in order to be eligible for an enterprise distribution certificate. Apple can revoke the certificate at any time if they discover misuse, which would quickly become obvious if a large company such as Epic Games started publicly distributing their games this way.
Apple revokes enterprise certs that it discovers being used to distribute apps to users outside of said enterprise.
What's interesting to me is that, on its face, Apple's architecture was the right thing from the perspective of modern security thought: split out "driver" layers that can be driven (in a reasonably direct way) by untrusted code and put them on dedicated hardware. That way you're insulated from the Spectre/Meltdown et al. family of information leaks due to all the cached state on modern devices.
Except the software architecture needed to make this happen turns out to be so complicated that it effectively opens up new holes anyway.
(Also: worth noting that this is a rare example of an "Inverse Conway's Law" effect. A single design from a single organization had complicated internal interconnectivity, owing to the fact that it grew out of an environment with free internal communication. So an attempt to split it up turned into a mess. Someone should have come in and split the teams properly and written an interface spec.)
Are there user clients exposed for those kexts?