They said they implemented x86 ISA memory-handling instructions, which substantially sped up the emulation. I don't remember exactly which ones now, but they explained all of this in a WWDC video about the emulation.
Not instructions per se. Rosetta is a software-based binary translator, and one of the most expensive parts of translating x86 to ARM is having to make sure all load/store instructions follow x86's strict memory ordering. To relieve this pressure, Apple implemented the Total Store Ordering (TSO) feature in hardware, which makes all ARM load and store instructions transparently follow the same memory-ordering rules as x86.
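To make the ordering difference concrete, here's a minimal C++ sketch of the classic message-passing pattern (standard C++ atomics, not Rosetta's actual output). Under x86's TSO, plain stores and loads already give this ordering for free; under ARM's weaker model, a translator has to emit acquire/release instructions or barriers for essentially every memory access, unless the hardware provides a TSO mode, which is exactly what Apple added:

    #include <atomic>
    #include <cassert>
    #include <thread>

    std::atomic<int> data{0};
    std::atomic<int> ready{0};

    void producer() {
        data.store(42, std::memory_order_relaxed);
        // x86/TSO: a plain MOV store is already ordered after the store above.
        // ARM without TSO: needs a release store (stlr) or a barrier here.
        ready.store(1, std::memory_order_release);
    }

    void consumer() {
        // x86/TSO: a plain MOV load suffices.
        // ARM without TSO: needs an acquire load (ldar) or a barrier.
        while (ready.load(std::memory_order_acquire) == 0) { /* spin */ }
        assert(data.load(std::memory_order_relaxed) == 42); // always holds
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
    }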
Intel's customers (notably IBM) required a second-source supplier, so AMD provided that for Intel in the beginning. Later on, AMD created the 64-bit x86 instruction set (AMD64), which Intel adopted from AMD, so now both share the same ISA.
Customer needs don't really matter in cases where a monopolist (ab)uses the law to kill competition. That's the MAIN reason monopolies are problematic.
The "required" in that sentence should be read strictly: some customers, mainly governmental ones, wouldn't have bought Intel chips in the first place without access to alternative suppliers (AMD, and previously VIA). Intel had to give in.
Neither company was what it is now back then. Intel needed a second supplier for its chips because nobody trusted manufacturing from a single-source provider.
Their math is incorrect.
The M1 scores about 300K in CoreMark and has 8 cores => 37,500 per core. The M1 consumes about 12 W => 1.5 W/core, and thus: 37,500 ÷ 1.5 W/core = 25,000 CoreMarks per watt.
But this company (or Andy Huang) is claiming the M1 has 100 CoreMarks per watt!
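A quick sanity check of that arithmetic (the 300K score and 12 W are the rough estimates from above, not measured values):

    #include <cstdio>

    int main() {
        // Rough figures quoted above; treat them as estimates.
        const double total_score = 300000.0; // whole-chip CoreMark score
        const double cores       = 8.0;
        const double total_watts = 12.0;     // package power

        double per_core_score = total_score / cores;             // 37,500
        double per_core_watts = total_watts / cores;             // 1.5 W
        double score_per_watt = per_core_score / per_core_watts;

        // The per-core split cancels out: 300,000 / 12 = 25,000.
        std::printf("%.0f CoreMarks per watt\n", score_per_watt);
    }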
Why don’t people check their math if it sounds far off?
I saw multiple people (outside of HN) come up with similar numbers for the M1. The claimed numbers are impressive (2x-3x more efficient than the M1), but the extreme dishonesty makes the design look like snake oil. If you're ahead, why lie? Probably because the design isn't actually as good as the press release implies.
And this makes sense. They are counting the amount paid per 1,000 plays. If most Spotify users are on the free tier, the average will be lower. Also, Spotify has more plays than Apple Music, so the final amount paid out might be higher.
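A tiny sketch of that averaging effect, with made-up rates and play counts (not actual Spotify figures): free-tier plays pay out less than premium plays, so a platform with a big free tier shows a lower blended rate per 1,000 plays even if it pays premium plays just as well.

    #include <cstdio>

    int main() {
        // Hypothetical per-play rates and volumes, for illustration only.
        const double premium_rate  = 0.006;    // $ per premium play (assumed)
        const double free_rate     = 0.002;    // $ per ad-supported play (assumed)
        const double premium_plays = 1000000;
        const double free_plays    = 3000000;

        double total_paid  = premium_rate * premium_plays + free_rate * free_plays;
        double total_plays = premium_plays + free_plays;

        // Blended payout per 1,000 plays: $3.00 here vs $6.00 if every play
        // were premium -- the free tier drags the average down.
        std::printf("$%.2f per 1000 plays\n", 1000.0 * total_paid / total_plays);
    }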
This makes me wonder whether $ per play is the right metric, versus $ per artist per platform. If you get paid less per play on Spotify but get many more plays there, would that still be a better deal?
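With made-up numbers (again, hypothetical rates and counts, not real platform data), a lower per-play rate can still win on total payout:

    #include <cstdio>

    int main() {
        // Hypothetical figures for two platforms, for illustration only.
        const double spotify_rate  = 0.003;   // $ per play (assumed)
        const double spotify_plays = 2000000;
        const double apple_rate    = 0.007;   // $ per play (assumed)
        const double apple_plays   = 500000;

        // Total payout per artist per platform is rate * plays.
        std::printf("Spotify: $%.0f\n", spotify_rate * spotify_plays); // $6000
        std::printf("Apple:   $%.0f\n", apple_rate * apple_plays);     // $3500
    }

So per-artist totals, not per-play rates, may be the more meaningful comparison.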