Intel's problem with new architectures has always been buy-in. Apple has repeatedly been able to swap underlying architectures because they also build compatibility layers, go all in, and have an adapt-or-die culture around applications. Yes, some people get burned by old apps getting deprecated, but for most it means devs are encouraged (forced) to keep their stuff updated. Look how fast and smooth the M1 transition was.
If Intel rolls something out with zero x86 compatibility and just hopes people will write for it, it'll never happen; it's chicken and egg. And if the new hotness doesn't provide enough benefit to justify the cost of migrating, it'll flop too.
But if it's well supported, the benefits are big, and there's an easy migration/upgrade/conversion path, people will adopt it.
AWS got its own chips into machines because the switch was largely transparent to end users, cheaper, and delivered real benefits.