Assuming they’re telling the truth, they’ve successfully built one chip from that fab. That’s good, but it doesn’t mean the fab is capable of manufacturing at scale while turning a profit.
They need an external customer for the fab so they can iterate and work out the issues. It’s anyone’s guess whether someone will trust Intel to manufacture on their behalf instead of sticking with an established player. They’re stuck in a chicken-and-egg situation - they can’t reach high yields without a customer, but a customer only wants to sign up if the yields and future deliveries are guaranteed.
Intel’s only hope might be that someone, not naming names, coerces an established company to sign up.
Isn't the traditional solution to offer a really big rebate to the first customer?
Like 75% off for the first run of chips?
This is common in industry. You often do give a discount and guarantees to the first users of a system to compensate for the risk the customer is taking.
This is part of how DigitalOcean got going: Kingston gave a huge discount on a traditional HDD order if the order was switched to SSDs instead, because they wanted to kickstart scaled manufacturing. The first time an SSD was put in and the IOPS were measured, the product direction was clear. At the time we thought it might be a CDN, though, but we eventually landed on a "cloud hosting provider".
I think that's the industry's viewpoint as well. Intel's fabs' biggest customer was Intel. They're not doing well, so they're not fabbing as much, especially at the leading edge. It'll death spiral.
That's too pessimistic. In general, customers don't want to be dealing with a monopolist and foundry customers are no different. It's in everyone's interest to solve the unproven process problem, so if Intel has evidence that the process isn't bust, customers will find a product which can be used as a pipe cleaner for mutual benefit.
Especially companies like Nvidia, for which the gross profit margin is so high that the risk of losing TSMC is greater than the risk of losing money.
Apple is similarly paranoid about single-sourcing -- off the top of my head I'm not sure whether their top-end M-class chips are currently fabbed by both TSMC and Samsung, or just TSMC.
Because if there was only a single source (for example if the other one was out-competed), they'd have to pay 30% of their revenue for the privilege of being in the FabStore.
They're always the first to use TSMC's most advanced node; the designs are probably only compatible with that particular process. I haven't heard of Apple using Samsung for SoCs.
Apple used Samsung through the A7. Moved to TSMC for the A8.
> They need an external customer for the fab so they can iterate and work out the issues.
I guess you mean Intel would iterate using its own money to get the customer's chip right, no?
That customer could've been Apple, since they used to have a close relationship, until Intel shit the bed.
> Intel is effectively saying "Hey, we can make Arm chips!"
Makes sense since they were once popular in the NUC space and Apple has shown high-end ARM has a market.
If we assume that Intel succeeds with 18A for their x86 processors, would they even have the money to finance the node after that? And the node after that, which gets exponentially more expensive?
In the past, x86 raked in enough money to burn a lot of it on new fab tech, but non-x86 has grown immensely and floods TSMC with money. The problem for Intel is that their fab tech was fitted to their processor architecture and vice versa. It made sense in the past, but in the future it might not. For the processor business, it may be better to use TSMC for production. For the fab, it may be necessary to manufacture for many customers and take a premium for being based in a country in need. So a split-up may be inevitable, and fabbing a competitive ARM chip surely helps in attracting more customers - customers who may pay a premium for political and security reasons.
Apple, Nvidia, and the US government can provide the required funds if they have confidence in Intel's ability to deliver. These companies would benefit from breaking TSMC's current monopoly.
Samsung is already in a much better position for this. They have external customers and experience facilitating them - unlike Intel, whose track record doesn't inspire confidence at all.
Intel has something Samsung doesn't. It's a US company operating mostly on US soil, so the US government has a vested interest in keeping this strategic asset going for as long as possible.
> Apple, Nvidia and US govt can provide the required funds if they have confidence in its ability to deliver.
Given Apple's history with Intel's ability to deliver, I'm guessing the confidence there isn't high.
Are you referring to 5G radio modems or another chip?
Probably Intel’s fumble when Apple asked them for better performance per watt for the laptop CPUs and whether they wanted the iPhone CPU business back in 2006.
A more recent motivation might be Apple's switch to in-house ARM for MacOS for similar reasons.
Probably the Intel CPUs in MacBooks before Apple made the push for the M1 - circa the Intel quad-core era, when their laptop chips had major heat issues... ~2012 IIRC?
Amazon and Google probably as well?
Random question: where did the ARM core design come from?
Originally the MOS Technology 6502:
https://en.wikipedia.org/wiki/ARM_architecture_family#Histor...
It's an interesting article.
a bit of a stretch
Intel's first exposure was the purchase of DEC's StrongARM in the 90s, although that particular product line was later sold to Marvell.
Probably directly from Arm? https://www.intc.com/news-events/press-releases/detail/1614/...
Intel are believed to hold an Arm architectural license [1] as far as I know, they have made Arm-based things in the past.
[1]: https://en.wikipedia.org/wiki/ARM_architecture_family#Archit...
If they didn’t have one already, they would presumably have acquired one when they bought Altera - Altera had SoC FPGAs with ARM cores hooked up to an FPGA fabric.
They have since spun off Altera but I imagine they’d still have a license.
Why are only Apple and Nvidia left buying from foundries? Is the market for CPUs/GPUs that bad? Zero innovation and no other players, even in niche markets?
Have you heard of AMD? You know... the company with about 25% of CPU market share (at least in PCs) these days?
https://www.tomshardware.com/pc-components/cpus/amds-desktop...
They have 17% overall according to this chart which includes Apple.
https://www.accio.com/business/best-selling-cpus
Read up on this young startup, as I think they are going places!
Intel pays TSMC to produce some of their chips as well.
It should be RISC-V... who is in charge at Intel??
Is this related to the rumors of a SoftBank (ARM) money injection into Intel?
I don't think Intel plans to make a product, but to prove they can build a working chip that's not one of their own design. An ARM design carries fewer development risks than a RISC-V design and makes validation easier.
From the article:
Why is Intel manufacturing an Arm SoC as a reference platform? Probably because it's trying to attract external customers, and there's a whole lot more companies building Arm SoCs than there are firms pitching x86-64 processors.
They're not trying to build the next best thing. They're trying to attract customers.
>It should be RISC-V... who is in charge at Intel??
Why should it be that? What are your arguments?
Very unlikely to happen, but Intel could release an Arm chip with native x86 translation. Arm and AMD IP would be needed, but this would be the best chip for Windows.
I don't understand what the difference is between "an ARM chip with native x86 translation" and a dual-ISA x86 and ARM chip.
And I don't understand why you'd want a dual-ISA x86 and ARM chip rather than just an x86 chip. You wouldn't get whatever CPU front-end simplicity advantages there are from ARM, since your front-end would get significantly more complex and consume significantly more transistors than with a normal x86 chip. And I don't think there's a market of people who want ARM for compatibility reasons; any Windows software which supports ARM also supports x86.
What they could do is to release an ARM chip with a slightly extended ISA to add the select features which are difficult to emulate in software, such as loads and stores with the memory ordering guarantees x86 provides but ARM doesn't. Apple does this AFAIK, and it's one part of why Rosetta 2 is so good. But any ARM CPU maker could do this.
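To make the ordering point concrete, here's a minimal C++ sketch (purely illustrative; the producer/consumer names and structure are made up for the example) of the classic "message passing" litmus test. Plain x86 loads and stores already behave roughly like acquire/release, so AOT-translated code gets that ordering for free; mapped naively onto plain ARM loads and stores (modeled here with relaxed atomics), the same code is allowed to see the flag without the data:

    // Message-passing litmus test: relaxed atomics model plain ARM accesses.
    #include <atomic>
    #include <cassert>
    #include <thread>

    std::atomic<int> data{0};
    std::atomic<int> flag{0};

    // Two plain stores, no barrier between them.
    void producer() {
        data.store(1, std::memory_order_relaxed);
        flag.store(1, std::memory_order_relaxed);
    }

    // Two plain loads, no barrier between them.
    void consumer() {
        while (flag.load(std::memory_order_relaxed) == 0) { /* spin */ }
        int r = data.load(std::memory_order_relaxed);
        // Under x86's TSO, seeing flag == 1 implies data == 1.
        // Under ARM's weaker model, r == 0 is permitted, so this may fire.
        assert(r == 1);
    }

    int main() {
        std::thread t1(producer);
        std::thread t2(consumer);
        t1.join();
        t2.join();
        return 0;
    }

A software-only translator has to conservatively insert barriers (or use acquire/release accesses) around nearly every translated memory operation to rule out the r == 0 outcome, which is exactly the overhead a hardware TSO mode avoids.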
I wonder if ARM instructions could be translated to Intel’s uOps. Then everything except that translation could be shared. And, since a program consists almost entirely of one ISA's instructions, we could imagine the chip sticking to just one type of translation for the duration of a program run, rather than having to figure it out for each instruction.
I’m not saying I want this, but it might be surprisingly not totally impractical.
Fujitsu and Nvidia also implement (at least) TSO.
https://threedots.ovh/blog/2021/02/cpus-with-sequential-cons...
Denver does it because it was supposed to be an x86 CPU, but they couldn't get an agreement with Intel for patent licensing, so they pivoted into being the first available aarch64 CPU since decode was happening entirely in software.
> I don't understand what the difference is between "an ARM chip with native x86 translation" and a dual-ISA x86 and ARM chip.
Look at Apple's Rosetta 2 for an example. M-series Apple Silicon has special undocumented modes that mirror x86 architectural quirks that don't usually exist in ARM, in order to support AOT-translated machine code. The chip doesn't support x86 instructions, but it has the amenities to support x86 code. That could be what "native x86 translation" meant?
That's what I suggested in my comment's last paragraph. I don't think that counts as "an ARM chip with native x86 translation", but really the only person who can say whether that's what dlojudice meant is dlojudice.
And why wouldn’t Intel be capable of doing the same?
I never said that?