Image: Intel

Some people are starting to think that Intel’s dGPU group might be getting the axe.

Per a recent article shared by analyst Jon Peddie, Intel’s venture into discrete GPUs and graphics cards is estimated to have already cost the company $3.5 billion, or possibly more. Peddie pointed out that Intel started reporting on its dGPU group (known as AXG, for accelerated graphics) back in Q1 2021, effectively confirming “staggering” losses of $2.1 billion, but according to the analyst behind Jon Peddie Research, the company has “actually invested more than that.”

Peddie seems convinced that Intel CEO Pat Gelsinger is being tempted to kill the dGPU division, as Gelsinger is an executive who isn’t afraid to strip away “non-essential business units” (e.g., Optane) to cut operating costs and losses.

Should Intel dump its AXG group? Probably. The company started the project six years ago. Since then, AMD and Nvidia have brought out three generations of new and stunningly powerful dGPUs, with more in the pipeline. Four new companies have started up in China, and two more have been announced in the US. Intel now faces a much stronger AMD and Nvidia, plus six start-ups; the rules of engagement have dramatically changed while Intel sank money into projects it can’t seem to get off the ground.

Not many CEOs would put up with that, especially while repairing their company from previous misguided investments. Gelsinger was brought in to clean things up and get back to the company’s core strengths. The dGPU program is noble in its concept, intriguing in its alleged design, and an adventure too great for even Intel, especially in these days of recovery.

Peddie also mentioned that Intel has had “little to show” since the inception of its dGPU group. That seems like an exaggeration given the recent release of Intel’s first desktop graphics cards in select markets, but many would agree that the rollout of Arc could have gone far better. It’s still unclear when certain Arc products might hit the U.S. market.

The analyst recommends that the best thing Intel could do at this point would be to sell its dGPU group off, possibly “dressed up as a strategic move,” to distance itself from its “embarrassment.”

Source: JPR




  1. Most pundits seem torn.

    Half of them see Intel ready to punt on this. It's been a massive money sink, they aren't ready for market, and they haven't found the light at the end of the tunnel. They just canned Optane, they have been selling side projects off left and right, and they really have been trying to focus on their core business -- processors and their foundries. They have made progress, but haven't exactly righted the ship in their core business, so anything that isn't directly related to getting that back on track needs to go. The fact that they have to send products to outside foundries for manufacture is a huge black eye, especially since process node advantage was one of the key factors in Intel's dominance of their other market segments for so long.

    The other half see it as something Intel can't afford to pass up. Intel missed the boat on ARM and on small, energy-efficient processors (phones, IoT, etc.). Almost everything points to the extremely parallel nature of graphics processors as being key to ever-expanding data centers, AI applications, crypto, and other emerging markets. Intel needs a product like this to stay competitive and relevant on the 10-year horizon. It may be bleeding money now, but to give up on it means relegating the company to irrelevance inside of the next decade. A shrinking desktop/laptop processor market (apart from the temporary uptick due to COVID) and a more competitive AMD in the data center space only add fuel to the need to diversify; the fact that their main rival (AMD) already has a foothold in this space even more so.

    I guess it's a matter of which way the Board and Shareholders feel about it.

    Personally, speaking as a non-direct shareholder (I probably own some fraction of a share through an index fund somewhere if I dug, but I don't consider myself a shareholder) -- and as a largely ignorant armchair quarterback: I think Intel should spin the foundry off, focus on chip design, and invest significantly in the dGPU effort. I also think Raja is a poison to that division. The focus shouldn't be on gaming; that is how ATI/AMD and Nvidia happened to get there, but the gaming market isn't the growth driver. Data centers, AI, crypto (I want to gag typing that, but I can't deny the impact when those bubbles hit), HPC, and edge computing (cars, etc.) are the explosive growth markets; gaming is just a glitzy, high-profile niche. The gaming push is serving as an unnecessary distraction. I'm not saying they need to can a gaming-oriented SKU, but it should not be the focus of the development efforts, and most of your news cycle should not be eaten up with how crappy your product is compared to other gaming products.
  2. Honestly, if they make an amazing card for virtualization so I can run 50 guests or more on a card and give them SOME form of 3D acceleration for the OS, then I am happy. AND doing that well will of course lead into gaming naturally. The two coexist. If you keep it pure as some sort of enhanced math co-processor, that's a different story.

    It isn't really the GPU part that everyone loves about these devices. It's the programmable silicon: hundreds or thousands of processors you can effectively write generic code for, to do whatever the heck you want. That's the big reason you can mine with them, why you can use them to scrub sound or video, and why you can use them for AI-based tasks, be that iteration or, I dunno, something else. (I need to bone up on AI, it seems.)

    So yes, we want competition, and YES, this is something Intel should be able to do well. But they are not really making this for US, the enthusiasts. They are making this to have a design that can be used for MANY roles without having to create specific net-new parts. In that light, a cost of a few billion is tolerable; as long as the COMPANY isn't in the red, they should keep up the work. Because in the end, the entity in control of the best programmable silicon WILL be the winner.

    Now if they REALLY want to win... make tools for us IT-based enthusiasts to slice up our OWN GPUs to do whatever we want with them in VMs. They won't... the enterprise licensing dollars are TOO lucrative.

    You want to know what I think will happen? Here it is. Intel will continue to flounder, and AMD and Nvidia will rule the roost -- right until Microsoft comes in and buys Intel to compete directly with Apple making their own chips. Because Intel doesn't have the resources, is stretched thin, and has a VAST array of underutilized expertise and facilities. And the government won't block it, because MS will just be making themselves a more complete entity in comparison to Apple and Amazon. (Don't forget Amazon has their own CPU line, and you can bet they are getting into programmable silicon.)

    So yeah, right now Intel is doing some things that look odd, but they NEED to make the gamble on programmable silicon. They are missing out, and all of their direct competitors have it. (Well, maybe not Apple... I don't know enough about their M series of chips to comment.)
  3. I have just seen their PR interview on PCWorld about their GPUs; they seem to be confident but humble about what needs to be done and have already admitted to making some mistakes. From the sounds of it, the launch seems imminent, with their biggest issue at the moment being drivers and compatibility with DX9 and DX11 games, which is going to be a long and ongoing process for them to catch up on.

    They also seem to be going for performance for the price. I still think they can launch something useful and somewhat successful, but it might be aimed more toward integrators than the DIY market at the moment; this could change with future chips.

    I think GPUs are too important in certain areas for them to abandon. I can see them working toward professional machines where getting a full Intel machine can even get you better performance, but time will tell.
  4. They can't be me-too on value and expect success. They should aim for razor-thin margins and give much better value, for many years. AMD has been doing this with consoles, for example, and they are still shaky in that sector; it's hard in the world of computing. As far as AMD goes with dGPUs, I think they realize they simply can't beat Nvidia and have decided to be content with the crumbs Nvidia leaves, hence why they don't price razor thin. But being third in line, if you price at a me-too level, you get crumbs of crumbs, and it will never be enough. Intel should plan for breakeven at best for years and grow the tech division that way. It could be important and huge, but it would take years.
  5. I could have saved them a ton of money by letting them know sh1t was gonna get f*cked as soon as they hired Raja. Hiring him was basically setting all their money on fire, throwing it in the toilet, then flushing it down.
  6. And yet AMD is doing just fine with the GPUs that were developed under Raja. He is no Jim Keller but seems somewhat capable nonetheless.
  7. I don’t know that I would make that claim.

    Raja’s baby was a high clocked Vega with HBM.

    It made for a great integrated graphics chip, eventually. It was a horrible discrete chip, and AMD spent an entire generation with only Polaris and no higher-end cards because Vega was so poorly imagined and utilized.
  8. Companies work on multiple generations at once; current cards are likely still based on designs from his time at the company, even if it would be hard to prove how much of that is still his vision.
