IBM and AMD to work on quantum-centric supercomputing

(newsroom.ibm.com)

56 points | by donutloop 2 days ago

31 comments

  • cs702 2 days ago

    > Today, IBM (NYSE: IBM) and AMD (NASDAQ: AMD) announced plans to develop next-generation computing architectures based on the combination of quantum computers and high-performance computing, known as quantum-centric supercomputing. AMD and IBM are collaborating to develop scalable, open-source platforms that could redefine the future of computing, leveraging IBM's leadership in developing the world's most performant quantum computers and software, and AMD's leadership in high-performance computing and AI accelerators.

    Translation from corporate-speak: "Maybe we can chip away at Nvidia's dominance by working together and promising something Nvidia can't offer?"

    As I understand things, firing up a bunch of GPUs is still more cost-effective than any quantum computer available right now.

    Nonetheless, I wish IBM and AMD lots of success. It would be nice if Nvidia gets real competition!

    • esseph 2 days ago

      I haven't heard about Nvidia working on quantum anything?

      This seems very different, more of a leap forward but in a different direction.

      Note: Start moving to quantum-resistant algorithms now.

      • cs702 2 days ago

        The announcement is meant to make corporate buyers believe that IBM's quantum magic + AMD's chips could somehow leapfrog Nvidia's GPUs, hoping to slow down its sales.

        I never wrote Nvidia is working on quantum anything.

        • esseph 2 days ago

          It sounds like they are intended to be products used to solve different problems but in a similar space. At least, if IBM is trying to parlay their quantum advances. Different customers?

  • jzelinskie 2 days ago

    After reading about the recommendation system breakthrough[1], I'm more curious about just how much we're leaving on the table with classical algorithms. If you raised the amount of money being funneled into quantum computing and spent it purely funding classical algorithm research, would you be better off?

    [1]: https://www.quantamagazine.org/teenager-finds-classical-alte...

    • mrbungie 2 days ago

      Shhh, big tech needs a new dragon to chase if GenAI stops being shiny.

      Non-cynic take: exploration-based endeavors still may end up in useful developments.

  • Jlagreen 2 days ago

    So, it's basically trying to catch up again: https://www.nvidia.com/en-us/solutions/quantum-computing/

    Or some interesting news here: https://nvidianews.nvidia.com/news/nvidia-powers-worlds-larg...

    At least they aren't waiting until CUDA Quantum becomes as large as CUDA for GPUs.

  • cjs_ac 2 days ago

    > YORKTOWN HEIGHTS, N.Y. and AUSTIN, Texas, Aug. 26, 2025 /PRNewswire/ -- Today, IBM (NYSE: IBM) and AMD (NASDAQ: AMD) announced plans to develop next-generation computing architectures based on the combination of quantum computers and high-performance computing, known as quantum-centric supercomputing. AMD and IBM are collaborating to develop scalable, open-source platforms that could redefine the future of computing, leveraging IBM's leadership in developing the world's most performant quantum computers and software, and AMD's leadership in high-performance computing and AI accelerators.

    Has anyone found a real-world problem that's best solved by a quantum computer that isn't cryptography? I exclude cryptography because if the only thing these machines are good for is breaking ciphers, then governments won't let anyone else buy one, will they?

    • dijit 2 days ago

      Materials Science and Drug Discovery would suddenly become a lot easier, along with financial modelling (of our entire society possibly) and logistics/supply chains.

      They would also be much better at training ML and doing pattern recognition.

      Basically anything that requires a massively parallel computation on undeterminable states that are only clear in hindsight. They’re really important, actually, and it’s only an unfortunate side-effect that the same solution breaks all our cryptography.

      (of course: the offensive wings of our defence ministries really enjoy that side-effect)

      • kevinventullo 2 days ago

        > Basically anything that requires a massively parallel computation on undeterminable states that are only clear in hindsight.

        From https://scottaaronson.blog/ :

        “If you take nothing else from this blog: quantum computers won't solve hard problems instantly by just trying all solutions in parallel.”

      • atq2119 2 days ago

        > Basically anything that requires a massively parallel computation on undeterminable states that are only clear in hindsight.

        If only. This description makes it sound as if quantum computers could help efficiently solve all problems in NP, which is not believed to be true.

        Those "undeterminable" states need some non-trivial algebraic structure so that destructive interference of states can do its magic in a quantum computer. Finding such a structure is incredibly difficult, if it exists at all.
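        The interference point above can be illustrated with a toy amplitude simulation in plain NumPy (not real hardware, and not any particular quantum SDK): applying a Hadamard gate to |0> creates an equal superposition, and applying it again returns exactly |0>, because the two paths into |1> carry opposite-sign amplitudes and cancel. Quantum speedups come from arranging this kind of cancellation, not from "trying everything in parallel".

        ```python
        import numpy as np

        # Hadamard gate: sends |0> into an equal superposition of |0> and |1>
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        ket0 = np.array([1.0, 0.0])   # amplitude vector for the |0> state

        superposed = H @ ket0         # amplitudes (1/sqrt(2), 1/sqrt(2))
        back = H @ superposed         # apply H a second time

        # The two contributions to |1> are +1/2 and -1/2, so they cancel
        # (destructive interference); the |0> contributions add to 1.
        print(np.round(back, 10))     # prints [1. 0.]
        ```

        The whole art of quantum algorithm design is finding problems whose structure lets wrong answers cancel like this while right answers reinforce; for most problems, no such structure is known.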

      • mcmcmc 2 days ago

        Better financial modeling? Oh boy, who’s ready for quantum dynamic pricing to really squeeze your wallet to the max

  • orionblastar 2 days ago

    This reminds me of when IBM, Motorola, and Apple did PowerPC.

    • beeflet 2 days ago

      IBM is still doing POWER. I have one running on my desk.

  • heisgone 2 days ago

    Is IBM still a serious player? It seems to me they are living off legacy software. Is their quantum computer stuff promising?

    • wmf 2 days ago

      IBM really is a leader in quantum computing but it's not yet clear whether that field will ever have real-world value.

      • gentooflux 2 days ago

        And if it does, that value will likely fall somewhere between 0 and 1.

        • bee_rider 2 days ago

          I think it’ll be exactly one, or exactly zero. That’s the quantum part, right?

    • zipy124 2 days ago

      IBM is still a serious player in research, just not in actual output. They make a substantial sum from IP licensing/consulting to the more modern players.

      They do for example a large amount of foundational silicon research that other fabs can license.

      Industry analysts like Dr. Ian Cutress are good for keeping tabs on this space; see, for example, his coverage of IBM's 2nm wafers: https://youtu.be/PpBagorVtC0?si=CB8-2e3KbFgZQqFH .

      Or his interview with the IBM research VP of hybrid cloud. https://youtu.be/Dl6oGdLhCvk?si=RZ8KLUAqSRWcFHDJ .

    • staringback 2 days ago

      What do you mean? IBM is the world leader of AI development thanks to their cutting edge IBM Watson

      • stirfish 2 days ago

        I can't tell if you're being serious. Watson is... fine. You can try out the granite models for yourself.

    • esseph 2 days ago

      They are among the best in the world, and they also have a very strong presence in Operational Technology industries like manufacturing.

  • ranger_danger 2 days ago

    Is that the world's largest heatsink? Either way it's beautiful.

  • guerrilla 2 days ago

    I feel like this is just another publicity stunt.

    • xfactorial 2 days ago

      From what I know, IBM R&D is truly a marvel in terms of innovation, but one thing is “we managed to create a process to make xyz” and another is, indeed, putting it together at scale.

      5 nanometer was something they worked on, but it was TSMC who actually made it happen.

      Perhaps this is a good chance to put some of that research to work via AMD's manufacturing.

      Let’s see how it goes.

    • pryelluw 2 days ago

      From IBM?! No …

  • Towaway69 2 days ago

    Are they combining forces in an attempt to overtake Nvidia in the AI space?

  • solardev 2 days ago

    Why does the future of computing sound like we're back in 1999?

    Who's next... Intel and SGI? Rockchip and Cyrix? Nvidia must be positively trembling...

    • Mistletoe 2 days ago

      Wait until you see my LLM crunched on my Voodoo card.

      • solardev 2 days ago

        I mean, technically Nvidia did buy 3dfx... so you're not far off :)
