Chip startup Taalas raises $169 million to help build AI chips to take on Nvidia
By Max A. Cherney
SAN FRANCISCO, Feb 19 (Reuters) – Toronto-based chip startup Taalas said on Thursday it had raised $169 million and has developed a chip capable of running artificial intelligence applications faster and more cheaply than conventional approaches.
Taalas has raised a total of $219 million from investors such as Quiet Capital, Fidelity and Pierre Lamond, a chip industry venture capitalist.
Taalas’ announcement comes weeks after Nvidia’s $20 billion deal to license intellectual property from chip startup Groq, which reignited interest in a crop of startups and technologies used to perform specific elements of AI inference, the process by which an AI model, such as the one powering OpenAI’s ChatGPT, responds to user queries.
Taalas’ approach to chip design involves printing portions of an AI model directly onto silicon, effectively producing a custom chip tailored to a specific model, such as a small version of Meta’s Llama. The customized silicon is paired with large amounts of speedy but costly on-chip memory called static random-access memory (SRAM), an approach similar to Groq’s design.
It is the bespoke design for each model that gives the Taalas chip its advantage, CEO Ljubisa Bajic told Reuters in an interview.
“This hardwiring is partly what gives us the speed,” he said.
The startup assembles a nearly complete chip, which has roughly 100 layers, and then performs the final customization on two of the metal layers, Bajic said. It takes TSMC, which Taalas uses for manufacturing, about two months to complete fabrication of a chip customized for a particular model, he said.
By comparison, it takes roughly six months to fabricate an AI processor such as Nvidia’s Blackwell.
Taalas said it can now produce chips capable of running less sophisticated models, and it plans to build a processor that can deploy a cutting-edge model, such as GPT-5.2, by the end of this year.
Groq’s first generation of processor used an SRAM-heavy approach to its chip design, as does another startup, Cerebras, which signed a cloud computing deal with OpenAI in January. Startup d-Matrix also uses a similar design.
(Reporting by Max A. Cherney in San Francisco; Editing by Edward Tobin and Lisa Shumaker)
