By Max A. Cherney
(Reuters) – Artificial intelligence chip startup SambaNova Systems announced a new semiconductor on Tuesday, designed to allow its customers to use higher quality AI models at a lower overall cost.
The SN40L chip is designed to run AI models more than twice the size of the one the advanced version of OpenAI’s ChatGPT is said to use, the Palo Alto, California-based company said.
“SN40L is specifically built for large language models running enterprise applications,” SambaNova CEO Rodrigo Liang said. “We’ve built a full stack that has allowed us to really understand the enterprise use case really well.”
Big businesses that are looking to deploy AI in novel ways face a different and more complex set of considerations than consumer software like ChatGPT, Liang said.
Security, accuracy and privacy are all areas in which AI technology must be designed differently to be useful for enterprise customers.
Nvidia dominates the market for AI chips, but a surge in demand triggered by interest in generative AI software made the coveted chips difficult for some companies to obtain. Intel, Advanced Micro Devices and startups like SambaNova have moved to fill the void.
The new SambaNova chip is capable of powering a 5 trillion parameter model and includes two advanced forms of memory. Memory can sometimes be a bottleneck to crunching AI data. The company said that its combination of hardware enables customers to run larger AI models without sacrificing accuracy for size.
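The 5 trillion parameter figure helps illustrate why memory becomes a bottleneck: merely storing the weights of a model that size far exceeds the capacity of any single accelerator. A rough back-of-envelope sketch (the function name and precision choices are illustrative, not from the article or from SambaNova):

```python
# Back-of-envelope estimate of memory needed just to hold the weights
# of a 5-trillion-parameter model at common numeric precisions.
# Illustrative arithmetic only; SN40L's actual memory configuration
# is not detailed in the article.

def weight_memory_gb(num_params: float, bytes_per_param: int) -> float:
    """Gigabytes required to store the model weights alone."""
    return num_params * bytes_per_param / 1e9

PARAMS = 5e12  # 5 trillion parameters, per SambaNova's claim

for label, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    print(f"{label}: {weight_memory_gb(PARAMS, nbytes):,.0f} GB")
```

Even at 8-bit precision this is several terabytes of weights, before counting activations or the key-value cache used during inference, which is why chip makers pair fast on-chip memory with larger external memory tiers.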
Taiwan Semiconductor Manufacturing Company manufactures the chip for SambaNova.
(This story has been corrected to fix the spelling of SambaNova in paragraph 7)