In the future, ChatGPT could run on chips of OpenAI's own design. According to Reuters, OpenAI is exploring the possibility of making its own artificial-intelligence chips and is even considering an acquisition.
Beyond finally easing its shortage of GPUs, using its own chips could let OpenAI better manage the cost of running the service. According to an analysis by Stacy Rasgon of Bernstein Research, each ChatGPT query costs the company roughly 4 cents (about 1 CZK). The service gained over 100 million users in its first two months, which translates to millions of queries per day (although it lost users for the first time in July). Rasgon estimated that if ChatGPT handled even a tenth of Google's query volume, OpenAI would need $48 billion worth of GPUs up front and would spend roughly $16 billion a year on chips going forward.
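To see why those numbers land in the tens of billions, here is a rough back-of-the-envelope sketch. The per-query cost is Rasgon's figure; the Google query volume (~8.5 billion searches per day) and the one-tenth share are assumptions added here purely for illustration, not figures from the article or from Bernstein's model.

```python
# Illustrative back-of-the-envelope estimate of ChatGPT's compute bill.
# Only the per-query cost comes from the article; the other inputs are
# assumptions made for this sketch.

COST_PER_QUERY_USD = 0.04          # Bernstein Research estimate cited above
GOOGLE_QUERIES_PER_DAY = 8.5e9     # assumed figure, not from the article
CHATGPT_SHARE_OF_GOOGLE = 0.10     # "a tenth of Google's query volume"

queries_per_day = GOOGLE_QUERIES_PER_DAY * CHATGPT_SHARE_OF_GOOGLE
daily_cost = queries_per_day * COST_PER_QUERY_USD
annual_cost = daily_cost * 365

print(f"Queries per day:       {queries_per_day:,.0f}")
print(f"Compute cost per day:  ${daily_cost:,.0f}")
print(f"Compute cost per year: ${annual_cost / 1e9:.1f} billion")

# Rasgon's headline figures ($48 billion of GPUs up front, ~$16 billion a
# year on chips) come from his own, more detailed model; this sketch only
# shows the order of magnitude that per-query pricing implies at that scale.
```

With these assumed inputs the sketch yields on the order of $12 billion a year in inference costs alone, which is why a cheaper in-house chip is attractive.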
NVIDIA currently dominates the market for chips designed for artificial-intelligence workloads: the Microsoft supercomputer that OpenAI used to develop its technology, for example, runs on 10,000 NVIDIA GPUs. That dominance is one reason other companies have started developing their own processors. According to The Information, Microsoft has been working on its own AI chip, codenamed Athena, since 2019, and OpenAI is said to be testing the technology.
OpenAI has not yet decided whether to go ahead with the plan, Reuters reports. And even if it does, it could be years before the company starts using its own chips.
Source: www.engadget.com