OpenAI, the creator of the hugely popular ChatGPT, could be looking at designing its own chips to power its AI-driven infrastructure. Here's what we know so far.

OpenAI Could Develop Its Own AI Chipsets in the Future: Report

OpenAI, the Microsoft-backed company behind ChatGPT, is reportedly considering making its own AI chipsets and has even evaluated a potential acquisition target.

According to Reuters, the company is still in the planning phase and has not yet decided whether to proceed. But OpenAI has been hampered by the cost and scarcity of the chips its AI ambitions require, and has been looking for ways to address those shortcomings.

To that end, Reuters notes that OpenAI is considering working more closely with chipmakers such as NVIDIA, and may also diversify its suppliers in the future. OpenAI CEO Sam Altman has previously said openly that the company is having a hard time acquiring the GPUs needed to run its infrastructure, a market NVIDIA currently dominates.

The alleged move may also help OpenAI reduce its operating costs. Running ChatGPT is very expensive for the company: each query costs about 4 cents, according to an analysis by Bernstein analyst Stacy Rasgon. If ChatGPT queries were to grow to one-tenth the scale of Google search, it would require roughly $48.1 billion worth of GPUs up front and about $16 billion worth of chips per year to keep running, Reuters reports.

Earlier this year, SemiAnalysis told The Information that running ChatGPT can cost OpenAI up to $700,000 a day. And as the company gradually shifts users from GPT-3.5, the model the free version of ChatGPT currently uses, toward GPT-4, things could get even more expensive for OpenAI.
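As a rough illustration of how these figures fit together, the sketch below is a back-of-envelope calculation using only the two estimates cited above (about 4 cents per query and up to $700,000 per day). The daily query volume it derives is merely an implication of those estimates, not a number reported by Reuters, Bernstein, or SemiAnalysis, and it says nothing about the larger "one-tenth of Google search" growth scenario.

```python
# Back-of-envelope check: what daily query volume is implied by the
# cited per-query cost and the cited daily running cost?
# Both inputs are third-party estimates quoted in the article, not measured values.

cost_per_query_usd = 0.04      # ~4 cents per query (Bernstein estimate)
daily_cost_usd = 700_000       # up to $700,000 per day (SemiAnalysis estimate)

implied_queries_per_day = daily_cost_usd / cost_per_query_usd
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")
# -> Implied queries per day: 17,500,000
```

Taken at face value, the two estimates imply on the order of 17.5 million queries a day; the point is only that the figures are mutually consistent, not that this is OpenAI's actual traffic.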

Given all of this, building its own chips could be a natural move for OpenAI. But Reuters points out that even if the company goes ahead, the effort could take years.
