This is how the future starts — not with a bang, but with a glitch.
In what is hopefully not an omen for Elon Musk’s artificial intelligence (AI) startup xAI, the company’s Twitter Spaces introduction to the general public on Friday (July 14) was delayed roughly 10 minutes by technical difficulties.
“We need to tweak the algorithm for higher immediacy,” Musk explained to the more than 40,000 listeners hoping to learn more about the latest entrant in the AI ecosystem.
Prophetic? Only time will tell.
But the eccentric billionaire didn’t take long before digging into the goals behind his latest venture, whose stated purpose is to understand reality.
“An AI powered by 10,000 GPUs still can’t write a better novel than a human using just 10 watts of brain power — that’s a difference of six orders of magnitude, where two orders of magnitude can be accounted for by the difference between a synapse and a transformer, but what are the other four?” Musk said.
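The quote rewards a quick sanity check. The snippet below is a back-of-envelope illustration of where a figure like “six orders of magnitude” could come from; the per-GPU power draw is an assumed round number for illustration, not anything Musk or xAI specified:

```python
import math

# Back-of-envelope check of the "six orders of magnitude" power gap.
# The per-GPU wattage is an assumption for illustration (not an xAI figure):
# roughly 1 kW per data-center GPU once system overhead is included.
gpu_count = 10_000        # cluster size Musk cites
watts_per_gpu = 1_000     # assumed draw per GPU, overhead included
brain_watts = 10          # Musk's figure for the human brain

cluster_watts = gpu_count * watts_per_gpu     # 10,000,000 W
ratio = cluster_watts / brain_watts           # 1,000,000x

print(f"power ratio: {ratio:,.0f} (~10^{math.log10(ratio):.0f})")
# -> power ratio: 1,000,000 (~10^6), i.e. six orders of magnitude
```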
Gaining clarity on those other four orders of magnitude, he explained, is what his new company hopes to do: by solving for both what intelligence is and how it can be replicated, xAI aims to effectively solve for everything downstream of that (for now) entirely human capability.
It is a goal as lofty and ambitious as it is undefined.
The company is led by Musk and includes 11 AI researchers, one of whom used to be a dubstep DJ.
Read Also: It’s a ’90s Browser War Redux as Musk and Meta Enter AI Race
But despite the musical ambitions of their earlier lives, xAI’s small, apparently all-male team has already contributed some of the most widely used methods in the AI field and led the development of some of its biggest breakthroughs.
That’s why Musk said the startup is determined to remain “relatively small, with the best people.”
Around half of them previously worked at Google, and others are drawn from OpenAI, Microsoft Research, Tesla and the University of Toronto.
As PYMNTS reported, Google has been suffering a slow but steady exodus of some of its most talented researchers and scientists, including all eight of the employees who created the groundbreaking Transformer architecture that made OpenAI’s ChatGPT and nearly all of today’s AI models possible.
Musk helped co-found OpenAI in 2015 and, during the Spaces discussion, spoke about its founding as a foil to Google’s AI ambitions. He left the organization in 2018 and has grown increasingly critical of the business since, particularly as it has tied its fortunes to Microsoft, another subject he touched on during the conversation.
OpenAI’s current CEO Sam Altman has already said that new ideas, not bigger models, will be what evolves AI beyond its present capabilities and pushes the still-nascent industry forward.
Musk, bad blood or not, agrees.
He admitted during the discussion that while he sees xAI in direct competition with larger businesses, including OpenAI, Microsoft, Alphabet and Meta, as well as ambitious upstarts like Anthropic, his firm is taking a different approach to establishing its foundation model.
“AGI (artificial general intelligence) being brute forced is not succeeding,” he said, claiming that while “xAI is not trying to solve AGI on a laptop, [and] there will be heavy compute,” his team will have free rein to explore ideas other than simply scaling up the foundation model’s parameters and training data.
After all, distinct and competitive business models, at least in the AI ecosystem, are born from variations in the architecture of an AI platform’s foundation model.
Read Also: Peeking Under the Hood of AI’s High-Octane Technical Needs
While Musk has no shortage of ambition for his AI platform, he did emphasize that he sees a critical shortage of key resources hamstringing the AI sector in the coming years, from silicon to GPUs to engineering talent.
It was reported in April that Musk was quietly amassing the processors and engineers to create AI tech to rival larger players.
He demurred when asked about the specifics of any synergies xAI will enjoy between the resources of Musk’s other companies, including Twitter and Tesla, although xAI’s website says it will work closely with them.
Twitter data comprises the platform’s history of conversations between users and is particularly well suited for training large language models (LLMs).
“Every LLM company is already stealing Twitter’s data,” Musk said.
As far as regulating AI goes, Musk emphasized a need for a “referee,” ideally an international one.
He suggested that some kind of industry group or self-regulatory body, similar to the Motion Picture Association, would be useful, while admitting that the headcount of xAI’s legal department was currently “zero.”
While generative AI is a powerful tool, the technology’s applications boil down to a few broad buckets: research co-pilot, writing assistant or task-focused enterprise enhancement solution.
It will be interesting to see what xAI can add to the ecosystem.
“Pace of innovation is the competitive advantage here, not what has been created so far,” Musk said. “That’s the defense against competition.”