Nvidia CEO Jensen Huang showed up at a gathering of artificial intelligence researchers in Long Beach, Calif. this week with a couple of big surprises.
One was an orchestral piece inspired by music from the Star Wars movies, but composed by an AI program from Belgian startup AIVA that—of course—relies on Nvidia chips. The music went over big with the crowd of AI geeks attending the Neural Information Processing Systems Conference, known as NIPS, including some giants in the field like Nicholas Pinto, head of deep learning at Apple, and Yann LeCun, director of AI Research at Facebook.
LeCun was quoted saying the Star Wars bit was “a nice surprise.”
Huang’s other surprise was a bit more practical, and showed just how competitive the AI chip market niche has become. Analysts say sales of specialized microprocessors for use with AI programs like machine learning and image recognition will grow astronomically, from about $500 million last year to $20 billion to $30 billion within five years. As the kinds of graphics chips that were first popular with video gamers have turned out to be among the best-suited for AI programming, too, Nvidia is in the lead, chased by Intel, Advanced Micro Devices, and a host of others. Intel is developing AI-focused chips from its 2016 acquisition of Nervana Systems, while AMD has an entirely new chip design tuned for AI apps, called Vega, on the market this year.
Hoping to stay at the front of the pack, Huang had a new graphics card called the Titan V to show off in Long Beach. Suitable for installing in ordinary PCs, the Titan V contains Nvidia’s latest “Volta” chip design with some 21 billion individual transistors. Priced at $3,000, the Titan V comes just eight months after Nvidia unveiled its Titan Xp design based on its earlier “Pascal” chips. The new card is a little more than double the price but offers up to nine times faster performance on major AI apps like Google’s TensorFlow software, Amazon’s MXNet, and Facebook’s Caffe2, Nvidia said. The older Nvidia card competed head-on with AMD’s similarly priced Vega Frontier Edition, but the Titan V may prompt the need for a higher-priced, higher-performing AMD countermove if it catches on.
Chip industry analyst Patrick Moorhead, president of Moor Insights & Strategy, said the Titan V offered an “impressive value proposition,” as in some ways it could keep up with the performance of Nvidia’s highest-end Tesla V100 card that sells for $10,000 and is aimed at server computers, especially in cloud data centers. Both cards include “Volta” chips and crunch AI calculations at almost the same rate. But the more expensive card has the capability to exchange data with other, surrounding chips at a much faster rate, making it a better choice for servers that often contain four or more graphics cards.
That’s a pretty clear market distinction, according to Moorhead. The Titan V could go in a high-powered PC used by an AI scientist on a local project, while the Tesla V100 will be good for servers that handle the immense app workloads that power the voice recognition abilities of digital assistants from Google or Amazon, for example. “I don’t expect a lot of cannibalization as Tesla is more for production workloads and Titan V is more for research,” Moorhead said.
Nvidia wasn’t the only company making news at the NIPS conference. Tesla founder Elon Musk was reportedly hanging around, too. At one after-hours event, Musk said Tesla was developing its own custom AI chips, tech news site The Register reported. Musk last year hired chip design legend Jim Keller, who had been at AMD for years, sparking speculation that the electric carmaker would go that route to enhance the self-driving capability of future vehicles.
“Jim is developing specialized AI hardware that we think will be the best in the world,” Musk said somewhat cryptically at the party, according to the tech news site.
(Update: This story was updated on Dec. 8 to correct that the Titan V card speeds up performance of major AI apps, not just Google’s TensorFlow.)