You probably wouldn’t have thought much of the DoubleTree hotel off Highway 101 in San Jose, California.
The hotel was comfortable, but not luxurious. And while the ballroom could accommodate a few hundred guests, it was a far cry from the stadium-like event rooms you’d see in Las Vegas.
But yesterday, in this modest hotel, some of the most important discussions around the future of artificial intelligence were taking place.
Yesterday, I attended the Andes RISC-V Con 2023. It’s a free, one-day conference on AI, data centers, and open-source software.
I’m a firm believer in “boots on the ground” research. After all, there’s only so much you can learn from behind a desk. And believe it or not, I actually prefer modest little conferences like these.
The big, flashy Las Vegas events like CES, hosted by the Consumer Technology Association, tend to get all the attention. But in my experience, they’re often treated more like parties than industry conferences. The best conversations are happening at small, out-of-the-way conferences like this one.
After all, if you’re willing to travel to a nondescript hotel on a Tuesday with no afterparty or casinos, you must really care about the technology. Today, I’d like to share some of the insights I gleaned from the event.
One of the speakers was Raja Koduri, who has spent time working at Intel, AMD, and Apple. His message was clear.
The personal computing era belonged to x86. Intel and AMD built their processors on the x86 design, and roughly nine out of ten PCs run on it. But x86 has its limitations.
In mobile, Arm chips won out because they struck a better balance between performance and power consumption.
But one of the most important things I took away was that the AI era will belong to RISC-V (pronounced “risk-five”).
RISC-V is an open-source instruction set for computer chips, developed at the University of California, Berkeley in 2010. It’s free to use and expand upon.
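To make “instruction set” concrete: it’s simply the vocabulary of basic operations, such as add, load, and branch, that a chip knows how to execute. As a rough, illustrative sketch (my example, not something shown at the conference), here is a trivial C function alongside the kind of RISC-V instructions a compiler might generate for it on a 32-bit target:

    /* Illustrative only: what a compiler might emit for a 32-bit RISC-V chip. */
    int sum(int a, int b) {
        /* Typical RISC-V output under the standard calling convention:
         *   add a0, a0, a1   # add the two arguments; result lands in register a0
         *   ret              # return to the caller
         */
        return a + b;
    }

Whoever controls that vocabulary controls what software can run on the chip, and that’s the real significance of RISC-V being open.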
RISC-V is an alternative to x86, which is controlled by Intel and AMD, and to Arm, whose designs are found in roughly 95% of smartphones. You might remember, I wrote about Arm in detail yesterday.
That leaves large chip makers in control of the tech hardware ecosystem. They charge a license fee for access to their tech.
This is great for the shareholders of chip makers. But it can slow the pace of innovation.
Raja Koduri believes that’s holding AI development back. As he put it, x86, Arm, and Nvidia operate in “a black box.” In other words, all the development happens behind closed doors.
RISC-V developers are aiming to change this by making chip designs open source. And some speakers at the conference believe that shift is coming soon.
Jim Keller has worked at AMD, Tesla, and Apple. He sees RISC-V taking over the market in the next five to ten years.
For that to happen, RISC-V needs to be used in personal computers, mobile phones, and data center servers.
The only thing holding RISC-V back from these markets is software.
Closed ecosystems like x86 and Arm continue to thrive largely because their software stacks are so well developed.
Open-source hardware needs a community to build that software around it. Until recently, that community was limited to tech enthusiasts toiling away in their spare time.
But RISC-V recently found corporate support…
Historically, if a device maker wanted to build an Android device, like a smartphone or computer, it had to rely on chips built around Intel’s x86 or Arm’s licensed designs.
In January, Google announced that Android will support RISC-V. Google is building out the software support Android developers need. Once complete, it will cut the time it takes to get applications running on RISC-V chips.
That means device makers won’t be limited to x86 or Arm designs… or stuck paying their licensing fees.
Meta is making similar moves. In May, it announced a custom accelerator chip, the MTIA v1, based on RISC-V.
The MTIA v1 handles AI tasks such as content understanding and ad ranking.
Here’s what I want you to understand: the future development of artificial intelligence will not happen exclusively behind the “walled gardens” of the largest tech companies. In some ways, the open-source nature of RISC-V is reminiscent of the development of the internet.
It was only because of freely available protocols like HTTP that developers were able to build out the modern internet. If those protocols had been “locked away” inside some massive corporation, adoption would have been years slower. That’s what has me most excited about artificial intelligence.
RISC-V will give smaller developers and device makers the chance to build on innovative ideas.
This won’t happen overnight. But Cerebras CEO Andrew Feldman said it best, “You don’t beat the big guy in one big bite. You iterate slowly and catch up.”
Future innovations in artificial intelligence won’t take decades to arrive. They’ll be measured in years, even months. This tells me the investment opportunity in artificial intelligence is not only massive, but urgent as well.
But that’s not all I learned from the conference. Be sure to tune in tomorrow, when I’ll share the three biggest threats to Nvidia’s AI dominance.
Regards,
Colin Tedards
Editor, The Bleeding Edge
Like what you’re reading? Send your thoughts to feedback@brownstoneresearch.com.
The Bleeding Edge is the only free newsletter that delivers daily insights and information from the high-tech world as well as topics and trends relevant to investments.