2023-10-11

Harnessing the RISC-V Wave:
The Future is Now

Florian Zaruba | Technical CPU Lead at AXELERA AI

RISC-V is inevitable – it has become the mantra of the RISC-V community, and it’s true. But before we see why, let’s step back and discuss what RISC-V is and why we should care.

Back in the early days, computers were large and bulky, taking up entire rooms and making a lot of noise.

It’s incredible to think about how much technology has advanced over the years. Nowadays, we’re surrounded by microprocessors that make our lives easier without us even realizing it. For example, when we use our bank cards to pay at the supermarket, we’re actually using a full computer that encrypts, signs and verifies our transactions. And our phones contain dozens and dozens of processors that make all the conveniences of modern life possible.

What is RISC-V?
RISC-V (named for the fifth generation of Reduced Instruction Set Computer designs) is an open-standard Instruction Set Architecture (ISA). You may have heard of other well-known ISAs such as x86, Arm, Power, and MIPS.

The RISC-V ISA is the common, standardized language between the processor (hardware) and the programs running on it (software). With RISC-V, the processor itself can only do very basic things, such as (conditionally) adding or subtracting two numbers and deciding what to do based on those arithmetic outcomes. It can also repeat those operations until a certain condition is reached. In fact, there are only 37 instructions in the base instruction set. But these are the basic ingredients needed to describe essentially any problem to the processor.
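A small sketch of what that means in practice: summing an array in C needs nothing beyond those basic ingredients. Compiled for the 32-bit base ISA (RV32I), it boils down to roughly a load (lw), adds (add/addi), and a branch back to the top of the loop.

    #include <stdint.h>

    /* Summing an array only needs base-ISA ingredients: loads (lw), adds
     * (add/addi), and a compare-and-branch (blt/bne) to close the loop. */
    int32_t sum(const int32_t *a, int n) {
        int32_t s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }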

We call those commands instructions.
By defining the exact meaning (semantics) and spelling (syntax) of those instructions, you can build a common understanding between the processor and the software running on it. You can think of the ISA as a dictionary. The RISC-V ISA specification is freely available for anyone to read. This is incredibly important because today’s software has become so complex that no single company can manage it alone, and a common standard ensures interoperability.
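To make the dictionary metaphor a bit more concrete, here is a minimal sketch (an illustration, not text from the specification) that pulls apart one 32-bit RISC-V instruction word. The fixed bit positions are the spelling; what the processor is obliged to do with those bits is the meaning.

    #include <stdint.h>
    #include <stdio.h>

    /* Decode the I-type fields of one RV32I instruction word.
     * Bit layout from the base encoding:
     * [6:0] opcode, [11:7] rd, [14:12] funct3, [19:15] rs1, [31:20] imm. */
    int main(void) {
        uint32_t insn   = 0x00A50513u;           /* addi a0, a0, 10 */
        unsigned opcode = insn & 0x7Fu;          /* 0x13 -> OP-IMM group */
        unsigned rd     = (insn >> 7)  & 0x1Fu;  /* destination register */
        unsigned funct3 = (insn >> 12) & 0x07u;  /* 0 -> ADDI */
        unsigned rs1    = (insn >> 15) & 0x1Fu;  /* source register */
        int32_t  imm    = (int32_t)insn >> 20;   /* sign-extended immediate */

        printf("opcode=0x%02x rd=x%u funct3=%u rs1=x%u imm=%d\n",
               opcode, rd, funct3, rs1, imm);
        return 0;
    }

Run it and it prints opcode=0x13 rd=x10 funct3=0 rs1=x10 imm=10 – in other words, “add 10 to register a0”.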

RISC-V came to the right place, by the right people, at the right time
David Patterson, a visionary in the field of computer architecture, spearheaded the development of Reduced Instruction Set Computers (RISC) at Berkeley. The project went through five iterations, which ultimately led to the creation of the RISC-V architecture.

Initially, RISC-V mainly piqued academic curiosity. However, the availability of open-source processor implementations, coupled with rising geopolitical tensions, quickly drew industrial interest as well. From that point, the project experienced a snowball effect, gaining momentum and broader adoption.

Open source breeds innovation
The primary advantage of RISC-V lies in its licensing flexibility. Although the specifications for other ISAs are publicly available, legal constraints prevent you from implementing and selling processor hardware unless you have obtained a license for the ISA specification. With RISC-V, this limitation is removed, offering greater freedom for innovation and commercialization.

These licensing limits put most ISAs out of reach for smaller innovators. For example, the widely used x86 ISA is implemented by only a handful of licensees. Arm, another well-known provider of processor IP, has an entirely different business model: it sells processor IP that can be licensed and integrated into your product. Only the biggest players in the market can afford an architectural license that allows them to implement their own processors.

Most notable of these is Apple, which switched its entire product line from Intel x86 processors to its own Arm implementation. Their investment in this endeavor proved to be a wise decision both technically and economically, as it enabled them to distinguish their product line even further and surpass their rivals by a significant margin.

In the past, designing your own processor and utilizing the expansive software ecosystem was thought to be reserved solely for large corporations. However, that’s not entirely true. Alternatives like OpenRISC existed, but they weren’t widely adopted because their software ecosystems were underdeveloped, and they were often considered hobbyist projects.

Another notable benefit of RISC-V is its built-in modularity, which contrasts with Arm’s more rigid structure. The RISC-V architecture is designed from the ground up to be modular, allowing for easier customization and innovation. While Arm tightly controls and protects its ISA specification, making it difficult to introduce your own specialized features, RISC-V offers the flexibility to add custom extensions and your “secret sauce.”

This is particularly advantageous if you aim to target specific market segments and differentiate your product from competitors. Remember those 37 instructions I mentioned? It turns out that’s only the base instruction set. For most applications, you will likely want to include more specialized instructions, such as floating-point support, atomic memory operations, or maybe even the scalable vector extension. You don’t have to, but you can. In the same way, your company can add its own instructions if they benefit the application you have in mind. This means you can still leverage the entire software ecosystem that has been and is being built, and only add value on top instead of reinventing the wheel. So let’s have a look at some of these advantages.
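As a purely hypothetical sketch of what “your own instructions” can look like from the software side: the RISC-V GNU assembler lets you emit instructions in the reserved custom opcode space with the .insn directive, so the rest of your toolchain does not even need to know about your secret sauce. The operation below is invented for illustration; only the encoding mechanism is standard.

    #include <stdint.h>

    /* Hypothetical custom R-type instruction in the "custom-0" opcode space
     * (0x0b). The operation itself is made up for illustration; it will only
     * do something useful on a core that actually implements it. */
    static inline uint32_t my_custom_op(uint32_t a, uint32_t b) {
        uint32_t result;
        __asm__ volatile (
            ".insn r 0x0b, 0, 0, %0, %1, %2"  /* opcode, funct3, funct7, rd, rs1, rs2 */
            : "=r"(result)
            : "r"(a), "r"(b));
        return result;
    }

Because the custom opcode spaces are reserved for exactly this purpose, an extension like this should not collide with future standard extensions.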

How can RISC-V help start-ups design their own chips?
For me, the two most important aspects for a hardware-producing start-up are time-to-market and freedom to innovate.

Most silicon/hardware start-ups innovate and try to provide value-add in their niche. The processor is a necessary, boring commodity and usually not the main star of the product. This is good and the way it should be. Otherwise, we would all be selling the same processors.

RISC-V is an absolute game changer for that.

Do you want to quickly try out an integration idea? Prototype something on an FPGA? Before RISC-V, this either meant using an FPGA vendor’s proprietary soft-core ISA (which you couldn’t take to silicon for your final product) or starting negotiations with the processor vendor of your choice. Negotiations are hard for a small silicon start-up. With RISC-V, you can grab a core from the internet, run your trials, and start building your software stack. Once you’re happy with the prototype and want to move forward, there is a plethora of RISC-V IP vendors to choose from. Choice and market competition are a good thing.

Expanding on the points raised earlier, the freedom to innovate with RISC-V comes without the risk of architectural lock-in. This has two advantages. First, you are free to switch IP vendors: your software stack will keep working because you adhered to an open standard. Second, you can layer your own extensions on top of the main RISC-V standards, providing very concentrated value-add without having to maintain or develop commodity components.

Let’s have a look at what we’ve done at Axelera.
At the heart of our Metis AI Processing Unit, there are four AI Cores. Each AI core provides a 512×512 matrix-vector multiplication (MVM) in-memory compute array and a vector datapath that operates on streams of data.

To provide a generic solution that can keep up with the fast evolution in the field of neural networks, we kept the datapath control as flexible as possible. We aim to push as much as possible into low-level driver software, where we can innovate, correct, and adapt throughout the product’s life cycle. Therefore, each AI Core has a dedicated application-class RISC-V core with full control over the datapath unit. A fifth core serves as a system controller, providing SoC management and dispatching workloads to the individual AI cores. We confidently chose the RISC-V ISA because we could start virtual prototyping and building our software stack on free and open software well before any IP negotiations. Furthermore, we knew that the IP landscape would provide us with sufficient competitive choices for our needs.
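To give a flavour of what “full control over the datapath from driver software” means in practice, here is a deliberately generic sketch. This is not Axelera’s actual driver; the register names, offsets, and protocol are invented purely to illustrate the pattern of an embedded RISC-V core steering an accelerator through memory-mapped registers.

    #include <stdint.h>

    /* Illustration only, not Axelera's driver: the register map below is
     * invented to show the general pattern of a RISC-V core steering an
     * accelerator datapath through memory-mapped I/O. */
    #define DATAPATH_BASE 0x40000000u                 /* hypothetical MMIO base */
    #define REG(off) (*(volatile uint32_t *)(DATAPATH_BASE + (off)))

    enum { REG_SRC = 0x00, REG_DST = 0x04, REG_LEN = 0x08,
           REG_CTRL = 0x0C, REG_STAT = 0x10 };

    /* Program one streamed operation and wait for the (hypothetical) done flag. */
    static void run_layer(uint32_t src, uint32_t dst, uint32_t len) {
        REG(REG_SRC)  = src;    /* where the input activations live */
        REG(REG_DST)  = dst;    /* where the results should go */
        REG(REG_LEN)  = len;    /* how many elements to stream */
        REG(REG_CTRL) = 1u;     /* start */
        while ((REG(REG_STAT) & 1u) == 0u)
            ;                   /* busy-wait; a real driver would use interrupts */
    }

The heavy lifting stays in the MVM array and the vector datapath; the RISC-V core only orchestrates, which is what keeps the control flexible enough to follow the field.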

In roughly a year, we went from concept/FPGA prototype to tape-out, and just a couple of days after receiving the chips back, we could run the first end-to-end neural networks on our lab benches. Taking the limited resources and this aggressive schedule into account, it was clear that we would need to buy proven IP without long negotiations to kick-start our development.

Choosing to proceed with RISC-V ensured that we could continue to innovate in future generations and further distinguish our offerings.

So, is it all love, peace, and harmony? Not exactly. Like anything that’s still a work in progress, RISC-V has its own set of hiccups and curveballs.

An ISA is mainly about instructions, but a processor is part of a bigger system that includes memory, peripherals, and other processors. This broader setup, often called a platform, isn’t covered in the main ISA spec. RISC-V International (formerly the RISC-V Foundation) aims to standardize these aspects too.

However, details matter. Things like atomic handling in multi-hart systems may only be standardized for specific platforms. Also, some IP providers might have legacy elements because they developed solutions before RISC-V standards were set. Standards are scattered, so you have to keep an eye out. And given the rapid growth, it’s tough to stay updated on all the ongoing activities.
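A small illustration of why those details matter: the A extension gives you atomic memory operations at the ISA level, but whether the snippet below actually behaves correctly across harts depends on the platform – the cores, the caches, and the interconnect all have to honour it.

    #include <stdatomic.h>

    /* A counter shared between harts (hardware threads). With the A extension,
     * the fetch-add below typically compiles to a single amoadd.w; without it,
     * the compiler falls back to a library routine. Either way, the platform
     * has to guarantee the operation is atomic system-wide. */
    static _Atomic int shared_counter;

    void hart_tick(void) {
        atomic_fetch_add_explicit(&shared_counter, 1, memory_order_relaxed);
    }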

Looking ahead
Remember those room-filling mainframes and how technology has continuously shrunk down to fit our pockets and wrists? Just like those evolutionary leaps, RISC-V aims to be a cornerstone in the next phase of computational progress.

In an ideal future, RISC-V will be as “boring” as any well-established technology—boring in the sense that it’s reliable, stable, and so embedded in our daily lives that we take it for granted. The long-term vision is for software to run seamlessly on RISC-V-based processors without users having to worry about compatibility issues.

In the short term, however, there’s still much to be done. While RISC-V is showing promise in deeply embedded systems like smartcards and IoT devices, achieving the same level of support and efficiency as Arm and x86 in more user-centric applications—like your phone or computer—is an ongoing project. Industry trends, as evidenced by companies like SiFive and Ventana and initiatives like the RISE project, suggest a competitive and collaborative future for RISC-V. From a research standpoint, its open nature makes it a fertile ground for innovation across the entire hardware-to-software stack.

There’s a palpable sense of momentum behind RISC-V, with heavyweight stakeholders and top-tier companies pooling their resources and expertise. Far from slowing down, the ecosystem around RISC-V is only gaining speed, setting it on a course to become as ubiquitous and “boring” as the microprocessors that have come to define modern life.

So, looking at where things are going, ignoring RISC-V would be like missing the boat when we switched from those giant mainframes to desktop PCs. It’s a game-changer, and you really don’t want to be left behind.