Do Necessary Tools Exist For RISC-V Verification?

Existing tools can be used for RISC-V, but they may not be the most effective or efficient. What else is needed?

Semiconductor Engineering


Semiconductor Engineering sat down to discuss the verification of RISC-V processors with Pete Hardee, group director for product management at Cadence; Mike Eftimakis, vice president for strategy and ecosystem at Codasip; Simon Davidmann, founder and CEO of Imperas Software; Sven Beyer, program manager for processor verification at Siemens EDA; Kiran Vittal, senior director of alliances partner marketing at Synopsys; Dave Kelf, CEO of Breker Verification; and Hieu Tran, president and CTO of Viosoft Corporation. What follows are excerpts of that conversation. To view part one of this discussion, click here.

SE: What does a RISC-V verification flow look like?

Kelf: We see the verification of processors as a stack of activities, but there are lots of loopbacks in that stack. A lot of companies treat conformance to the ISA as a separate activity. They will do a 'Hello World' test as a first step to make sure things are up and running, then they will run as many conformance tests as they can. They try to match the ISA, and then they start testing the micro-architecture. A lot of people stop there. What we see happen is that as they run the conformance tests, they may appear to get conformance, and then they start writing the micro-architecture tests, find some things that are broken, and realize that conforming to an ISA is a lot more complex than just making sure the instructions run properly. As they go further up the stack, they get into, 'Can we verify the cores as they relate to the rest of the system? Can we boot an OS on it? Does it have the necessary performance? Can we profile the design to make sure the performance is correct, and start those validation activities to reveal more bugs in the verification?'

When we are testing a regular ASIC or regular core, you can run through all the verification activities, get really good coverage, and then do the validation at the end. You often do not have to go back to verification. With these processors, you do. You have to go backward and forward through that verification stack all the time. This really slows down the whole verification of that architecture.

The golden reference model is becoming critically important. The Imperas models are recognized as state-of-the-art, industry-standard models by a lot of people, and we have been working with those models. Bringing in a really solid core reference model that you can rely on is becoming a critical thing. You can test the micro-architecture and some of the interaction with the rest of the system against that core model, and really get a clearer picture of what is going on in the actual processor.
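To make the golden reference model idea concrete, the sketch below shows, in highly simplified form, the kind of lock-step comparison such flows typically perform: the processor's retired-instruction trace is checked against the trace produced by a reference instruction-set simulator, instruction by instruction. This is an illustrative Python sketch only; it is not the Imperas API or any vendor's tooling, and all class and field names here are hypothetical placeholders.

```python
# Hypothetical sketch of trace comparison against a golden reference model.
# Not any vendor's API; names and fields are illustrative assumptions.

from dataclasses import dataclass
from typing import Iterable, List, Optional


@dataclass(frozen=True)
class RetiredInstr:
    """One retired instruction: PC, raw encoding, and any register writeback."""
    pc: int
    encoding: int
    rd: Optional[int] = None        # destination register index, if any
    rd_value: Optional[int] = None  # value written to rd, if any


def compare_traces(dut: Iterable[RetiredInstr],
                   ref: Iterable[RetiredInstr]) -> List[str]:
    """Lock-step compare of DUT vs. reference traces; returns mismatch reports."""
    mismatches = []
    for step, (d, r) in enumerate(zip(dut, ref)):
        if d.pc != r.pc:
            mismatches.append(f"step {step}: PC mismatch DUT={d.pc:#x} REF={r.pc:#x}")
        elif d.encoding != r.encoding:
            mismatches.append(f"step {step}: encoding mismatch at PC {d.pc:#x}")
        elif (d.rd, d.rd_value) != (r.rd, r.rd_value):
            mismatches.append(f"step {step}: writeback mismatch at PC {d.pc:#x}")
    return mismatches


if __name__ == "__main__":
    # Tiny illustrative traces: addi x1,x0,5 then addi x2,x1,1.
    # The DUT writes the wrong value on the second instruction.
    ref_trace = [RetiredInstr(0x8000_0000, 0x00500093, rd=1, rd_value=5),
                 RetiredInstr(0x8000_0004, 0x00108113, rd=2, rd_value=6)]
    dut_trace = [RetiredInstr(0x8000_0000, 0x00500093, rd=1, rd_value=5),
                 RetiredInstr(0x8000_0004, 0x00108113, rd=2, rd_value=7)]
    for msg in compare_traces(dut_trace, ref_trace):
        print(msg)
```

In real flows this comparison usually happens live inside the simulation environment rather than on offline traces, so a divergence between the core and the reference model is flagged at the instruction where it first occurs, but the principle is the same.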


To read the full Semiconductor Engineering article by Brian Bailey, click here.
