Model-Based Design Accelerates Functional Verification
Using Model-Based Design lets the development team move through the process in a systematic way. It also minimizes verification headaches later.
Functional verification consumes 50% to 65% of the development time and budget for today’s system-on-chip (SoC) projects. With fragmented tools and development methodologies that rely on manual scripting, collections of disparate verification tools, inter-tool incompatibilities around almost-standard transaction languages, and mismatched database sources, functional verification can become a project in itself, deflecting attention from the main development goals. Instead of optimizing and debugging product designs, engineers spend time debugging their workflow and compensating for gaps in their verification tool chain.
Chip-level functional verification must be examined in a larger perspective. Gary Smith identified this issue succinctly prior to the 2006 DAC: “It’s the software, stupid.” He and Daya Nadamuni pointed out the critical, integral role of software in today’s SoC projects. “You can have elegant algorithms, first-pass silicon, and fancy intellectual property (IP),” said Nadamuni. “But without software, the product goes nowhere.”
Most handheld devices today illustrate this larger perspective, as do the development challenges in Software Defined Radio (SDR) and its siblings, the Joint Tactical Radio System (JTRS) and Cognitive Radio. Given the onsite reconfiguration intelligence demanded by these systems, one could call them the perfect storm of hardware/software co-design challenges.
The question you have to ask is, “Is each task an expense or is it an investment?”
Figure 1: Model-Based Design workflow
In Model-Based Design, the system model is an executable specification that becomes the gold standard for design and verification activities. During the design phase, this model is shared among the various design teams, including systems, software, and hardware engineers. Each team uses the model to elaborate its components, all the while ensuring that the design intent of the original system is still maintained. Figure 1 shows a typical model-based design workflow.
Model-Based Design proves essential to merging the efforts of verification and design. The model of the system and its operating environment, minus the component being designed, becomes the test bench and analysis environment for each component. This approach compensates for the difficulty in verifying market-leading systems that lack standards-based, well-defined interfaces between internal components and subsystems. It also eliminates nonproductive effort often expended to create tests for individual blocks and interfaces.
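The idea that the system model, minus the component being designed, becomes that component's test bench can be sketched in a few lines of Python. This is a deliberately simplified stand-in for a Simulink-style block diagram, not any tool's actual API; every block name and the shift-based hardware realization are hypothetical.

```python
# A toy "system model" as a chain of blocks. Swapping one spec block for its
# implementation turns the remaining spec blocks into the test bench.

def source(n):
    """Stimulus generator from the system model (deterministic test pattern)."""
    return [((i * 37) % 17) - 8 for i in range(n)]

def gain_spec(xs):
    """Executable specification of the gain block."""
    return [2 * x for x in xs]

def clip_spec(xs):
    """Executable specification of the saturation block."""
    return [max(-10, min(10, x)) for x in xs]

def gain_impl(xs):
    """Component under design: a cheaper shift-based hardware realization
    that must preserve the design intent of gain_spec."""
    return [x << 1 for x in xs]

def run_chain(blocks, stimulus):
    """Run the block diagram end to end."""
    data = stimulus
    for block in blocks:
        data = block(data)
    return data

stimulus = source(32)
golden = run_chain([gain_spec, clip_spec], stimulus)
under_test = run_chain([gain_impl, clip_spec], stimulus)  # spec blocks act as the bench
assert golden == under_test
```

The point of the sketch is that no block-specific test was written: the surrounding specification blocks and the shared stimulus exercise the implementation in its system context.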
An engineer verifying HDL or embedded software typically asks: “Is my implementation good enough?” The critical follow-up question should be: “What is good enough?” The answer to this latter question is already contained in the algorithm specification created by the algorithm or systems engineer.
Using co-simulation, that executable specification is employed as a test harness by the verification team to ensure correct system functionality. The co-simulation allows algorithm engineers as well as the hardware and software engineers to work in their own specialized environments yet makes the impact of their design and implementation decisions clear to the entire team. The original effort associated with designing the executable algorithm specification is transformed from an expense into an investment with on-going payback.
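The question “what is good enough?” becomes concrete as a tolerance check against the executable specification. A minimal Python sketch (standing in for the article's MATLAB/Simulink co-simulation flow): a floating-point FIR filter plays the golden algorithm model, a fixed-point version plays the hardware implementation, and the tolerance comes from the specification. The filter, the 10-bit quantization, and the 1e-2 tolerance are all illustrative assumptions.

```python
import math

def spec_fir(x, coeffs):
    """Executable specification: floating-point FIR filter (golden reference)."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:
                acc += c * x[n - k]
        y.append(acc)
    return y

def impl_fir(x, coeffs, frac_bits=10):
    """Candidate implementation: fixed-point FIR, as HDL might compute it."""
    scale = 1 << frac_bits
    q = lambda v: round(v * scale)          # quantize to integers
    qc = [q(c) for c in coeffs]
    y = []
    for n in range(len(x)):
        acc = 0
        for k, c in enumerate(qc):
            if n - k >= 0:
                acc += c * q(x[n - k])
        y.append(acc / scale / scale)       # rescale back to real-valued output
    return y

def is_good_enough(spec_out, impl_out, tol):
    """'What is good enough?' -- the tolerance comes from the system spec."""
    return all(abs(s - i) <= tol for s, i in zip(spec_out, impl_out))

x = [math.sin(2 * math.pi * 0.05 * n) for n in range(64)]
coeffs = [0.25, 0.5, 0.25]
assert is_good_enough(spec_fir(x, coeffs), impl_fir(x, coeffs), tol=1e-2)
```

In a real flow the implementation output would come from an HDL or instruction-set simulator over a co-simulation link rather than from a Python function, but the comparison against the golden model is the same.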
Adopting Model-Based Design Incrementally: Component-Block Verification
Component-block-level development can be done independently and in parallel for high initial productivity. Algorithms can be individually explored and optimized, but component interdependencies are left untested.
Divide-and-conquer approaches worked well for functional verification in the past because the system building blocks were naturally separate and independent. In contrast, because of functional convergence within today’s products, the analog, digital, interface, and processor blocks in SoCs are extremely interdependent. Fully verifying any block requires exploring both its independent functions and its interactions with other parts of the design. Verifying the interactions across domain boundaries is where the divide-and-conquer approach critically fails. Figure 2 illustrates some of the difficulties involved.
Verifying the Design in the System Context
Figure 2: Systems are multidiscipline; your verification environment should be too.
No amount of component-block-level verification can explore the interfaces and dependencies between the various domains and uncover issues that could show up once the product is in actual use. Merging the algorithm blocks with the rest of the system is essential to prove out the design.
Broadcom used Model-Based Design to accelerate the development of the SPINNER family of WCDMA semiconductor products for 3G mobile devices, easing the challenge of translating the sheer complexity of the WCDMA system into a functioning piece of hardware. Broadcom used MATLAB and Simulink to model various subsystems, including the base station, a wireless channel model, and the handset. Each subsystem served as a unit test harness for the others. They were able to quickly partition the system model into architectural subsystems to enable the hardware designers to implement each piece more easily.
Evaluating the optimal implementation for portions of a system is often best done after the algorithms are defined and validated. The system architecture may need to be explored to identify the most appropriate target implementation.
For example, at Rice University in Houston, Brogioli, Radosavljevic, and Cavallaro explored the partitioning tradeoffs in a mixed DSP/FPGA implementation for 3.5G HSDPA mobile handsets and found performance gains of 90% with the appropriate partitioning.
This process of partitioning and implementing from the system model can be dramatically accelerated. Automatic code generation can be used to generate the embedded code for the processor and the HDL and test bench for the FPGA. Co-simulation links close the verification loop to compare each implementation against the golden reference system model. That original system model with the developed analysis code, system metrics, and other tests can be reused as the verification platform.
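Reusing the system model's analysis code as the verification platform often means applying the same numeric metric to implementation output that was used during algorithm design. A hedged Python sketch: a signal-to-error-ratio metric gates the comparison between the golden reference and output captured from generated code. The 12-bit rounding and the 60 dB threshold are invented for illustration; in practice both traces would come from simulators linked by co-simulation.

```python
import math

def snr_db(reference, candidate):
    """Analysis metric reused from the system model: signal-to-error ratio in dB."""
    sig = sum(r * r for r in reference)
    err = sum((r - c) ** 2 for r, c in zip(reference, candidate))
    return float('inf') if err == 0 else 10 * math.log10(sig / err)

# Golden reference output vs. output captured from generated code / HDL co-sim
# (both synthesized here for illustration).
golden = [math.cos(0.1 * n) for n in range(256)]
from_codegen = [round(g * 4096) / 4096 for g in golden]   # 12-bit fractional rounding

REQUIRED_SNR_DB = 60.0    # hypothetical system-level metric from the spec
assert snr_db(golden, from_codegen) >= REQUIRED_SNR_DB
```

Because the metric and threshold live with the system model, every candidate partitioning or regenerated implementation is judged by the same yardstick.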
Mixing Levels of Abstraction for Accuracy and Completeness
Development rarely proceeds linearly from high-level abstractions to ever more detailed representations of the entire design. Model-Based Design facilitates mixing levels of detail at will, which streamlines adding real-world detail to the functional simulation. The appropriate abstraction level can be chosen for each block as needed, maintaining very fast system-level simulation speeds while exercising the block under test with enough detail to verify functionality and prove out implementation choices in the context of the entire system.
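Per-block abstraction selection can be sketched as a table of interchangeable models, with the detailed model chosen only for the block under scrutiny. This is an illustrative Python stand-in, not any tool's mechanism; the ADC and DSP models and their behaviors are hypothetical.

```python
# Each block offers models at different abstraction levels; the simulation
# run picks a level per block, defaulting to the fast behavioral model.

def adc_fast(xs):
    """Ideal behavioral ADC: pass-through."""
    return list(xs)

def adc_detailed(xs):
    """Detailed ADC model: 8-bit quantization with full-scale clipping."""
    return [max(-1.0, min(1.0, round(x * 128) / 128)) for x in xs]

def dsp_fast(xs):
    """Floating-point model of the digital back end (2-tap average)."""
    prev = [0.0] + xs[:-1]
    return [0.5 * (a + b) for a, b in zip(xs, prev)]

MODELS = {
    "adc": {"fast": adc_fast, "detailed": adc_detailed},
    "dsp": {"fast": dsp_fast},
}

def simulate(levels, stimulus):
    """Run the block chain, choosing each block's abstraction level."""
    data = stimulus
    for name in ("adc", "dsp"):
        model = MODELS[name][levels.get(name, "fast")]
        data = model(data)
    return data

stimulus = [0.3, -0.7, 0.9, 0.2, -0.1, 0.6, -0.4, 0.8]
quick   = simulate({}, stimulus)                      # all-fast: fastest full-system run
focused = simulate({"adc": "detailed"}, stimulus)     # detail only where it matters
```

Comparing the two runs isolates the effect of the ADC's real-world impairments on the rest of the system, without slowing every block down to the detailed level.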
In their paper “DVB-T System Analysis using System Level Co-simulation,” the authors describe how the co-simulation interface between MATLAB, Simulink, and Cadence Virtuoso AMS Designer offers the system designer a wider choice of abstraction levels to mix, integrating the system design with Virtuoso AMS Designer mixed-signal simulation while providing fast turnaround cycles and design insight. They also describe the difficulty inherent in modern design flows that do not use Model-Based Design: “Each tool is optimized for a specific level of abstraction and application area. Even though there is a certain range of overlap between the tools, difficulties arise when effects need to be analyzed that span different levels.”
Model-Based Design ROI Multiplies When Applied Across Teams and Projects
The ROI of Model-Based Design increases significantly when the specialized skills of each team member are leveraged by other teams during component and system verification. This leveraging occurs both downstream, toward implementation and final verification, and upstream, to enhance earlier design exploration; ultimately it improves understanding at all stages of development.

The collaborative use and reuse of a common system model prevents gaps in the functional verification process, improves the quality of the final product, reduces duplicated and wasted effort, and improves the entire development process. Model-Based Design can be adopted incrementally, with the maximum benefits realized as all of its elements are adopted across the entire project. In this way Model-Based Design emerges as an investment that continues to pay benefits in all aspects of complex system development.
References
“Dataquest to EDA: ‘It’s the software, stupid’,” Rich Goering, EE Times, July 24, 2006.
“Next-Generation Design Issues in Communications,” Bruce Fette, Mieczyslaw M. Kokar, and Mark Cummings, Portable Design, March 2008, p. 20.
“Hardware/Software Co-design Methodology and DSP/FPGA Partitioning: A Case Study for Meeting Real-Time Processing Deadlines in 3.5G Mobile Receivers,” Michael Brogioli, Predrag Radosavljevic, and Joseph R. Cavallaro, Rice University. http://www.ece.rice.edu/~rpredrag/doc/MWCAS.pdf
“DVB-T System Analysis using System Level Co-simulation,” Hans-Werner Groh (Atmel Germany GmbH), Walter Hartong (Cadence Design Systems GmbH), and Uwe Eichler (Fraunhofer Institute for Integrated Circuits), CDNLive! conference paper, Munich, Germany, April 28–30, 2008.
The MathWorks, Inc.
This article originally appeared in the May, 2008 issue of Portable Design. Reprinted with permission.