Full Chip Simulation

I'd like to continue my thoughts on simulators for mixed-signal evaluation, a topic we discussed last week.

Each piece of a design can be verified separately, but there comes a time when the whole thing has to be assembled. This is often the first time it's possible to see if the specification was correct. End-to-end user scenarios must be shown to operate correctly, and this is often when the biggest problems arise for mixed-signal verification. I can think of four possible ways to do this. Each has some advantages and disadvantages. Unfortunately, there is no option that addresses all possible needs or requires no additional effort.

Lowest common denominator
The entire circuit could be modeled at the transistor level and simulated using a SPICE-type program. It may be able to detect connectivity issues, and getting the transistor-level model is a fairly automated process. The biggest problem is simulation performance. Even a simple run would take a long time, and end-to-end user scenarios are probably impossible. In addition, SPICE simulators may not be able to handle the capacity. This option is probably a nonstarter for us.

Highest denominator
Assuming that all the analog circuitry could be modeled at the gate level, it would be possible to perform a much faster simulation. Logic simulators struggle with full-chip simulations, but it could be done. It would also be possible to migrate the design onto an emulator or FPGA prototype. However, the accuracy would be very low, and it is unclear if this would accomplish much.

Mixed abstraction
This would seem like the logical choice. Leave everything in the abstraction for which it was designed, and use a mixed-signal simulator. The problem here is Amdahl's Law, which basically says the whole thing will be brought down to the speed of the slowest component — in this case, the analog simulator. Also, as far as I know, most SPICE simulators are not good at simulating multiple independent pieces of a design at the same time. They want to solve them all as a single set of equations, which makes the whole thing a lot more complex than it needs to be.
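
To put a rough number on it (a hypothetical illustration, not a measurement): Amdahl's Law says that if a fraction $p$ of the runtime can be sped up by a factor $s$, the overall speedup is

    $$ S = \frac{1}{(1 - p) + p/s} $$

If the SPICE portion accounts for 95% of the wall-clock time, so only the digital 5% can be accelerated ($p = 0.05$), then even an infinitely fast logic engine yields $S = 1/0.95 \approx 1.05\times$. The analog solver sets the ceiling.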

Behavioral analog
Some people have been singing the praises of behavioral analog for a long time. Using it within a digital, event-driven simulator would make a lot of sense. This also enables higher levels of abstraction for the digital logic, improving the overall simulation performance. It seems like the perfect solution — except for a few small issues.
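
To make this concrete, here is a minimal sketch of what such a model might look like in Verilog-A: a comparator reduced to its intent (thresholds, delay, slew) rather than its transistors. The module name and parameter values are mine, purely for illustration:

    `include "disciplines.vams"

    module bhv_comparator(inp, inn, out);
      input inp, inn;
      output out;
      electrical inp, inn, out;

      // Illustrative values, not from any real design:
      parameter real vhi = 1.8;      // output high level (V)
      parameter real vlo = 0.0;      // output low level (V)
      parameter real td  = 1e-9;     // propagation delay (s)
      parameter real tt  = 100e-12;  // output rise/fall time (s)

      real vout;

      analog begin
        // Force a simulation time point exactly where the inputs
        // cross, so the event is not stepped over.
        @(cross(V(inp) - V(inn), 0))
          ;
        // Pick the logical output level...
        vout = (V(inp) > V(inn)) ? vhi : vlo;
        // ...and drive it with finite delay and slew. No transistors,
        // no operating-point iteration, so it simulates cheaply.
        V(out) <+ transition(vout, td, tt);
      end
    endmodule

A model like this runs orders of magnitude faster than its transistor-level equivalent, which is exactly the appeal — and exactly why the questions below matter.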

The first is modeling. Who will create these models, given that they are used only for system verification and never become an integral part of the development process? What skills are required of the designers making the models? They must understand both analog design and high-level modeling languages. And how are these models verified? At some point, they have to be compared to the low-level models, and small differences between the models can quickly produce different results. This makes comparison a difficult task.
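
One way to mechanize that comparison is a back-to-back run with a tolerance monitor. Below is a minimal sketch in the same vein; the module name and tolerance value are hypothetical, and a real flow would derive the tolerance from the spec:

    `include "disciplines.vams"

    module model_diff_checker(ref_out, bhv_out);
      input ref_out, bhv_out;
      electrical ref_out, bhv_out;

      parameter real tol = 10e-3;  // allowed deviation (V), assumed

      analog begin
        // Flag the moment the behavioral output drifts more than tol
        // away from the reference (e.g., transistor-level) output.
        @(cross(abs(V(ref_out) - V(bhv_out)) - tol, +1))
          $strobe("Models diverged beyond %g V at t = %g s", tol, $abstime);
      end
    endmodule

Even with such a monitor, choosing the tolerance is itself a judgment call, which is part of what makes the comparison difficult.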

Companies have to weigh the costs and benefits, but nobody seems to think an ideal solution has been found. Mixed-signal languages continue to be developed. Last week, Accellera released the latest version of the SystemC-AMS standard — something I will write about in an upcoming blog.

What methods are you using, and what do you see as the biggest advantages and disadvantages?

17 comments on “Full Chip Simulation”

  1. DEREK.KOONCE
    March 25, 2013

    When modeling some old circuitry, I tend to input the entire circuit, then break it down and save it in parts. This allows me to look at individual sections and generate a high-level model for each sub-section. That high-level model is then used to test the next section. Each break-down test is performed to make sure the model pieces are working.

    In the end, I go back to the original, run it, and wait for a long period of time.

    Overall, it is just to verify test results in the old procedures that do not make sense. Thus my efforts are more in the realm of reverse-engineering work.

  2. amrutah
    March 25, 2013

    Brian, Thanks for bringing up this topic.

       Most chips are getting more complex (in the number of devices and nodes, too), with both digital and analog components sitting on the same chip. Having Verilog-A or SystemC alongside SPICE would speed things up. Another major requirement is for the simulation tool to handle parallel processing of the circuits.

  3. Brad Albing
    March 27, 2013

    For most analog engineers (at least the ones I know) this analysis would be beyond their reach or capability.

  4. Brad Albing
    March 27, 2013

    So we need a 1000X increase, but the best we can get is a 10X? Looks like real progress is a long way off.

  5. Brad Albing
    March 27, 2013

    Hmm… OK, makes sense. Oh, wait – unsaid in your comment is whether or not you get good results w/ this process.

  6. amrutah
    March 27, 2013

    @Scott: You are right that having a set of toolboxes or libraries of cells would help speed up and improve full-chip simulations, but the problem is that every chip, especially an analog one, will have a different spec, so a completely custom design is necessary. Further, even if the toolboxes or library cells are designed, their specifications have to be so wide that they can fit any application.

  7. Davidled
    April 28, 2013

    Mainly, I think that chip designers might need this type of simulation tool for final chip validation. For application engineers, for example, PSpice pulls the components for a given chip from a library and simulates all or part of the circuit. Second, if a wireless circuit or chip is involved in the system, a “VSS system simulator” might be one of the simulator tools. Such a tool covers radio and circuit design, plus baseband signal processing with a digital fixed-point implementation, without going to an actual test field.

  8. SunitaT
    April 30, 2013

    With the behavioral analog approach, it is important to have a verification plan in order to address the mixed-signal problem at the SoC level. Verification planning is the process of using the spec to define what to check, not looking at the design and defining how to check. By planning what to check, we can cover all the features that are expected at the SoC level and not just what's designed into the block.

  9. SunitaT
    April 30, 2013

     Who will create these models, given that they are used only for system verification and never become an integral part of the development process?

    A separate verification team is one option, where the verification team works in parallel with the design team. Both teams develop their respective blocks from the specs, and these are tested using common testbenches. In this way, a lot of design issues will be caught at the initial stage of design development.

  10. BrianBailey
    April 30, 2013

    This is true, but it is still reported as being one of the most difficult aspects of verification. Specify too much to verify and you are wasting time and resources. Specify too little, or the wrong things, and bugs may escape. I have heard many people say that this is why verification is an art and not a science.

  11. BrianBailey
    April 30, 2013

    If we are not careful, we can get into a circular problem. If the verification team creates a behavioral model, then they are indeed making an independent interpretation of the spec. But how is that model verified? It can only be verified by comparing it against the design model, or by having yet another model that determines whether its functionality is correct. Now, I am not trying to be pedantic – this is always a problem with verification. It requires two models, and both of them are in effect being verified at the same time. In a top-down flow, the behavioral model from which an implementation is being derived cannot be used as the second model. If we compare two models that share a single derivation, then we are only doing equivalence checking, not verifying that either of them is correct.

  12. SunitaT
    April 30, 2013

    I have heard many people say that this is why verification is an art and not a science.

    @Brian, I totally agree with you; being a verification engineer is very tricky. Just knowing the technical aspects of the circuit will not help the verification engineer. Sometimes they have to come up with out-of-the-box solutions to debug the circuit.

  13. SunitaT
    April 30, 2013

    From an analog perspective, engineers have been doing mixed-signal design for years; but these days neither analog nor digital engineers are completely prepared to enter each other's areas of expertise. Analog engineers find it difficult to write behavioural models, and digital engineers find it hard to understand analog concepts. So getting functional verification done takes a lot of coordination between the analog and digital teams.

  14. SunitaT
    April 30, 2013

    @Brian, nowadays tools provide behavioural-model extraction from the design. Even then, such models cannot be used directly, as there are many analog issues, such as resistor/capacitor trimming and leakage current, that need to be taken care of.

  15. Brad Albing
    May 2, 2013

    >>Sometimes they have to come up with out-of-the-box solutions to debug the circuit.

    Any examples of this?

  16. SunitaT
    May 2, 2013

    When a circuit operation fails, we start analyzing it across all corner cases to understand the reason for the failure. If the reason is a faulty component, then we immediately replace it. If the circuit still fails, we analyze it again from scratch. But what if the replacement component is also faulty? That is one possibility too. So instead of wasting time re-analyzing, it is better to check the device before replacing it.

  17. amrutah
    May 14, 2013

    SunitaT: “Sometimes they have to come up with out-of-the-box solutions to debug the circuit”

      I agree with your point, but more important is the simulation-plan coverage. Covering the complete set of test cases for a platform is itself difficult. For simulation of the chip, creating the models for the board and the real-case analog input vectors is complex and time consuming.

     Finding leakage currents on the order of μA in one part of the circuit while another part is functional is complicated and can easily be missed. The engineer has to come up with a different set of simulations for leakage analysis.
