The Designer's Guide Community Forum
https://designers-guide.org/forum/YaBB.pl
Simulators >> AMS Simulators >> Mixed Signal Verification Methodology using dfII
https://designers-guide.org/forum/YaBB.pl?num=1164839122

Message started by Peruzzi on Nov 29th, 2006, 2:25pm

Title: Mixed Signal Verification Methodology using dfII
Post by Peruzzi on Nov 29th, 2006, 2:25pm

I use Cadence's AMS simulator (ncsim -ams), which can be run from the Cadence hierarchy editor (CHED), from the Cadence Analog Design Environment (ADE), by executing the ncelab and ncsim commands standalone, or from a script such as the runCompileElabSim script that the CHED can generate.

Here's what I envision for a test bench:

TB (top-level testbench) is a schematic that instantiates two symbols: DM and SOCKET.
 DM (driver and monitor) is a Verilog-AMS model that drives all input signals and monitors all outputs.
 SOCKET is a schematic that instantiates the DUT and symbols for the peripheral components.
   Depending on the configuration view, the peripheral components may resolve to schematics (matching networks, etc.) or to behavioral models.
   DUT (Device Under Test) has multiple configurations mixing schematics, behavioral models, and RTL or structural Verilog digital blocks.  (I sketch this hierarchy in Verilog-AMS just below.)
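
To make that concrete, here is a rough Verilog-AMS sketch of how I picture the three levels wiring together.  All module, instance, and signal names (TB, DM, SOCKET, DUT, MATCH, ana_in, and so on) are placeholders of mine, not the actual design; in practice TB and SOCKET exist as dfII schematics, and DM, DUT, and MATCH bind to whichever views the CHED configuration selects.

  // Placeholder sketch of the TB / DM / SOCKET hierarchy described above.
  `include "disciplines.vams"

  module TB;                       // top-level testbench (a schematic in dfII)
    electrical ana_in, ana_out;    // analog stimulus and response
    wire       clk, rst, dig_out;  // digital control and monitored output

    // driver/monitor: drives all inputs, checks all outputs (Verilog-AMS model)
    DM     i_dm     (.ana_in(ana_in), .ana_out(ana_out),
                     .clk(clk), .rst(rst), .dig_out(dig_out));

    // socket: DUT plus peripheral components (matching networks, loads, ...)
    SOCKET i_socket (.ana_in(ana_in), .ana_out(ana_out),
                     .clk(clk), .rst(rst), .dig_out(dig_out));
  endmodule

  module SOCKET (ana_in, ana_out, clk, rst, dig_out);
    inout  ana_in, ana_out;
    input  clk, rst;
    output dig_out;
    electrical ana_in, ana_out;
    electrical matched;            // node between matching network and DUT

    // MATCH and DUT resolve to schematic or behavioral views per the config
    MATCH i_match (.in(ana_in), .out(matched));
    DUT   i_dut   (.ain(matched), .aout(ana_out),
                   .clk(clk), .rst(rst), .dout(dig_out));
  endmodule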

The design team of course creates and maintains the DUT, including schematics of analog circuits and RTL or structural Verilog code for digital circuits.  Behavioral model writers (possibly, but not necessarily, the block designers) create and maintain the Verilog-AMS models for analog circuits within the DUT.  One or more people create and maintain the TB, including the TB and SOCKET schematics, the SOCKET-level peripheral schematics and models, and the DM model "shell".

An individual TEST consists of a configuration and DM code.  The DM model includes the module declaration, the I/O list, and code that is common to all tests.  Additionally, the DM model uses `include statements to pull in external code for each unique test: digital vectors, analog signal parameters, and measurement comparison and pass/fail routines for the monitored signals.
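
As a rough illustration, a DM "shell" along those lines might look like the sketch below.  This is only my guess at the structure, not an actual model: the port list, the amp/freq stimulus variables, the report_results task, and the included file name test_001_stim.vams are all hypothetical.

  // My rough guess at a DM "shell"; all names below are placeholders.
  `include "disciplines.vams"
  `include "constants.vams"
  `timescale 1ns/1ps

  module DM (ana_in, ana_out, clk, rst, dig_out);
    inout  ana_in, ana_out;
    output clk, rst;
    input  dig_out;
    electrical ana_in, ana_out;

    reg     clk, rst;
    integer errors;               // pass/fail bookkeeping shared by all tests
    real    amp, freq;            // analog stimulus parameters set by the test

    // common digital housekeeping: clock, reset, error counter
    initial begin
      errors = 0;
      clk = 1'b0;
      rst = 1'b1;
      #100 rst = 1'b0;
    end
    always #10 clk = ~clk;

    // common analog drive: a sinusoid whose parameters the test chooses
    analog V(ana_in) <+ amp * sin(`M_TWO_PI * freq * $abstime);

    // common end-of-test reporting, called by the per-test code
    task report_results;
      begin
        if (errors == 0) $display("TEST PASSED");
        else             $display("TEST FAILED: %0d errors", errors);
      end
    endtask

    // per-test code pulled in at compile time: digital vectors, values for
    // amp/freq, measurements and pass/fail checks on ana_out and dig_out
    `include "test_001_stim.vams"   // hypothetical file name

  endmodule

The idea is that everything test-independent lives in the shell, and everything test-specific arrives through the single `include at the bottom.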

A larger group of verification engineers writes the custom DM code and runs and debugs the simulations.

All of the above is similar to standard digital verification approaches.  I've seen such digital approaches adapted to use Verilog-AMS models and netlists generated from schematics.  But in every case, the first step is to take all the models and netlists out of the dfII structure and put them into a traditional digital design hierarchy.  That forfeits the benefits of dfII, including managing multiple configurations from the CHED.  I would prefer to embrace the dfII structure and keep everything aligned with standard Cadence procedures.

We're finally at the bottom line.

Has anyone created a mixed signal verification methodology like the one I described, which embraces dfII, and are you willing to share it?

Would anyone like to collaborate with me on such a verification methodology?  Collaboration is strictly voluntary, and maybe we can publish a BMAS or CICC paper from our work.

Thanks!

Bob P.

Title: Re: Mixed Signal Verification Methodology using dfII
Post by jbdavid on Dec 3rd, 2006, 8:42pm

Hi Bob,
Welcome to the Designers Guide!
I'd love to have a collaborator..

Jonathan


Title: Re: Mixed Signal Verification Methodology using dfII
Post by Ken Kundert on Dec 4th, 2006, 7:42am

Bob,
   You might want to take a look at http://www.designers-guide.com/docs/cicc06-dgc.pdf, which was published at CICC. This is also what I do for a living now. Transitioning a design team to this methodology is a big job; if you feel you need some help, give us a call.

-Ken

Title: Re: Mixed Signal Verification Methodology using dfII
Post by Peruzzi on Dec 5th, 2006, 12:56pm

Hello Jonathan,

Thanks for responding.  First thing I'll do is read Ken's paper, then get back to you.

Bob



Title: Re: Mixed Signal Verification Methodology using dfII
Post by Peruzzi on Dec 5th, 2006, 1:02pm

Ken,

Thanks for the reference paper.  Reading it will be my starting point.

Luckily (I hope), I have a clean slate to work with in my design group.  I'm not transitioning them from an existing methodology, but building up from zero, so we don't have much baggage carried over from a digital methodology.  If I get stuck, I'll keep your professional services in mind.

Thanks,

Bob



Title: Re: Mixed Signal Verification Methodology using dfII
Post by Peruzzi on Dec 6th, 2006, 12:19pm

Jonathan, Ken,

As a matter of fact, I was at Ken's CICC presentation of his paper, and I used it as the basis of my self-checking test bench.  I implemented it in Cadence schematics.  The top level is the test bench, which instantiates a driver/monitor and the socket.  (I prefer naming it socket rather than collar as in Ken's paper, but the socket works exactly the same as Ken's collar, delivering internal nodes from the DUT to the driver/monitor at the test bench level.)

The socket instantiates the DUT and some matching circuitry.  The setup works with several configurations of model levels, including Verilog-D models that use real-valued versions of the analog I/O.  I've run it from AMS-in-ADE, from the hierarchy editor, and from the Unix command line using a generated runCompileElabSim script.
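
For anyone unfamiliar with that last configuration, the sketch below shows roughly what I mean by a Verilog-D view with real-valued analog I/O, carried on Cadence wreal nets.  The module name, ports, and the "gain of two" behavior are invented purely for illustration.

  // Hypothetical Verilog-D view of an analog block with real-valued I/O.
  module DUT_rnm (ain, aout, clk, rst, dout);
    input  clk, rst;
    output dout;
    input  ain;                 // analog input, modeled as a real value
    output aout;                // analog output, modeled as a real value
    wreal  ain;
    wreal  aout;

    real   aout_r;
    reg    dout;
    assign aout = aout_r;       // drive the wreal output from a real variable

    // toy behavior: sample the input on the clock, flag a 0.5 V crossing
    always @(posedge clk or posedge rst) begin
      if (rst) begin
        aout_r = 0.0;
        dout   = 1'b0;
      end else begin
        aout_r = 2.0 * ain;     // placeholder "gain of two" behavior
        dout   = (ain > 0.5);
      end
    end
  endmodule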

So, that is my starting point.

Now, I want to devise a text-based working environment for digitally oriented test writers and verifiers (TWV).

Ideally, the TWV types something like "build_database" at the command line, without ever executing icfb.  This populates his or her local directory with exactly the necessary directories and files, and sets the necessary Unix environment variables.  The TWV then edits the file that gets `include'd into the verilog.vams file in the verilogams view of the driver/monitor.

This included file selects one of the (previously compiled) DUT and SOCKET configurations, customizes the analog and digital inputs, and performs the measurements and pass/fail testing.  The TWV may start by modifying an example include file.
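
An example include file along these lines might look like the sketch below, assuming the hypothetical DM shell sketched earlier (its amp/freq stimulus variables, errors counter, and report_results task).  I've left configuration selection as a comment, since in practice that would be handled by the chosen config view or the run script rather than by the included Verilog-AMS code.

  // Hypothetical contents of test_001_stim.vams, `include'd into the DM shell.
  // (Configuration selection is assumed to happen in the run script / config
  // view; this file only supplies the test-specific stimulus and checks.)

  initial begin : test_001
    // analog stimulus parameters read by the DM's analog block
    amp  = 0.5;                  // 0.5 V sine amplitude
    freq = 1.0e6;                // 1 MHz

    // digital vectors
    @(negedge rst);
    repeat (8) @(posedge clk);

    // measurement comparison / pass-fail check on a monitored output
    if (dig_out !== 1'b1) begin
      errors = errors + 1;
      $display("test_001: dig_out expected 1, got %b at %0t", dig_out, $time);
    end

    report_results;              // prints TEST PASSED / TEST FAILED
    $finish;
  end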

Then, the TWV types something like "run_simulation_1" at the command line, and waits for pass/fail results and a test_data file.

Finally, a collection of simulations becomes a regression suite: run_simulation_1, run_simulation_2, etc.

Once again, there's nothing new in this approach.  Digital folks do it every day.

The innovation here is to do so while closely embracing Cadence conventions such as the dfII structure.

Jonathan is now familiar with my company's approach, and says he's looking for a collaborator too.  Ken is doing this for a living, and I don't expect him to join in for free.

Jonathan, what do you say we take the discussion offline from the forum until we have results to share?  Ken is welcome of course, but I can't even ask for the money to pay him.




