Tags: systemc, digital-logic

How to simulate output delay using next_trigger() in SystemC?


I have been reading this upvoted answer on Stack Overflow: https://stackoverflow.com/a/26129960/12311164

It says that replacing wait(delay, units); in an SC_THREAD with next_trigger(delay, units) in an SC_METHOD works.

But when I tried it, it does not work. I am trying to build an adder module with a 2 ns output delay. Instead of the output being delayed by 2 ns, the adder output is getting updated every 2 ns.

Design:

#include "systemc.h"

#define WIDTH  4

SC_MODULE(adder) {
  sc_in<sc_uint<WIDTH> > A, B;  
  sc_out<sc_uint<WIDTH> > OUT;
 
  void add() {
    int intermediate = A.read() + B.read();
    next_trigger(2, SC_NS);   // intended: delay the OUT.write below by 2 ns
    OUT.write(intermediate);  // but this executes now, not 2 ns later
    cout << " SC_METHOD add triggered at " << sc_time_stamp() << endl;
  }

  SC_CTOR(adder){
    SC_METHOD(add);
    sensitive << A << B;  
  }
};

I know how to simulate delay using two techniques: an sc_event with an SC_METHOD, and a wait statement in an SC_THREAD (see the sketch below). But I would like to simulate the delay using next_trigger(). I have read the Language Reference Manual but could not figure out how to do it.
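
For reference, this is the SC_THREAD technique I already know works; a minimal sketch reusing the ports from my design above (the name add_thread is just illustrative):

void add_thread() {
  while (true) {
    wait();                                    // wait for A or B to change
    sc_uint<WIDTH> sum = A.read() + B.read();  // sample the inputs now
    wait(2, SC_NS);                            // model the 2 ns output delay
    OUT.write(sum);                            // drive the delayed result
  }
}

// registered in the constructor with:
//   SC_THREAD(add_thread);
//   sensitive << A << B;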

Simulated on EDA Playground here: https://edaplayground.com/x/dFzc

I think I need to trigger 2 ns after the inputs change. How do I do that?


Solution

  • You will have to track state manually:

    sc_uint<WIDTH> intermediate;   // state carried between triggers
    
    void add(){
        if (A->event() || B->event() || sc_delta_count() == 0) {
            // Triggered by an input change (or by the initial call at
            // time zero): capture the sum and re-trigger 2 ns later.
            intermediate = A.read() + B.read();
            next_trigger(2, SC_NS);
        } else {
            // Triggered by the 2 ns timeout: drive the delayed result.
            OUT->write(intermediate);
        }
    }
    

    The problem is that using next_trigger doesn't magically transform your SC_METHOD into an SC_THREAD: the method still runs to completion on every trigger, so it has to check why it was triggered, as the if/else above does. In general, I find any usage of next_trigger inconvenient, and there are better ways of doing this using sc_event, sketched below.
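
    For example, here is a minimal sketch of the sc_event approach (the event name write_event, the compute/drive split, and the method names are my own choices, not part of the original design): one method captures the sum whenever the inputs change and schedules a timed notification; a second method, sensitive only to the event, drives the output 2 ns later.

    sc_event write_event;          // fires 2 ns after an input change
    sc_uint<WIDTH> intermediate;   // sum captured when the inputs change
    
    void compute() {
        intermediate = A.read() + B.read();
        write_event.cancel();            // drop any pending notification
        write_event.notify(2, SC_NS);    // schedule the delayed write
    }
    
    void drive() {
        OUT.write(intermediate);
    }
    
    SC_CTOR(adder) {
        SC_METHOD(compute);
        sensitive << A << B;
        SC_METHOD(drive);
        sensitive << write_event;
        dont_initialize();   // applies to drive: no spurious write at time zero
    }

    The cancel() before notify() makes the delay inertial: a new input change within the 2 ns window reschedules the write instead of being silently ignored, since a later timed notification does not override an earlier pending one.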