Friday, 14 June 2019

xilinx - Modelling delays in Verilog code?


I am using the Xilinx ISE Design Suite to simulate my digital circuit. I want to model delays in each individual element of my combinational circuit during simulation, but I don't want to explicitly add delays to my module, as that would bias my circuit. Is there a way to do this?



Answer



There are at least three kinds of simulation in FPGA design.


The first kind is behavioral simulation, the fastest one. Here you really should add delays explicitly, but only to non-blocking assignments, to represent the physically existing "clock-to-Q" propagation delay of flip-flops. The delay value should reflect typical FPGA registers; 20-50 ps is enough to get the circuit to behave correctly. Combinational logic doesn't need pre-emptive delays. As long as the propagation delays are a reasonable fit for the silicon technology in use, they won't "bias your circuit". Some compilers might do this job for you, but I always prefer explicit delays.
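A minimal sketch of that guideline, assuming a 1 ns / 1 ps timescale and the 50 ps figure from the range above (module and signal names are illustrative):

    `timescale 1ns / 1ps

    // Register stage with an explicit clock-to-Q delay on the non-blocking
    // assignment. The 50 ps value is an assumed typical figure, not a number
    // taken from a specific device's datasheet.
    module dff_with_tco (
        input  wire clk,
        input  wire d,
        output reg  q
    );
        always @(posedge clk)
            q <= #0.05 d;   // 0.05 ns = 50 ps models the flip-flop clock-to-Q time
    endmodule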


The second kind of simulation is performed after your logic has been mapped into CLBs, which have known basic delays, so you don't need to change anything. Your pre-inserted delays will be ignored, so they do no harm.


The third round of simulation is run after place and route. The tool now knows all delays almost exactly and back-annotates them into your netlist. Again, your artificial explicit delays will be replaced with real delays.
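If you drive the post-place-and-route netlist from your own testbench, the extracted delays are commonly supplied as an SDF file and applied with the standard $sdf_annotate system task. A hedged sketch, with placeholder file, module and instance names:

    `timescale 1ns / 1ps

    // Gate-level timing testbench sketch. "routed_design.sdf", "my_design" and
    // "uut" are placeholders; use the netlist and SDF file your tools generate.
    module tb_timing;
        reg  clk = 1'b0;
        reg  d   = 1'b0;
        wire q;

        my_design uut (.clk(clk), .d(d), .q(q));   // post-PAR netlist instance

        initial begin
            // Back-annotate the extracted cell and routing delays onto "uut"
            $sdf_annotate("routed_design.sdf", uut);
        end

        always #5 clk = ~clk;   // 100 MHz stimulus clock
    endmodule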


As you can see, the explicit delays in non-blocking assignments are harmless, but without them your design might behave incorrectly in simulation. You might be tempted to tweak the design until it matches your expectations of its behavior, only to have it fall apart completely in the later stages of the compile process.
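To make the point concrete, here is an assumed small pipeline written along the same lines: the registers get the same explicit clock-to-Q delay, while the combinational path between them gets none, so the behavioral waveform already resembles what the later timing simulations will show.

    `timescale 1ns / 1ps

    // Illustrative two-stage pipeline (names are made up for this sketch)
    module pipe_example (
        input  wire       clk,
        input  wire [7:0] a,
        input  wire [7:0] b,
        output reg  [7:0] stage1,
        output reg  [7:0] stage2
    );
        wire [7:0] xored = a ^ b;      // combinational logic: no explicit delay

        always @(posedge clk) begin
            stage1 <= #0.05 xored;     // same 50 ps clock-to-Q figure as above
            stage2 <= #0.05 stage1;
        end
    endmodule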

