Use of virtual Clock in SDC

  • Tue, Apr 1 2008 9:45 AM

    • archive
    Use of virtual Clock in SDC

    While working on a core-level physical design, do the I/O delays in the SDC file need to be modelled with respect to a virtual clock that represents the top-level clock? And if I have a hierarchical clock in the design which, in the top-level chip, would connect to the top-level clock, can the I/O delays instead be modelled with respect to that hierarchical clock, which has physical existence in the core-level design?


    Originally posted in cdnusers.org by karankoti@in.ibm.com
  • Tue, Apr 29 2008 7:49 AM

    • archive
    RE: Use of virtual Clock in SDC

    Conceptually, a virtual clock is any clock that has no sinks within the block you're working on. So when you want to model I/O delays relative to a top-level clock that is not present in the block, a virtual clock is a great way to model this.

    If instead the clock exists at the top level -and- has sinks within the block you're working on, you can define your I/O delays relative to that clock, and it would *not* be virtual. However, even in this second scenario it is sometimes advantageous to model the I/O delays relative to a virtual representation of the clock, because that gives you the flexibility of defining the virtual clock's latency with a single statement in your SDCs. If you instead model the I/O delays against the real (non-virtual) clock, the I/O clock latency is determined by the insertion delay of the clock tree as observed within the block.

    Optionally, you can fold the source latency into the I/O delay values themselves, but then your I/O timings are locked to a pre-determined latency value that is hard to adjust later, since changing it requires updating each and every I/O delay value.
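    For illustration, here is a minimal SDC sketch of the virtual-clock approach; the clock name, period, latency, delay values, and port names below are all placeholders:

    # Virtual clock: created without a source object, so it has no sinks
    # in this block; it stands in for the top-level clock.
    create_clock -name top_vclk -period 10.0

    # Model the external clock insertion delay. Adjusting it later is a
    # one-line change, rather than an edit to every I/O delay value.
    set_clock_latency -source 1.2 [get_clocks top_vclk]

    # I/O delays constrained against the virtual clock.
    set_input_delay  -clock top_vclk -max 2.5 [get_ports data_in]
    set_output_delay -clock top_vclk -max 3.0 [get_ports data_out]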

    Hope this helps,
    Bob


    Originally posted in cdnusers.org by BobD