dc.contributor.advisor | Mathias, P C | |
dc.contributor.author | Srinivasan, G | |
dc.date.accessioned | 2010-06-01T07:07:05Z | |
dc.date.accessioned | 2018-07-31T06:03:50Z | |
dc.date.available | 2010-06-01T07:07:05Z | |
dc.date.available | 2018-07-31T06:03:50Z | |
dc.date.issued | 2010-06-01 | |
dc.date.submitted | 2007 | |
dc.identifier.uri | https://etd.iisc.ac.in/handle/2005/692 | |
dc.description.abstract | Power supply noise, the variation in the supply voltage across the on-die supply terminals of VLSI circuits, seriously degrades the performance of both digital and mixed analog-digital circuits. In digital VLSI systems, power supply noise causes timing errors such as increased delay, jitter, and false switching. In microprocessors, it reduces the maximum operating frequency (FMAX) of the CPU. In mixed analog-digital circuits, it manifests as substrate noise and impairs the performance of the analog portion. As transistor feature sizes shrink, the available noise margin in CMOS systems decreases, making power supply noise an increasingly serious issue and demanding new methods to reduce it in sub-micron CMOS systems.
In this thesis, we develop a new method to determine the optimal time-delays between the switching of input/output (I/O) data buffers in digital VLSI systems that achieve the maximum reduction in power supply noise. We first discuss methods to characterize the distributed nature of the Power Delivery Network (PDN) in the frequency domain. We then develop an analytical method to determine the optimal delays from the frequency-domain response of the PDN and the supply-current spectrum of the buffer units, and we explain the mechanism by which optimal buffer-to-buffer delays cancel the power supply noise. We also develop a numerical method to determine the optimal delays and compare it with the analytical method. We illustrate the reduction in power supply noise by applying the optimal time-delays determined by our methods to two example PDNs.
Our method has great potential to achieve the maximum reduction of power supply noise in digital VLSI circuits and of substrate noise in mixed analog-digital VLSI circuits. Lower power supply noise translates into lower cost and improved circuit performance. | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartofseries | G21109 | en_US |
dc.subject | Electric Power System | en_US |
dc.subject | VLSI Circuits | en_US |
dc.subject | Power Distribution Network (PDN) | en_US |
dc.subject | Integrated Circuits - Very Large Scale Integration | en_US |
dc.subject | VLSI Systems - Noise Reduction | en_US |
dc.subject | Power Supply Noise | en_US |
dc.subject.classification | Electrical Engineering | en_US |
dc.title | A New Method To Determine Optimal Time-Delays Between Switching Of Digital VLSI Circuits To Minimize Power Supply Noise | en_US |
dc.type | Thesis | en_US |
dc.degree.name | MSc Engg | en_US |
dc.degree.level | Masters | en_US |
dc.degree.discipline | Faculty of Engineering | en_US |
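The abstract above describes computing optimal buffer-to-buffer switching delays from the frequency-domain response of the PDN and the supply-current spectrum of the buffers, so that the individually delayed current spectra partially cancel where the PDN impedance is high. The Python sketch below illustrates that kind of formulation only in outline: the single-resonance impedance Z(f), the sinc-shaped current spectrum I(f), the 5 ns delay range, and the random search over delay vectors are all hypothetical stand-ins chosen for illustration, not the analytical or numerical methods developed in the thesis.

import numpy as np

def noise_spectrum(freqs, Z, I, delays):
    """Supply-noise spectrum V(f) = Z(f) * sum_k I(f) * exp(-j*2*pi*f*tau_k).
    A time-delay tau_k applied to buffer group k enters as a phase factor
    (time-shift property of the Fourier transform)."""
    phase = np.exp(-2j * np.pi * np.outer(delays, freqs))  # one row per buffer group
    total_current = (I * phase).sum(axis=0)                # superposed current spectrum
    return Z * total_current

def peak_noise(freqs, Z, I, delays):
    """Worst-case (peak) noise magnitude across the analysis band."""
    return np.abs(noise_spectrum(freqs, Z, I, delays)).max()

# --- toy example; all numbers below are made up for illustration ---
freqs = np.linspace(1e6, 1e9, 2000)                        # 1 MHz .. 1 GHz
f0, R, Q = 100e6, 0.05, 10                                 # resonance, ESR, quality factor
Z = R * Q / (1 + 1j * Q * (freqs / f0 - f0 / freqs))       # simple resonant PDN impedance
I = np.sinc(freqs * 1e-9)                                  # crude buffer-current spectrum

n_buffers = 4
rng = np.random.default_rng(0)

# Simple random search over candidate delay vectors (a stand-in for a real
# numerical optimization): keep the delays giving the lowest peak noise.
best_delays, best_peak = np.zeros(n_buffers), np.inf
for _ in range(2000):
    delays = rng.uniform(0, 5e-9, size=n_buffers)          # delays up to 5 ns
    p = peak_noise(freqs, Z, I, delays)
    if p < best_peak:
        best_delays, best_peak = delays, p

print("simultaneous switching peak noise:",
      peak_noise(freqs, Z, I, np.zeros(n_buffers)))
print("best staggered-delay peak noise :", best_peak)
print("delays (ns):", np.round(best_delays * 1e9, 3))

With all delays set to zero (simultaneous switching), the peak of |V(f)| = |Z(f) * sum_k I(f) * exp(-j*2*pi*f*tau_k)| occurs near the PDN resonance; staggering the delays spreads the phases of the buffer currents and lowers that peak, which is broadly the cancellation mechanism the abstract refers to.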