Impact of NBTI on the temporal performance degradation of digital circuits

Sch. of Electr. & Comput. Eng., Purdue Univ., West Lafayette, IN, USA
IEEE Electron Device Letters (Impact Factor: 3.02). 09/2005; 26(8):560-562. DOI: 10.1109/LED.2005.852523
Source: IEEE Xplore

ABSTRACT Negative bias temperature instability (NBTI) has become one of the major causes of reliability degradation in nanoscale circuits. In this letter, we propose a simple analytical model to predict the delay degradation of a wide class of digital logic gates, based on both worst-case and activity-dependent threshold-voltage change under NBTI. We show that, knowing the threshold-voltage degradation of a single transistor due to NBTI, one can predict the performance degradation of a circuit with a reasonable degree of accuracy. We find that digital circuits are much less sensitive to NBTI degradation than previously anticipated (approximately 9.2% performance degradation in ten years for 70-nm technology).
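The letter's core idea, mapping a single transistor's threshold-voltage shift to a gate-delay shift, can be sketched with the standard alpha-power delay model, delay ∝ Vdd/(Vdd − Vth)^α, combined with the reaction-diffusion power-law time dependence ΔVth = A·t^n. The coefficients below (A, n, Vdd, Vth, α) are illustrative assumptions chosen for a plausible sub-100-nm node, not the letter's fitted values:

```python
# Illustrative sketch: propagate an NBTI threshold-voltage shift into a
# fractional gate-delay shift via the alpha-power delay model
#     delay ∝ Vdd / (Vdd - Vth)^alpha
# ΔVth(t) = A * t^n is the R-D-style power law (n ~ 1/6); the prefactor A
# below is a made-up value used only for illustration.

def delta_vth(t_years, A=0.002, n=1.0 / 6.0):
    """Threshold-voltage shift (V) after t_years of stress (toy power law)."""
    seconds = t_years * 365.25 * 24 * 3600
    return A * seconds ** n

def delay_degradation(dvth, vdd=1.0, vth0=0.2, alpha=1.3):
    """Fractional gate-delay increase caused by a Vth shift of dvth volts."""
    d_fresh = vdd / (vdd - vth0) ** alpha
    d_aged = vdd / (vdd - vth0 - dvth) ** alpha
    return d_aged / d_fresh - 1.0

if __name__ == "__main__":
    for years in (1, 5, 10):
        dv = delta_vth(years)
        print(f"{years:2d} y: dVth = {dv * 1000:5.1f} mV, "
              f"delay +{delay_degradation(dv) * 100:.1f} %")
```

Because ΔVth grows only as t^(1/6), most of the shift accrues early and the ten-year delay penalty stays in the single-digit-percent range, consistent with the letter's qualitative conclusion.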

    • "Both models can support different input stress types depending on the solving method: DC stress only, AC stress (duty-factor dependent), or any kind of input stress including non-periodic workloads. Paul et al. [24] pioneered the work by performing NBTI analysis through the R-D model for DC voltage stress, which yields a pessimistic upper bound on BTI degradation. Wang et al. [11] and Kumar et al. [30] applied the signal-probability and activity-factor concepts to the R-D model, whereby a non-periodic input stream is converted to an equivalent periodic stream. "
    ABSTRACT: In deeply scaled CMOS technology, time-dependent degradation mechanisms (TDDMs), such as bias temperature instability (BTI), threaten transistor performance and hence overall circuit/system reliability. Two well-known attempts to model the BTI mechanism are the reaction-diffusion (R-D) model and the atomistic trap-based model. This paper presents a thorough comparative analysis of the two models at the gate level in order to explore when their predictions agree and when they do not. The comparison is performed by evaluating degradation trends in a set of CMOS logic gates (e.g., INV, NAND, NOR) while considering seven attributes: 1) gate type, 2) gate drive strength, 3) input frequency, 4) duty factor, 5) non-periodicity, 6) instant degradation versus long-term aging, and 7) simulation CPU time and memory usage. The simulation results show that the two models are consistent in terms of gate degradation trends with respect to the first four attributes (gate type, input frequency, etc.). For the remaining attributes, the workload-dependent solution of the atomistic trap-based model is superior for non-periodicity and instant degradation, while the R-D model is advantageous for long-term aging and for simulation CPU time and memory usage, thanks to its lightweight AC-periodic, duty-factor-dependent solution.
    IEEE Transactions on Device and Materials Reliability 03/2014; 14(1):182-193. DOI:10.1109/TDMR.2013.2267274 · 1.54 Impact Factor
    • "While the long-term model abstracts circuit operation patterns into aging calculations, the aging-aware library serves as the key bridge between device-level reliability effects and large-scale digital circuit analysis. Previous works use complicated methodologies to predict the delay shift due to NBTI [46]–[50]. To reduce the computation cost of this process, a simple gate delay model is proposed in this section. "
    ABSTRACT: Integrated circuit design in the late-CMOS era is challenged by ever-increasing variability and reliability issues. The situation is further compounded by real-time uncertainties in workload and ambient conditions, which dynamically influence the degradation rate. To improve design predictability and guarantee system lifetime, accurate modeling and simulation tools for reliability are essential to both digital and analog circuits. This paper presents cross-layer solutions for emerging reliability threats, including: 1) device-level modeling of reliability mechanisms, such as transistor aging and its statistical behavior; 2) circuit-level long-term aging models that capture the distinct operation patterns of digital and analog designs and directly predict the degradation; and 3) simulation methods for very-large-scale designs. Built on the long-term model, the new methods significantly enhance the accuracy and efficiency of reliability analysis. As validated by silicon data, these solutions close the gap between the underlying reliability physics and circuit/system design for resilience.
    IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 01/2014; 33(1):8-23. DOI:10.1109/TCAD.2013.2289874 · 1.20 Impact Factor
    • "Existing approaches use delay as an aging monitor [9] [10] [11] and model it as a function of a parameter drift, obtained either analytically or by simulation. The delay models are commonly based on Sakurai's alpha-power-law MOSFET model [1] to express the transistor current, which does not account for the effects prevalent in present nanometer technologies. "
    ABSTRACT: Accurate age modeling and fast yet robust reliability sign-off have emerged as mandatory constraints in IC design for advanced process technology nodes. This paper proposes a device-level aging assessment and prediction model that uses the signal slope as an aging quantifier and accounts not only for intrinsic self-degradation but also for the influence of the surrounding circuit topology. Experimental results indicate the validity of slope as an aging quantifier and that aging is underestimated when topology influence is disregarded.
    Microelectronics Reliability 08/2012; 52(9-10):1792-1796. DOI:10.1016/j.microrel.2012.06.056 · 1.43 Impact Factor
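The signal-probability idea quoted in the first excerpt above, collapsing an arbitrary non-periodic input workload into an equivalent stress duty factor, can be sketched as follows. This is an illustrative reduction, not code from the cited papers: the function name `stress_duty_factor` is invented, and the convention that a PMOS device is under NBTI stress while its gate input is low follows standard usage.

```python
# Illustrative sketch: reduce a non-periodic gate-input trace (list of 0/1
# samples) to an equivalent NBTI stress duty factor, in the spirit of the
# signal-probability / activity-factor approach the excerpt attributes to
# Wang et al. and Kumar et al. A PMOS transistor is stressed while its
# gate input is 0 (gate-to-source voltage negative).

def stress_duty_factor(trace):
    """Fraction of samples in which the input is 0 (PMOS under stress)."""
    if not trace:
        raise ValueError("empty input trace")
    return trace.count(0) / len(trace)

if __name__ == "__main__":
    workload = [0, 0, 1, 0, 1, 1, 0, 0]  # arbitrary non-periodic input
    print(f"equivalent stress duty factor: {stress_duty_factor(workload):.3f}")
```

The resulting duty factor can then drive an AC-periodic degradation solution in place of the full non-periodic trace, which is the computational saving the excerpt describes.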
