A Worst-Case Analysis of Trap-Assisted Tunneling Leakage in DRAM Using a Machine Learning Approach

@article{Lee2021AWA,
  title={A Worst-Case Analysis of Trap-Assisted Tunneling Leakage in DRAM Using a Machine Learning Approach},
  author={J. Lee and P. Asenov and Manuel Aldegunde and Salvatore M. Amoroso and Andrew R. Brown and Victor Moroz},
  journal={IEEE Electron Device Letters},
  year={2021},
  volume={42},
  pages={156-159},
  url={https://api.semanticscholar.org/CorpusID:231724588}
}
A simulation flow using the NN model is proposed to find the worst RDD configuration among 5,000 candidates, and it is demonstrated that the worst-case leakage can be found with 96.7% probability using only 5.5% of the computational cost.
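As a rough illustration of the screening flow summarized above, the sketch below trains a neural-network surrogate on a small, fully simulated subset of candidates and then ranks all 5,000 configurations by predicted leakage so that only a shortlist is re-simulated in full. All data, feature definitions, and the MLPRegressor setup are hypothetical placeholders, not the authors' implementation.

```python
# Illustrative sketch of surrogate-assisted worst-case screening (not the
# authors' code): train a neural-network regressor on a small set of fully
# simulated random-discrete-dopant (RDD) configurations, then rank all
# candidates by predicted leakage and re-simulate only the top fraction.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical data: 5,000 candidate RDD configurations, each reduced to a
# feature vector (e.g. dopant counts/positions near the junction), plus TCAD
# leakage values for a small training subset only.
n_candidates, n_features = 5000, 16
rdd_features = rng.normal(size=(n_candidates, n_features))
train_idx = rng.choice(n_candidates, size=250, replace=False)        # ~5% simulation budget
tcad_leakage_train = rng.lognormal(mean=-30.0, sigma=1.0, size=train_idx.size)

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
# Regress log-leakage, which is far better behaved than the raw current.
surrogate.fit(rdd_features[train_idx], np.log(tcad_leakage_train))

# Rank all candidates by predicted leakage and keep a small shortlist for
# full TCAD re-simulation; the worst case is then taken from that shortlist.
predicted_log_leakage = surrogate.predict(rdd_features)
shortlist = np.argsort(predicted_log_leakage)[::-1][:50]
print("candidates to re-simulate with full TCAD:", shortlist[:10])
```

Working in log-leakage is a common choice here because trap-assisted tunneling currents span many orders of magnitude across RDD configurations.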

Design Technology Co-Optimization for the DRAM Cell Structure With Contact Resistance Variation

This work presents a dynamic random access memory (DRAM) design technology co-optimization (DTCO) methodology that allows the optimization of the DRAM cell structure in the presence of contact resistance variation, trap-assisted tunneling leakage variation, and storage node capacitance variation.

Endurance Prediction Based on Hidden Markov Model and Programming Optimization for 28nm 1Mbit Resistive Random Access Memory Chip

A state transition probability model based on a Hidden Markov Model (HMM), which can predict the lifetime for different endurance failure modes, and an optimized programming algorithm to rescue failing cells during endurance cycling are proposed.
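To make the state-transition idea concrete, here is a deliberately simplified sketch that uses a plain Markov chain rather than a full HMM: cells move between invented states on every program/erase cycle, with an absorbing failure state, and the expected cycles to failure follow from the absorbing-chain fundamental matrix. The states and transition probabilities are illustrative assumptions, not values from the paper.

```python
# Minimal state-transition lifetime sketch (an assumption-laden stand-in for
# the paper's HMM): RRAM cells move between illustrative states per
# program/erase cycle, with "stuck" as an absorbing endurance-failure state.
# Expected cycles to failure follow from the fundamental matrix N = (I - Q)^-1.
import numpy as np

states = ["healthy", "degraded", "stuck"]          # hypothetical cell states
P = np.array([
    [0.9990, 0.0009, 0.0001],                      # healthy  -> ...
    [0.0005, 0.9975, 0.0020],                      # degraded -> ...
    [0.0000, 0.0000, 1.0000],                      # stuck (absorbing failure)
])

Q = P[:2, :2]                                      # transitions among transient states
N = np.linalg.inv(np.eye(2) - Q)                   # fundamental matrix
expected_cycles_to_failure = N.sum(axis=1)         # expected cycles from each start state

for s, cycles in zip(states[:2], expected_cycles_to_failure):
    print(f"start in '{s}': ~{cycles:,.0f} cycles before endurance failure")
```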

Investigation Into the Degradation of DDR4 DRAM Owing to Total Ionizing Dose Effects

Total ionizing dose (TID) effects of gamma rays were investigated on DDR4 dynamic random access memory (DRAM) and analyzed using TCAD simulations. In this study, we considered the operating states …

Logic-Compatible Asymmetrical FET for Gain Cell eDRAM With Long Retention and Fast Access Speed

A novel Asymmetrical FET (AsyFET) is proposed to enhance the retention of gain cell memory and is experimentally demonstrated on a standard 300 mm logic foundry platform. In AsyFET, the …

A machine learning approach to model the impact of line edge roughness on gate-all-around nanowire FETs while reducing the carbon footprint

This work presents a machine learning approach to model the impact of LER on two gate-all-around nanowire FETs; the approach dramatically decreases the computational effort, and hence the carbon footprint of the study, while maintaining high accuracy.

Device simulations with A U-Net model predicting physical quantities in two-dimensional landscapes

A modified U-Net is employed, and two models are trained to predict the physical quantities of a MOSFET in two-dimensional landscapes for the first time, paving the way for interpretable predictions of device simulations based on convolutional neural networks.
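For concreteness, a compact U-Net-style encoder-decoder is sketched below in PyTorch; the depth, channel widths, input channels, and single output map are illustrative assumptions, not the modified architecture used in the paper.

```python
# Compact U-Net-style encoder/decoder sketch (PyTorch); channel widths, depth
# and the single output channel are illustrative, not the paper's network.
import torch
import torch.nn as nn

def double_conv(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=2, out_ch=1):
        super().__init__()
        self.enc1 = double_conv(in_ch, 32)
        self.enc2 = double_conv(32, 64)
        self.bottleneck = double_conv(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = double_conv(128, 64)            # 64 (skip) + 64 (upsampled) channels in
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = double_conv(64, 32)
        self.head = nn.Conv2d(32, out_ch, kernel_size=1)

    def forward(self, x):                           # x: (B, in_ch, H, W), H and W divisible by 4
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                        # predicted 2-D physical-quantity map

# Example: map a 2-D device description (e.g. doping + geometry channels)
# to a predicted field on the same grid.
model = TinyUNet(in_ch=2, out_ch=1)
prediction = model(torch.randn(1, 2, 64, 64))
print(prediction.shape)                             # torch.Size([1, 1, 64, 64])
```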

Device-Simulation-Based Machine Learning Technique for the Characteristic of Line Tunnel Field-Effect Transistors

The ML-based RFR model is exploited to predict the effect of variability sources on line TFETs under different biasing conditions; the results are promising, reducing the computational cost of device simulation by 99%.
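Assuming RFR here denotes random forest regression, the sketch below shows how such a surrogate might be trained on device-simulation samples to map variability-source descriptors plus a bias point to a device figure of merit; the features, data, and hyperparameters are invented for illustration.

```python
# Illustrative random forest regression (RFR) surrogate for device variability
# (not the paper's model): learn a map from variability-source descriptors
# plus bias point to a simulated device figure of merit.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical training set: each row = (work-function shift, oxide-thickness
# deviation, trap density, gate bias); target = log drain current from simulation.
X = rng.normal(size=(2000, 4))
y = 1.5 * X[:, 3] - 0.8 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)
rfr = RandomForestRegressor(n_estimators=300, random_state=1)
rfr.fit(X_train, y_train)

print("R^2 on held-out simulations:", r2_score(y_test, rfr.predict(X_test)))
print("feature importances:", rfr.feature_importances_)   # which variability source dominates
```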

Characteristic Variabilities of Subnanometer EOT La2O3 Gate Dielectric Film of Nano CMOS Devices

This paper presents a comprehensive study and detailed discussion of the gate leakage variabilities of nanoscale devices arising from surface roughness effects, and finds that capacitance and leakage current variabilities increase pronouncedly even for samples with a very low-temperature thermal annealing at 300 °C.

Machine Learning Approaches for Electronic Design Automation in IC Design Flow

Computer science techniques such as pattern matching and machine learning can reduce the design time of VLSI circuits by working with large datasets while delivering results of significant quality.

Analysis of Artificial Intelligence in Medical Sectors

The decision tree algorithm is emphasized as one of the main technologies for implementing data mining in this research, which also explores how data mining technology is used in clinical medical diagnosis and analysis.

Trap-Assisted DRAM Row Hammer Effect

Through 3D TCAD simulations with single charge traps, we discovered direct evidence of the mechanism of the DRAM row hammer effect. It is governed by a charge pumping process, consisting of charge …

High-sigma analysis of DRAM write and retention performance: a TCAD-to-SPICE approach

It is highlighted that the interplay between discrete traps and discrete dopants governs the statistical tails of the leakage and therefore can play a fundamental role in determining the yield and reliability of ultra-scaled DRAMs.

Low leakage ZrO2 based capacitors for sub 20 nm dynamic random access memory technology nodes

During dynamic random access memory (DRAM) capacitor scaling, a lot of effort was put into searching for new material stacks to overcome the scaling limitations of the current material stack, such as …

Emerging Memory Technologies: Recent Trends and Prospects

This tutorial introduces the basics of emerging nonvolatile memory (NVM) technologies including spin-transfer-torque magnetic random access memory (STTMRAM), phase-change random access memory …

Overcoming the reliability limitation in the ultimately scaled DRAM using silicon migration technique by hydrogen annealing

We demonstrated a highly reliable buried-gate saddle-fin cell-transistor (cell-TR) using a silicon migration technique of hydrogen (H2) annealing after a dry etch to form the saddle-fin in a fully …

Capacitor-less dynamic random access memory based on a III–V transistor with a gate length of 14 nm

Dynamic random access memory (DRAM) cells are commonly used in electronic devices and are formed from a single transistor and capacitor. Alternative approaches, which are based on the floating body …

Highly Scalable Saddle-Fin (S-Fin) Transistor for Sub-50nm DRAM Technology

The S-Fin exhibits feasible transistor characteristics, such as excellent short-channel behavior, driving current, and refresh characteristics, as compared with both RCAT and damascene-FinFET.

Integrated atomistic process and device simulation of decananometre MOSFETs

In this paper we present a methodology for the integrated atomistic process and device simulation of decananometre MOSFETs. The atomistic process simulations were carried out using the kinetic Monte Carlo …

Characterization and Modeling of the Band-to-Band Current Variability of Nanoscale Device Junctions

This paper presents a detailed experimental and numerical investigation of the variability of the band-to-band leakage current of p-n junctions in nanoscale MOS devices. The experimental results …

Scikit-learn: Machine Learning in Python

Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems. This package focuses on bringing machine learning to non-specialists using a general-purpose high-level language.
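A minimal usage example of scikit-learn's uniform estimator API (fit/score through a pipeline with cross-validation) on a built-in toy dataset; the choice of classifier and dataset is illustrative only.

```python
# Minimal illustration of scikit-learn's uniform estimator API with
# cross-validation on a built-in toy dataset.
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
clf = make_pipeline(StandardScaler(), SVC(gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print("5-fold accuracy:", scores.mean())
```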