Improving robustness of in-memory computing on passive TiO2 ReRAM crossbars using hardware-aware Bayesian training

Date:
This event has passed.
Type:
Conferences and seminars
Location:
Room P2-1002 of the 3IT (Institut interdisciplinaire d'innovation technologique) and via the Microsoft Teams platform
Cost:
Free

Description:

Neural networks are constantly growing in performance and in size. For very large networks, the computational demand caused by the von Neumann bottleneck between processor and memory requires the use of cloud computing. This makes them impractical for many applications that would greatly benefit from them, especially at the edge. A solution to this problem is the use of new memristive devices (ReRAM), capable of representing neural network weights and performing neural network operations directly in memory, avoiding transfers to the processor.

Many types of ReRAM exist. Passive TiO2 ReRAM crossbar arrays (0T1R), commonly used for analog matrix-vector multiplications, are far superior to their active counterparts (1T1R) in terms of integration density. However, current ex-situ transfer of neural networks onto this crossbar architecture is accompanied by significant losses in precision due to hardware variabilities such as sneak path currents, conductance drift, and conductance tuning imprecision.
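The analog matrix-vector multiplication mentioned above, and how conductance variability degrades it, can be sketched numerically. This is a minimal illustration, not the seminar's actual device model: the conductance range and the log-normal error level are assumptions chosen only to show the effect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical crossbar: ideal conductance matrix G (siemens) and row read
# voltages v. Sizes and value ranges are illustrative assumptions.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # 4 rows x 3 columns of devices
v = rng.uniform(0.01, 0.2, size=4)         # read voltages applied to the rows

# Ideal analog matrix-vector multiplication: column currents i = G^T v,
# summed along each column by Kirchhoff's current law.
i_ideal = G.T @ v

# Variability (tuning imprecision, drift) modeled here as a multiplicative
# log-normal conductance error -- a common simplification, not measured data.
G_noisy = G * rng.lognormal(mean=0.0, sigma=0.1, size=G.shape)
i_noisy = G_noisy.T @ v

relative_error = np.abs(i_noisy - i_ideal) / i_ideal
print(relative_error)
```

Even a ~10% per-device conductance error perturbs every output current, which is why an ex-situ trained network loses precision when its weights are written to such an array.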

This master's project seminar presents novel neural network training approaches that adapt machine learning techniques such as dropout, the reparametrization trick, and Bayesian losses to measured hardware variabilities, in order to generate models better suited to hardware transfer onto passive crossbars. A software demonstration shows that training a network with prior knowledge of the expected variabilities leads to better performance.
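The reparametrization trick named above can be sketched in a toy setting: sample noisy weights w = mu + sigma * eps on each forward pass, so the trained mean weights mu become robust to hardware-like perturbations. This is a minimal illustration under assumptions (a single linear layer, a Gaussian noise model with an arbitrary sigma), not the author's actual method or measured noise statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression task standing in for a network layer.
X = rng.normal(size=(64, 3))
true_w = np.array([0.5, -1.0, 2.0])
y = X @ true_w

mu = np.zeros(3)     # trainable mean weights
sigma = 0.05         # assumed hardware variability level (illustrative)
lr = 0.1

for _ in range(200):
    # Reparametrization trick: w = mu + sigma * eps keeps the sampling
    # differentiable, so gradients flow to mu while training "sees" noise.
    eps = rng.normal(size=3)
    w = mu + sigma * eps
    pred = X @ w
    grad = 2 * X.T @ (pred - y) / len(y)   # d(MSE)/dw, applied to mu
    mu -= lr * grad

print(mu)  # approaches true_w despite the per-step weight noise
```

Training with noise injected at the level expected on the hardware is what lets the transferred model tolerate that same variability at inference time.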


Speaker: Philippe Drolet, master's student in electrical engineering