jkhsong/ExBERT

ExBERT: Improving cross-domain performance of a DistilBERT Q&A bot via a Mixture-of-Experts (MoE) architectural modification

A detailed study on improving the cross-domain performance of a DistilBERT Q&A bot via a Mixture-of-Experts (MoE) modification.

This project was completed as part of a two-month, open-ended study for CS 7643 - Deep Learning at Georgia Tech.

GitHub has difficulty rendering some PDFs. Please download and open 'Bring in the ExBert.pdf' to view the article.
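
For readers unfamiliar with the Mixture-of-Experts idea, the sketch below shows what replacing a transformer block's feed-forward sub-layer with a small MoE layer can look like in PyTorch. The expert count, dense softmax routing, and DistilBERT-style dimensions (768/3072) are illustrative assumptions, not the configuration used in this study; see the PDF for the actual architecture.

```python
# Minimal, illustrative MoE feed-forward layer of the kind that could stand in
# for the FFN sub-layer of a DistilBERT transformer block. Expert count, dense
# softmax routing, and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEFeedForward(nn.Module):
    def __init__(self, dim: int = 768, hidden_dim: int = 3072, num_experts: int = 4):
        super().__init__()
        # One standard transformer FFN per expert.
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(dim, hidden_dim),
                    nn.GELU(),
                    nn.Linear(hidden_dim, dim),
                )
                for _ in range(num_experts)
            ]
        )
        # Gating network scores each token against every expert.
        self.gate = nn.Linear(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        gate_probs = F.softmax(self.gate(x), dim=-1)                      # (batch, seq, experts)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=-1)   # (batch, seq, dim, experts)
        # Combine expert outputs weighted by the gate probabilities.
        return torch.einsum("bsde,bse->bsd", expert_outs, gate_probs)


if __name__ == "__main__":
    layer = MoEFeedForward()
    tokens = torch.randn(2, 16, 768)   # dummy hidden states
    print(layer(tokens).shape)         # torch.Size([2, 16, 768])
```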
