Bidirectional Autoregressive Transformer (BART) is a Transformer-based encoder-decoder model, often used for sequence-to-sequence tasks like summarization and neural machine translation. BART is pre-trained in a self-supervised fashion on a large text corpus: the text is corrupted and BART is trained to reconstruct the original, which is why it is called a "denoising autoencoder". Pre-training tasks include token masking, token deletion, and sentence permutation (sentences are shuffled and BART is trained to restore their order).
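The denoising objective is easiest to see on a toy example. The sketch below corrupts text with two of the noising schemes listed above, token masking and sentence permutation. It works on whitespace tokens and period-split sentences for readability; BART's real pre-training masks spans of subword tokens, so treat this as an illustration of the idea, not the actual pipeline.

```python
import random

MASK = "<mask>"  # BART's mask token in the Hugging Face tokenizer

def corrupt(text: str, mask_prob: float = 0.15, seed: int = 0) -> str:
    """Toy noising step: mask random tokens, then shuffle sentences.

    Illustrative only: real BART pre-training masks spans of subword
    tokens, not whitespace-delimited words.
    """
    rng = random.Random(seed)
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    noised = []
    for sentence in sentences:
        tokens = [MASK if rng.random() < mask_prob else tok
                  for tok in sentence.split()]
        noised.append(" ".join(tokens))
    rng.shuffle(noised)  # sentence permutation: the model must restore order
    return ". ".join(noised) + "."

original = ("BART is an encoder-decoder model. It is pre-trained to "
            "reconstruct corrupted text. The decoder is autoregressive.")
print(corrupt(original))
# Pre-training pairs the corrupted text (encoder input) with `original`
# as the decoder's reconstruction target.
```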
Matheussoranco/Abstractive-Text-Summarization
About
Abstractive Text Summarization using BART
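A minimal sketch of abstractive summarization with a pre-trained BART checkpoint via the Hugging Face `transformers` pipeline is shown below. The `facebook/bart-large-cnn` checkpoint (BART fine-tuned on CNN/DailyMail) and the generation settings are assumptions chosen for illustration, not necessarily what this repository uses.

```python
# pip install transformers torch
from transformers import pipeline

# Assumed checkpoint: BART fine-tuned for summarization on CNN/DailyMail.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "BART is pre-trained by corrupting text and learning to reconstruct it. "
    "Because its decoder is autoregressive, the same model can be fine-tuned "
    "to generate summaries conditioned on a source document."
)

# do_sample=False gives deterministic (greedy/beam) decoding; the length
# bounds are in generated tokens and are illustrative values.
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

Because summarization is a sequence-to-sequence task, the encoder reads the full article bidirectionally while the decoder generates the summary one token at a time, which is exactly the split BART's architecture is built around.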