Multitask Training of Small T5 Model for Binary Classification and Similarity

Transformer-based models have achieved strong results across natural language processing tasks, and T5 (Text-to-Text Transfer Transformer) stands out for casting every task as text generation. This project trains the small T5 variant with multitask prompting, addressing binary classification and sentence similarity simultaneously.

Objectives

  1. Multitask Learning:
    • Binary Classification
    • Similarity
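Because T5 frames every task as text-to-text, both objectives can share one model by prefixing each input with a task tag. A minimal sketch of how the two tasks might be serialized (the prefix and field names here are illustrative assumptions, not taken from this repository):

```python
def format_classification(text: str) -> str:
    # Hypothetical task prefix; the model learns to condition on it.
    return f"binary classification: {text}"

def format_similarity(sentence1: str, sentence2: str) -> str:
    # Similarity framed as text-to-text: the target is a score rendered
    # as a string, mirroring how STS-B is handled in the T5 paper.
    return f"similarity: sentence1: {sentence1} sentence2: {sentence2}"

print(format_classification("The movie was great."))
print(format_similarity("A man is running.", "A person jogs."))
```

With this framing, the same encoder-decoder weights serve both tasks; only the prompt prefix tells the model which output format to produce.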

Methodology

  1. Dataset Preparation:

    • Curate datasets tailored to the binary classification and similarity tasks. These datasets are used both to train the model and to evaluate it comprehensively.
  2. Model Training:

    • Fine-tune the small T5 model with supervised multitask learning, optimizing for binary classification and similarity simultaneously. Training requires careful tuning to balance the two objectives.
  3. Evaluation and Validation:

    • Assess performance with rigorous metrics: F1 score and accuracy for binary classification, and correlation coefficients for similarity. Cross-validation further checks the model's robustness.
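The metrics named in step 3 can be computed directly from decoded model outputs without extra dependencies. A minimal sketch (the label strings such as "positive" are illustrative assumptions about how predictions are decoded):

```python
def accuracy(preds, golds):
    # Fraction of decoded predictions that exactly match the gold labels.
    return sum(p == g for p, g in zip(preds, golds)) / len(golds)

def f1_binary(preds, golds, positive="positive"):
    # F1 for the class treated as "positive" (hypothetical label string).
    tp = sum(p == positive and g == positive for p, g in zip(preds, golds))
    fp = sum(p == positive and g != positive for p, g in zip(preds, golds))
    fn = sum(p != positive and g == positive for p, g in zip(preds, golds))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def pearson(xs, ys):
    # Pearson correlation for the similarity task's numeric scores.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

preds = ["positive", "negative", "positive", "positive"]
golds = ["positive", "negative", "negative", "positive"]
print(accuracy(preds, golds))                      # 0.75
print(f1_binary(preds, golds))                     # 0.8
print(pearson([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))   # 1.0
```

In practice the similarity scores would be parsed from the model's generated text before being passed to `pearson`, and a library such as scikit-learn or SciPy could replace these hand-rolled functions.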
