Regularization Techniques #8

@twnkl2713

Description

Participants can apply L2 weight decay or dropout to the LoRA layers. Both techniques help mitigate overfitting and improve the generalization of the fine-tuned model; see the sketch below for one way to configure them.
Make sure you have read the guidelines in CONTRIBUTING.md as well as the CODE_OF_CONDUCT.md.
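
A minimal sketch of how both regularizers could be wired up, assuming a PyTorch setup with the Hugging Face `peft` library; the base model (`gpt2`), target modules, and hyperparameter values here are illustrative placeholders, not values prescribed by this issue:

```python
# Sketch: dropout on LoRA layers + L2 weight decay on the LoRA parameters.
# Assumes torch, transformers, and peft are installed; names/values are examples.
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # example base model

# Dropout on the LoRA path is configured directly via lora_dropout.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,           # dropout applied to the LoRA A/B projection path
    target_modules=["c_attn"],  # target module names depend on the base model
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)

# L2 weight decay is applied through the optimizer, restricted here to the
# trainable (LoRA) parameters so the frozen base weights are untouched.
trainable_params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable_params, lr=2e-4, weight_decay=0.01)
```

Keeping `weight_decay` on only the trainable adapter parameters penalizes large LoRA updates without affecting the frozen base model, while `lora_dropout` regularizes the low-rank path during training.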
