Earned my first silver medal on Kaggle! My solution combined a DeBERTaV3-large model with smaller models (RoBERTa, DeBERTaV3-base) and used Optuna for hyperparameter optimization. On top of those, I blended an ensemble of XGBoost, CatBoost, and LightGBM models trained with 4-fold cross-validation.
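Below is a minimal sketch of how such a pipeline can be wired together. It is not the code from this repository: the transformer outputs are replaced by synthetic placeholder features, and all model names, hyperparameters, and the idea of using Optuna to tune blend weights over out-of-fold predictions are illustrative assumptions.

```python
import numpy as np
import optuna
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor
from catboost import CatBoostRegressor

# Placeholder data: in the real pipeline these columns would be out-of-fold
# predictions / embeddings from the DeBERTaV3 and RoBERTa models.
rng = np.random.default_rng(42)
X = rng.normal(size=(800, 8))
y = X @ rng.normal(size=8) + rng.normal(scale=0.3, size=800)

def cv_predict(model_factory, X, y, n_splits=4):
    """Return out-of-fold predictions from a 4-fold CV loop."""
    oof = np.zeros(len(y))
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=42)
    for train_idx, valid_idx in kf.split(X):
        model = model_factory()
        model.fit(X[train_idx], y[train_idx])
        oof[valid_idx] = model.predict(X[valid_idx])
    return oof

# Out-of-fold predictions for each gradient-boosting model
# (hyperparameters here are illustrative, not the competition settings).
oof_preds = {
    "xgb": cv_predict(lambda: XGBRegressor(n_estimators=300, learning_rate=0.05), X, y),
    "lgbm": cv_predict(lambda: LGBMRegressor(n_estimators=300, learning_rate=0.05), X, y),
    "cat": cv_predict(lambda: CatBoostRegressor(iterations=300, learning_rate=0.05, verbose=0), X, y),
}

def objective(trial):
    """Optuna objective: tune blend weights to minimize out-of-fold RMSE."""
    weights = np.array([trial.suggest_float(name, 0.0, 1.0) for name in oof_preds])
    weights = weights / (weights.sum() + 1e-8)
    blended = sum(w * p for w, p in zip(weights, oof_preds.values()))
    return np.sqrt(mean_squared_error(y, blended))

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print("Best blend weights:", study.best_params)
print("Best OOF RMSE:", study.best_value)
```

Searching over blend weights on out-of-fold predictions (rather than on a single validation split) is one common way to keep the ensemble from overfitting to any one fold.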