Conversation

@egillax
Collaborator

@egillax egillax commented Feb 21, 2025

No description provided.

@jreps
Collaborator

jreps commented Mar 10, 2025

This looks good. The only thing worth considering is whether, instead of having a string 'evalmetric' as input to the model design, we may want an evaluationMetricSetting created via a createEvaluationMetricSetting() that lets the user specify a function (or the string name of a function) plus settings such as maximize (boolean). That way the package is flexible enough to support custom evaluation metrics. For backwards compatibility, the setting should default to 'AUC' with maximize on. We will then need to add a table for this setting to the results tables; I'm happy to help with that part.
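To make the proposed design concrete, here is a minimal sketch (in Python for illustration; the actual package is in R, and the names `EvaluationMetricSetting` and `create_evaluation_metric_setting` are hypothetical, mirroring the suggested `createEvaluationMetricSetting()`). It shows a settings object that accepts either a built-in metric name or a custom callable, carries a `maximize` flag, and defaults to AUC with maximization for backwards compatibility:

```python
from dataclasses import dataclass
from typing import Callable, Union


@dataclass
class EvaluationMetricSetting:
    """Hypothetical settings object for an evaluation metric.

    metric   -- a built-in metric name (e.g. "AUC") or a custom
                callable computing the metric from predictions/labels
    maximize -- whether model selection should maximize the metric
    """
    metric: Union[str, Callable]
    maximize: bool = True


def create_evaluation_metric_setting(
    metric: Union[str, Callable] = "AUC",
    maximize: bool = True,
) -> EvaluationMetricSetting:
    """Hypothetical constructor; defaults preserve the old 'AUC' behavior."""
    if not (isinstance(metric, str) or callable(metric)):
        raise TypeError("metric must be a string name or a callable")
    return EvaluationMetricSetting(metric=metric, maximize=maximize)


# Backwards-compatible default: AUC, maximized
default_setting = create_evaluation_metric_setting()

# Custom metric supplied as a function, minimized (e.g. a loss)
custom_setting = create_evaluation_metric_setting(
    metric=lambda labels, preds: sum(
        (l - p) ** 2 for l, p in zip(labels, preds)
    ) / len(labels),
    maximize=False,
)
```

Storing the metric name (or the custom function's name) and the `maximize` flag in a dedicated results table, as suggested, would then only require serializing these two fields.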
