
Training progress verbosity #73

Open
gjdv opened this issue Nov 20, 2024 · 1 comment

@gjdv
Collaborator

gjdv commented Nov 20, 2024

It would be nice to have a bit more insight into the training progress (ETA, etc.). For a grid search, one can set a verbosity level that sklearn takes into account, but sklvq does not provide such a verbose argument.
To serve my needs, I solved it with a callback class like this:

from tqdm import tqdm

class ProgressTracker:
    def __init__(self, max_runs):
        # Pre-build a tqdm-wrapped iterator with one tick per expected
        # callback invocation (max_runs + 1 in total).
        self.run_tracker = iter(tqdm(range(max_runs + 1), desc="Training progress"))

    def __call__(self, state):
        # Invoked by the solver after each run; advance the progress bar.
        next(self.run_tracker)
        # Returning False tells the solver to keep training.
        return False

This prints, for example:

Training progress:  11%|█         | 1115/10001 [00:45<06:01, 24.57it/s]

Perhaps you could consider adding functionality like this to the LVQBaseClass class so that calling fit(verbose=1) prints such a progress bar (I am not sure whether it is possible to have multiple callback objects in the solver_params, or whether there is a central place visited on each run where this code could live). Alternatively, the above could be added as an example in the example usages / documentation; a rough usage sketch is shown below.
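
For reference, here is a rough sketch of how the ProgressTracker above could be wired into a model through solver_params. The GMLVQ model, the "steepest-gradient-descent" solver, and the "callback" / "max_runs" keys are illustrative assumptions; adjust them to whatever model and solver settings you actually use.

from sklvq import GMLVQ

max_runs = 100  # assumed solver budget; should match the tracker's max_runs

model = GMLVQ(
    solver_type="steepest-gradient-descent",  # assumption: any solver exposing a callback hook
    solver_params={
        "max_runs": max_runs,
        "callback": ProgressTracker(max_runs),  # advances the tqdm bar once per run
    },
)

# model.fit(X, y)  # X, y: your training data; the bar then shows progress and an ETA

With something like this in place, fit() would show a progress bar and ETA without any changes to sklvq itself.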

@rickvanveen
Owner

Adding it to the docs seems the best solution (for now). I will have a look after updating some other things.
