Replies: 1 comment
-
Related: it might be worth staggering the scraping and downloading. Instead of fetching the metadata for all of the models and then downloading all of the content at once, the code would fetch one model's metadata, download that model's content, then move on to the next model, looping until it has gone through every model in the list.
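A minimal sketch of that staggered loop, assuming hypothetical `fetch_metadata` and `download_content` callables standing in for the scraper's real fetch/download routines (these names are illustrative, not the project's actual API):

```python
def scrape_all(models, fetch_metadata, download_content):
    """Handle one model end-to-end before moving to the next,
    instead of fetching every model's metadata up front.

    fetch_metadata and download_content are placeholders for the
    scraper's real routines."""
    for model in models:
        meta = fetch_metadata(model)   # fetch only this model's metadata
        download_content(model, meta)  # download it before moving on
```

The upside is that one slow model only delays the models after it, rather than blocking all downloads behind a full metadata pass.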
-
When scraping a certain model's content takes forever, the current code hangs indefinitely. It would be cool to have a config option that says "it's been more than x minutes, let's move on."
This doesn't make much sense for just one model, to be fair. My use case is that I attempt to get all the videos from all the models I subscribe to.
These are my changes to the original config.json on the latest master commit, for context.
From what I can tell, the code first fetches all of the metadata for every model and only then downloads all of the videos/images/etc. In my case, one of the models (jessieminx) has a lot of data to parse (3000+ images, though I only want the 71 videos) and the code hangs for a long time.
If I were to give it a timeout config, I wish it would automatically say "alright jessie, it's been long enough lol, we'll move on and try again after all the other videos are downloaded."
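The "move on and retry later" behavior could be sketched roughly like this, assuming a hypothetical `scrape_one(model)` routine that does the per-model fetch and download (the function name and the timeout parameter are assumptions, not the project's real config or API):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def scrape_with_timeout(models, scrape_one, timeout_seconds=300):
    """Try scrape_one(model) for each model, giving up after
    timeout_seconds and retrying the stragglers at the end.

    scrape_one and the parameter names are illustrative placeholders."""
    deferred = []
    for model in models:
        pool = ThreadPoolExecutor(max_workers=1)
        future = pool.submit(scrape_one, model)
        try:
            future.result(timeout=timeout_seconds)
        except TimeoutError:
            deferred.append(model)  # "it's been long enough, move on"
        finally:
            # Don't block on a stuck worker. Note the worker thread itself
            # keeps running in the background, since Python threads can't
            # be killed; a production version might use a subprocess instead.
            pool.shutdown(wait=False)
    for model in deferred:
        scrape_one(model)  # second pass: retry with no time limit
    return deferred
```

This is only a sketch of the scheduling idea; the real fix would also need per-request timeouts inside the scraping code so a hung network call can actually be abandoned rather than just left running on a background thread.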