Feature Suggestion
Posted: 30 Jan 2024, 06:54
Hi there! I'd love to be able to retry ONLY the failed URLs, rather than having to update the project and have it retry ALL of them. On a slower scrape that's pulling back, say, 2k images, if only a few of them fail, it still has to hit every URL again on the update. I know you can set the project to not download duplicates, but it still has to scrape each URL and check it before moving on. It'd be amazing if it could retry just the failed URLs. Maybe a "Retry" button?