Feature Suggestion
- Posts: 1
- Joined: 30 Jan 2024, 06:48
Feature Suggestion
Hi there! I'd love to be able to retry ONLY the failed URLs, rather than having to run an update and have it retry ALL of them. On a slower scrape, if it's pulling back like 2k images but only a few of them fail, it still has to hit all of the URLs on the update. I know you can set the project to not download duplicates, but it still has to scrape each URL and check it before it moves on. It'd be amazing if it could only retry the failed URLs. Maybe a "retry" button.
- Site Admin
- Posts: 2431
- Joined: 02 Mar 2009, 17:02
Re: Feature Suggestion
OK, that makes sense. It won't work for images with expiration dates in their URLs, since you need to get a fresh URL from the page where the image is shown every time. But for simpler websites, it could work.
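For the simpler case, the idea would be roughly this (a minimal Python sketch, not our actual code; the helper names and URL list are just for illustration):

```python
import requests

all_image_urls = ["https://example.com/img1.jpg"]  # placeholder list

def download(url, timeout=30):
    """Try to fetch one image URL; return True on success."""
    try:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()
        # ...write resp.content to disk here...
        return True
    except requests.RequestException:
        return False

# First pass: try everything, keep only the failures.
failed = [u for u in all_image_urls if not download(u)]

# "Retry" button: hit only the failed URLs, skipping the
# thousands that already succeeded. This can't work for
# expiring/signed URLs, which need re-resolving from the
# page where the image is shown.
still_failed = [u for u in failed if not download(u)]
```

The key point is that the failed list would have to be saved with the project so a later "retry" can reuse it instead of re-checking every URL.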