Custom parsers won't help you get rid of unwanted downloads or speed up the download process. They are used to generate additional downloadable addresses that are not present in the original page text. But the addresses you're trying to generate are already in the page text, so why bother?
You should use "Filters -> Excluded URLs" if you want to download only the necessary URLs and nothing else.
Maxim wrote: Custom parsers won't help you get rid of unwanted downloads or speed up the download process. They are used to generate additional downloadable addresses that are not present in the original page text. But the addresses you're trying to generate are already in the page text, so why bother?
You should use "Filters -> Excluded URLs" if you want to download only the necessary URLs and nothing else.
Yes, but the link always gets the thumbnail only; I need to replace it with the original link :(
Currently you have to create 10 separate Custom parsers for 10 previews. But we're going to fix this in the next versions and add some kind of "iterator" inside the custom parser to allow generating tens or hundreds of URLs from one parser.
Maxim wrote: Currently you have to create 10 separate Custom parsers for 10 previews. But we're going to fix this in the next versions and add some kind of "iterator" inside the custom parser to allow generating tens or hundreds of URLs from one parser.
Yeah, it works, thanks mate.
I will wait for the next version. This is great software, thank you :)
OK, version 3.27 has been released with this feature (as well as several other cool features). Now you can use [$1-30$] in the "Result" line to create 30 URLs from a single Regular Expression match. Like this:
[#1]jp-[$1-20$][#3]
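For anyone curious how such a template would expand, here is a minimal Python sketch of the idea only. The helper name, the regular expression, and the sample page text below are hypothetical illustrations, not the application's actual code:

import re

# Rough idea: [#N] placeholders take the regex capture groups, and one
# [$a-b$] iterator expands the template into one URL per number a..b.
def expand_result(template, match):
    # Substitute capture-group placeholders [#1], [#2], ... with match groups.
    filled = re.sub(r"\[#(\d+)\]", lambda m: match.group(int(m.group(1))), template)
    # Expand the [$start-end$] iterator into one URL per number in the range.
    it = re.search(r"\[\$(\d+)-(\d+)\$\]", filled)
    if not it:
        return [filled]
    start, end = int(it.group(1)), int(it.group(2))
    prefix, suffix = filled[:it.start()], filled[it.end():]
    return [f"{prefix}{n}{suffix}" for n in range(start, end + 1)]

# Hypothetical page text containing a thumbnail link.
page_text = '<a href="http://example.com/gallery/thumb-1.jpg">preview</a>'
match = re.search(r'(http://example\.com/gallery/)thumb-(\d+)(\.jpg)', page_text)
urls = expand_result("[#1]jp-[$1-20$][#3]", match)
# urls now holds 20 addresses:
# http://example.com/gallery/jp-1.jpg ... http://example.com/gallery/jp-20.jpg

So a single Regular Expression match can produce the whole series of full-size image addresses instead of just the thumbnail.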
Here is the direct download link for the new version: