Rerun failed URLs


#1

Hi,

Sometimes when I run an extractor, some URLs fail. Fine, I understand that some may time out etc., but you should have a button to re-run JUST the failed URLs rather than having to run the whole URL set again.

I know I can download the log and insert just those URLs, but this is messy, and when I want to integrate it later it will only show the new URL results, not the whole amalgamated list.

Could you please add a “Rerun failed” button that would then merge the data with the last result set, say if the previous run had only been done a few seconds ago?

It's frustrating and unfair to use thousands of URL credits just to get one URL.

Thanks. P.S. Great product!


#2

I had the same problem a couple of days ago. I worked around it by using the classic extractor via the desktop app.

The other way I can think of is to do it via the REST API and build the retry logic in there. Support told me that failed URLs should not eat your credits.
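
Something like the sketch below is roughly what I mean by building the retry logic yourself. Note that the base URL, the /extractors/{id}/run endpoint, the bearer-token auth and the response handling are placeholders I made up for illustration, not the product's actual API, so you'd need to swap in whatever the API docs really describe.

```python
import time
import requests

API_BASE = "https://api.example.com"   # hypothetical API base URL, not the real one
API_KEY = "YOUR_API_KEY"               # your account's API key

def run_with_retries(extractor_id, urls, max_attempts=3, backoff=5):
    """Run the extractor over the given URLs, retrying only the ones that
    fail, up to max_attempts passes. Endpoint path and response shape are
    illustrative placeholders."""
    results = {}
    pending = list(urls)

    for attempt in range(1, max_attempts + 1):
        failed = []
        for url in pending:
            resp = requests.post(
                f"{API_BASE}/extractors/{extractor_id}/run",
                headers={"Authorization": f"Bearer {API_KEY}"},
                json={"url": url},
                timeout=60,
            )
            if resp.ok:
                results[url] = resp.json()   # keep the successful result
            else:
                failed.append(url)           # queue this URL for the next pass

        if not failed:
            break
        pending = failed
        time.sleep(backoff * attempt)        # simple linear backoff between passes

    return results, failed                   # amalgamated results plus any still-failing URLs
```

The point is the same as the feature request above: only the URLs that actually failed go into the next pass, and the successful results from earlier passes are kept, so at the end you get one amalgamated result set instead of having to combine several partial runs by hand.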


#3

Hi,

I agree that may be an option, and that the failed URLs don't count towards the total. However, when you rerun the query in the hope of getting the failed ones the second time, those do count. That's why it's unfair.


#4

I agree that a “Rerun failed” button should certainly be available.

In fact, the best case would be 2-3 automatic attempts to rerun the failed URLs at the end of the run, so I don’t have to prune the log file myself and combine several runs that were supposed to be one.