Implement this and I'll buy Web Dumper instead of SiteCrawler

yoshimitsu

New Member
Hello,

I am about to decide which website grabber I should buy (my system is Mac OS X Leopard). The candidates are SiteCrawler and your product Web Dumper. Web Dumper seems to be the more professional and better maintained app, but it lacks some features that SiteCrawler has (as far as I was able to discover):

1. Custom file type download support
To download custom file types with Web Dumper, you have to edit the "Web Dumper Preferences" file in your Home/Library/Preferences folder and add the file type information as described in the Maxprog forum thread http://www.maxprog.com/forum/about75.html.

Why so complicated? In SiteCrawler I only have to define one rule in the user-friendly GUI to download only the URLs that end with my custom file type (e.g. ".flv" or ".xyz"). As a user I do not want to bother with file type information or edit preference files in a text editor. That is tedious work the user should not have to deal with.

Please add a menu item in the Web Dumper GUI to add any new file extension the user wants.

2. URL string options
In both SiteCrawler and Web Dumper you can specify a [1-9] modifier to process multiple URLs. But as far as I know, only SiteCrawler has the option to specify a {string1, string2, string3} modifier in the URL to process all URLs that have the strings "string1", "string2" and "string3" in them.

Please add these string options to Web Dumper.
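
To make clear what I mean, here is a small Python sketch of how I imagine such patterns could expand into concrete URLs. The exact syntax and the example URL are just my own illustration, not how either app actually works:

import itertools
import re

def expand_url(pattern):
    """Expand [m-n] numeric ranges and {a,b,c} string lists in a URL pattern."""
    # Split the pattern into plain text, [m-n] groups and {a,b,c} groups.
    tokens = re.split(r'(\[\d+-\d+\]|\{[^}]*\})', pattern)
    choices = []
    for token in tokens:
        if re.fullmatch(r'\[\d+-\d+\]', token):
            lo, hi = map(int, token[1:-1].split('-'))
            choices.append([str(i) for i in range(lo, hi + 1)])
        elif re.fullmatch(r'\{[^}]*\}', token):
            choices.append([s.strip() for s in token[1:-1].split(',')])
        else:
            choices.append([token])
    # Build every combination of the choices.
    return [''.join(parts) for parts in itertools.product(*choices)]

# Hypothetical example URL, not a real site:
for url in expand_url("http://www.example.com/{news,blog,gallery}/page[1-3].html"):
    print(url)

This would produce the nine URLs news/page1.html through gallery/page3.html, which is exactly the kind of list I would like the app to process for me.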

3. Regular expressions
In SiteCrawler you can specify optional regular expressions to download only those URLs that match them.
Regular expressions are very powerful, but they should stay optional so that a normal user can ignore them and a power user can benefit from them.

Please add the option to use regular expressions in Web Dumper.
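
As a sketch of what I mean: filtering a list of found URLs by a user-supplied expression is only a few lines. The URLs and the pattern here are made up:

import re

# Hypothetical list of URLs a crawler has found:
urls = [
    "http://www.example.com/videos/clip01.flv",
    "http://www.example.com/docs/readme.html",
    "http://www.example.com/videos/clip02.flv",
]

# Keep only the URLs that match the user's expression,
# e.g. all .flv files under /videos/:
pattern = re.compile(r'/videos/.*\.flv$')
matching = [url for url in urls if pattern.search(url)]
print(matching)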

4. Collect all URLs/links only and do not download anything
Sometimes I just want to know all the links on a website so I can decide later what I want to download. Could you please add an option that just collects all the links of a website that Web Dumper would normally download, instead of actually downloading them?
It would be nice to be able to browse that link list in Web Dumper and download all the links that are marked (or just a single one).
It would also be nice if that list could be exported to an RTF, HTML or CSV file.

If you know the freeware Mac app "Integrity" (http://osx.iusethis.com/app/integrity), you know what I mean (of course without checking whether the links on that site actually work).
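
To make the idea concrete, here is a small Python sketch of what I mean by "collect only": gather every link from a page and write the list to a CSV file instead of downloading anything. The start URL is just a placeholder:

import csv
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag instead of downloading the targets."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical start page:
start = "http://www.example.com/"
collector = LinkCollector(start)
collector.feed(urlopen(start).read().decode("utf-8", errors="replace"))

# Export the collected links to a CSV file for later review:
with open("links.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url"])
    for link in collector.links:
        writer.writerow([link])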

5. Contents of the webfolder (dir or ls)
I know that any website leecher application can only download the files in a website folder that are linked from HTML documents on that site. Files that are on the server but are not linked anywhere simply cannot be downloaded... unless you know the exact URL of that file on the site.

Could you please add an option that simply tries combinations of strings to check whether there is something on the site that was never linked? For example: www.somesite.com/documents/{try any possible string that is 4 characters long}.pdf (see the sketch below).

Of course the best method would be to get the contents of a website folder via a "dir" or "ls" command, but I know this is not possible.
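
Just to illustrate what I mean (and how many requests this can mean), here is a rough Python sketch that probes guessed file names with HEAD requests. The site, the folder and the character set are made up, and with 4 lowercase letters and digits there are already 36^4 = 1,679,616 candidates, so I know this would be slow:

import itertools
import string
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Hypothetical target folder and a 4-character alphabet of letters and digits:
base = "http://www.example.com/documents/"
alphabet = string.ascii_lowercase + string.digits

def exists(url):
    """Send a HEAD request and report whether the server answers with 200 OK."""
    try:
        request = Request(url, method="HEAD")
        with urlopen(request, timeout=10) as response:
            return response.status == 200
    except (HTTPError, URLError):
        return False

for letters in itertools.product(alphabet, repeat=4):
    candidate = base + "".join(letters) + ".pdf"
    if exists(candidate):
        print("found unlinked file:", candidate)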

If these five features are implemented, I will buy Web Dumper immediately.

Thank you

Best regards
 