
Support multiple URL to crawl #143

Open
maaaaz opened this issue Oct 31, 2024 · 1 comment

maaaaz commented Oct 31, 2024

Hello there,

Currently, ODD only allows one URL at a time with the -u option.
However, in many situations a user needs to crawl multiple URLs (just like the -i option in wget).

Could you add a feature allowing a list of URLs to be crawled with ODD?

Cheers !

Owner

KoalaBear84 commented Nov 1, 2024

Hi. Sorry, this won't be supported. It's not easy to implement, and it's easy to do from almost any command line.

On Windows it's possible with a batch file (example) and a file containing the URLs (urls.txt):

for /F "tokens=*" %%A in (urls.txt) do OpenDirectoryDownloader.exe --json --upload-urls --speedtest --quit --url %%A
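On Linux or macOS, a minimal equivalent sketch using a bash read loop (assuming the binary is named OpenDirectoryDownloader in the current directory and urls.txt holds one URL per line):

```shell
#!/usr/bin/env bash
# Run the downloader once per URL listed in urls.txt (one URL per line).
# Binary name and flags mirror the Windows example above; adjust as needed.
while IFS= read -r url; do
  ./OpenDirectoryDownloader --json --upload-urls --speedtest --quit --url "$url"
done < urls.txt
```

Quoting "$url" keeps URLs with special characters intact, and `IFS= read -r` avoids mangling backslashes or surrounding whitespace.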

Or by piping commands together:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_pipelines?view=powershell-7.4
https://www.geeksforgeeks.org/piping-in-unix-or-linux/
https://lucadistasioengineering.com/wincmd-novice/04-pipefilter/index.html
