Get a listing of all the URLs on a website with this simple Linux command.


This command will return a long listing of every visitable URL on the website. Replace https://example.com below with the site you want to crawl, and give this a shot on other websites to see how you go.

user@host:~$ wget --spider --force-html -r -l2 https://example.com 2>&1 | grep '^--' | awk '{ print $3 }'
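To unpack the pipeline: in spider mode, wget prints a status line such as --2016-06-16 08:14:02--  https://example.com/ for every URL it visits, so grep '^--' keeps only those status lines and awk prints the third whitespace-separated field, which is the URL itself. Here is the same one-liner broken across lines for readability:

# Crawl two levels deep in spider mode; nothing is saved to disk.
# https://example.com is a placeholder for the site you want to spider.
wget --spider --force-html -r -l2 https://example.com 2>&1 \
  | grep '^--' \
  | awk '{ print $3 }'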

Very useful if you want to know what is actually on a certain website. Spidering a large site can take a while to return the full listing, but it works very well.

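If you want to keep the results, a minimal variation of the same pipeline (again assuming the placeholder https://example.com) drops duplicate URLs and saves the list to a file:

# Same crawl, but de-duplicate the URLs and write them out.
# https://example.com is a placeholder; urls.txt is just an example filename.
wget --spider --force-html -r -l2 https://example.com 2>&1 \
  | grep '^--' \
  | awk '{ print $3 }' \
  | sort -u > urls.txt

Raise the -l value to crawl deeper than two levels, and add --no-parent if you only want URLs below the directory you started from.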
