- Simple web crawler written in Go
- Given a root URL, it parses the page for links and recursively crawls each discovered site, up to a maximum depth
- Returns a list of the sites found and how often each was encountered (a minimal sketch of this approach follows)
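The crawl can be sketched roughly as below. All names here (`crawl`, `linksFrom`) and the use of `golang.org/x/net/html` are illustrative assumptions, not the actual contents of `main.go`:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"

	"golang.org/x/net/html"
)

// linksFrom fetches a page and returns the absolute links found in its
// anchor tags. Hypothetical helper; the real parsing may differ.
func linksFrom(url string) []string {
	resp, err := http.Get(url)
	if err != nil {
		return nil
	}
	defer resp.Body.Close()

	doc, err := html.Parse(resp.Body)
	if err != nil {
		return nil
	}

	var links []string
	var visit func(n *html.Node)
	visit = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, a := range n.Attr {
				if a.Key == "href" && strings.HasPrefix(a.Val, "http") {
					links = append(links, a.Val)
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			visit(c)
		}
	}
	visit(doc)
	return links
}

// crawl counts url, then recursively follows its links until maxDepth.
func crawl(url string, depth, maxDepth int, freq map[string]int) {
	if depth > maxDepth {
		return
	}
	freq[url]++
	for _, link := range linksFrom(url) {
		crawl(link, depth+1, maxDepth, freq)
	}
}

func main() {
	freq := make(map[string]int)
	// Protocol prepended by hand here, as the usage notes below describe.
	crawl("http://google.com", 0, 2, freq)
	fmt.Println(freq)
}
```

The depth parameter bounds the recursion; URLs are deliberately not deduplicated, since counting repeat sightings is what produces the frequency table.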
- With Go installed, build with `go build main.go`
- Start the program with `./main`
- Enter a URL to search, omitting the protocol; HTTP is assumed
- e.g. `google.com`, `twitter.com`
- Enter the maximum depth to crawl to
- Final output lists the URLs sorted by frequency (see the sorting sketch below)
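Since Go maps are unordered, sorting for output likely looks something like this sketch (assuming a `map[string]int` of counts, as in the crawl sketch above):

```go
package main

import (
	"fmt"
	"sort"
)

func main() {
	// Example counts; in the crawler these come from the crawl itself.
	freq := map[string]int{"google.com": 5, "twitter.com": 2, "example.com": 7}

	// Collect the URLs, then sort them by descending count.
	urls := make([]string, 0, len(freq))
	for u := range freq {
		urls = append(urls, u)
	}
	sort.Slice(urls, func(i, j int) bool {
		return freq[urls[i]] > freq[urls[j]]
	})

	for _, u := range urls {
		fmt.Printf("%s: %d\n", u, freq[u])
	}
}
```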
- Need better input validation for URLs, including checking multiple inputs
- Show progress of URLs found -> Timer shows scrape duration
- Utilize concurrency -> Done for web scraping (see the sketch after this list)
- Get cookies
- Enumerate all network calls made by a website
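The concurrent scraping marked done above presumably follows the standard Go fan-out pattern; this sketch assumes a `sync.WaitGroup` with a mutex-guarded shared map, which may differ from the actual implementation:

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	urls := []string{"http://google.com", "http://twitter.com"}

	var (
		wg   sync.WaitGroup
		mu   sync.Mutex
		freq = make(map[string]int)
	)

	// Fetch every URL in its own goroutine.
	for _, u := range urls {
		wg.Add(1)
		go func(url string) {
			defer wg.Done()
			resp, err := http.Get(url)
			if err != nil {
				return
			}
			resp.Body.Close()

			// The map is shared across goroutines, so guard writes with a mutex.
			mu.Lock()
			freq[url]++
			mu.Unlock()
		}(u)
	}

	wg.Wait()
	fmt.Println(freq)
}
```

Guarding `freq` with a mutex matters here because Go maps are not safe for concurrent writes.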