This is a project that scrapes RSS feeds from URLs provided by users, so that details about those feeds can be easily viewed.
Postgres will need to be installed to store the feeds and users, and Go will need to be installed to build and run the application.
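For example, on a Debian- or Ubuntu-based system the prerequisites might be installed like this (package names vary by platform; these are assumptions, not requirements of this project):

```sh
sudo apt install postgresql   # Postgres server for feed and user storage
sudo apt install golang-go    # Go toolchain for building and running the app
```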
Once you have Go installed, clone this repo with `git clone`, navigate into it, and run `go install` to ensure all the necessary packages are installed and the program is built.
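Spelled out as shell commands (the repository URL and directory name below are placeholders):

```sh
git clone <this-repo-url>    # clone this repository
cd <repo-directory>          # move into the project root
go install                   # fetch dependencies and build/install the program
```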
Manually create a config file in your home directory, `~/.gatorconfig.json`, with the following content:

```json
{
  "db_url": "connection_string_goes_here",
  "current_user_name": "username_goes_here"
}
```
Replace the values with your own credentials; the username can be whatever you would like to start with.
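As a minimal sketch, assuming a local Postgres instance with a database named gator (the database name, user, password, and username below are placeholder examples, not values required by this project), the config might look like:

```json
{
  "db_url": "postgres://postgres:postgres@localhost:5432/gator?sslmode=disable",
  "current_user_name": "alice"
}
```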
The following commands are available:

- `login <username>` - sets the current user in the config
- `register <username>` - adds a new user to the database
- `users` - lists all the users in the database
- `addfeed <name> <url>` - adds a URL to be scraped, with a name
- `agg <time-duration>` - starts the aggregator, which will ping on the set interval
- `browse [limit]` - optional limit with a default of 2
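A typical session might look like the following sketch, assuming the installed binary is named gator and using an example username and feed URL (the binary name and values are assumptions, not taken from this README):

```sh
gator register alice                                            # add a new user to the database
gator login alice                                               # set alice as the current user in the config
gator addfeed "Boot.dev Blog" https://blog.boot.dev/index.xml   # add a feed to scrape, with a name
gator agg 1m                                                    # start the aggregator on a 1-minute interval
gator browse 5                                                  # browse with a limit of 5 (the default is 2)
```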