I have a pretty simple application that I need created for Linux (Ubuntu).
There's a web page with a simple table with data that I need scraped and parsed every 60 seconds. This web page is only accessible via HTTPS and is behind a secure login, which requires saved cookies (including a session ID).
I need a script that will grab the data on this page.
The script must avoid re-logins as much as possible: when it runs again, the web server should see it as a "refresh". If the server does force a re-login, the script should recognize this and log in again.
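To illustrate the kind of session reuse we have in mind, here is a rough Python sketch using only the standard library. The URLs, form field names, and the login-page heuristic are all placeholders; the bidder would adapt these to the actual site:

```python
import os
import urllib.parse
import urllib.request
from http.cookiejar import LWPCookieJar

COOKIE_FILE = "cookies.txt"              # assumption: where cookies persist between runs
DATA_URL = "https://example.com/data"    # placeholder URL
LOGIN_URL = "https://example.com/login"  # placeholder URL

def make_opener():
    """Build a URL opener whose cookies (including the session ID) survive between runs."""
    jar = LWPCookieJar(COOKIE_FILE)
    if os.path.exists(COOKIE_FILE):
        jar.load(ignore_discard=True)
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    return opener, jar

def needs_login(html: str) -> bool:
    """Heuristic: the server returned its login form instead of the data table."""
    lowered = html.lower()
    return "password" in lowered and "<form" in lowered

def fetch(opener, jar, credentials):
    """GET the data page; re-login only if the server forces it."""
    html = opener.open(DATA_URL).read().decode()
    if needs_login(html):
        body = urllib.parse.urlencode(credentials).encode()
        opener.open(LOGIN_URL, body)     # POST the login form (hypothetical field names)
        html = opener.open(DATA_URL).read().decode()
    jar.save(ignore_discard=True)        # keep the session cookies for the next run
    return html
```

This is only a sketch of the expected behavior, not a prescribed implementation; a bidder is free to use `requests` or any other HTTP library that persists cookies.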
Next, the script must parse the table and put the data in a database (most likely MySQL).
After saving the data to the database, it will also need to create a simple text file (CSV) with the same data.
Please see below for a sample of the data we're scraping.
Again, this should be an easy, straightforward application/script for someone who knows what they're doing.
In your bid, please specify:
1. Which programming language(s) you'll be using.
2. How quickly you can create the software.
3. The exact amount of money you'll need for this project. I will not pay additional money to complete the work that is clearly explained here. (I may pay for additional functionality, but I will only pay your bid amount for the core functionality.)
If you have any questions, please let me know.
Application Name One, 2
Application Name Two, 0
Application Name Three, 0
Application Name Four, 1
This data would be parsed from a web page with a table containing this information and two additional columns (which we'll be ignoring).
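To give a sense of scope, here is a rough Python sketch of parsing rows like the sample above and writing them out. The table layout, column names, and file paths are hypothetical, and `sqlite3` stands in for MySQL here only so the sketch is self-contained:

```python
import csv
import sqlite3
from html.parser import HTMLParser

class TableParser(HTMLParser):
    """Collect the first two cells of each <tr>, ignoring the extra columns."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = []
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._in_cell = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row[:2])  # keep only the two columns we want

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

def store(rows, db_path="apps.db", csv_path="apps.csv"):
    """Save rows to a database (sqlite3 as a stand-in for MySQL) and a CSV file."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS apps (name TEXT, value INTEGER)")
        conn.executemany("INSERT INTO apps VALUES (?, ?)", rows)
    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
```

Again, this is a sketch of the expected flow, not a required design; the bidder can use any parsing library and the real MySQL driver of their choice.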