We need someone to go to [login to view URL] and click "login as guest".
In the "From Date" box, enter "01/01/2002", and in the "Thru" box, enter "05/01/2019".
In the "Document Group" dropdown pick "Judgments".
In the "Document Description" dropdown pick "Judgment, Civil".
Then click Search. You should see over 100,000 results. On the left side there will be a "View" button next to each result. Click it.
Then on the left side we need to grab the following:
1) Case #
2) Rec Date (only grab the date, not time)
3) Creditor Name
4) Creditor Street Address
(Normalize the creditor addresses so they all include street, city, state, and ZIP, and verify their accuracy via an API, Google, etc.
Some might be missing the ZIP code, or the address might read "2345 Easto 75th Laney, Albany, NY 2200" when the actual normalized address is "2345 East 75th Lane, Albany, NY 22003".)
5) Creditor City
6) Creditor State
7) Creditor Zip
8) Debtor Name 1
9) Debtor Name 2 (only if there is more than 1)
10) Debtor Name 3 (only if there are more than 2)
11) Court Name
12) Where Perfected
13) Perfected Date
14) Plaintiff Attorney (if available)
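Two of the trickier fields above are the Rec Date (the time must be stripped) and the creditor address (abbreviations and bad ZIPs must be cleaned up). Here is a minimal Python sketch of that cleanup, assuming the raw values look like the examples in the posting; the fix table is purely illustrative, and real verification would still go through a geocoding service such as Google's:

```python
import re
from datetime import datetime

def rec_date_only(raw: str) -> str:
    """Strip the time portion from a raw Rec Date like '05/01/2019 10:32 AM'."""
    date_part = raw.strip().split()[0]
    # Re-emit in MM/DD/YYYY so every row is formatted consistently.
    return datetime.strptime(date_part, "%m/%d/%Y").strftime("%m/%d/%Y")

# Hypothetical word-level fixes based on the example in the posting;
# a real normalizer would defer to a geocoding/address-verification API.
_STREET_FIXES = {"easto": "East", "laney": "Lane"}

def normalize_street(street: str) -> str:
    """Apply known word-level corrections to a street line."""
    return " ".join(_STREET_FIXES.get(w.lower(), w) for w in street.split())

def valid_zip(zip_code: str) -> str:
    """Return the ZIP if it is well-formed, else '' so it gets re-verified."""
    return zip_code if re.fullmatch(r"\d{5}(-\d{4})?", zip_code) else ""
```

Rows whose ZIP comes back empty from `valid_zip` would be queued for lookup against the verification API rather than guessed.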
We would like this sent over as an Excel or CSV file.
* We will only hire someone who can do this within 2 days from now.
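For reference, the CSV deliverable described above (one row per judgment, one column per numbered field) could be assembled like this once the records are scraped; the column names follow the posting's list, while the sample record is hypothetical, not real data from the site:

```python
import csv

# Column order follows the numbered list in the posting.
FIELDS = [
    "Case #", "Rec Date", "Creditor Name", "Creditor Street Address",
    "Creditor City", "Creditor State", "Creditor Zip",
    "Debtor Name 1", "Debtor Name 2", "Debtor Name 3",
    "Court Name", "Where Perfected", "Perfected Date", "Plaintiff Attorney",
]

def write_results(path: str, records: list[dict]) -> None:
    """Write scraped records to CSV; missing optional fields become blanks."""
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS, restval="")
        writer.writeheader()
        writer.writerows(records)

# Hypothetical example record (illustrative only):
sample = {
    "Case #": "2019-CV-000123",
    "Rec Date": "05/01/2019",
    "Creditor Name": "Acme Funding LLC",
    "Court Name": "Albany County Supreme Court",
}
```

`DictWriter` with `restval=""` keeps the optional columns (second and third debtor names, plaintiff attorney) as blanks rather than raising on missing keys, so partially populated pages still export cleanly.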
30 freelancers are bidding an average of $62 for this job
Hi, we have 100k records and 14 fields (columns), so it's quite a big database. We also have to agree on the address normalization; if you want to use a separate API or similar for that, it is a separate project in itself. Thanks
Greetings, I am an experienced professional scraper and have done similar projects in the past, which can be verified from my profile. Allow me to assist you with your requirements. Thanks
Hello, I am a scraping expert and have done many similar projects; please check my feedback and you will see. Can you tell me more details? Then I will provide demo data for you. Thanks, Ramveer
Hi dear employer, I have four friends working with me, so I can do this task in 2 days or sooner and will scrape all the data you need.
Hello, I have experience in web scraping with Python. I can use Selenium, Scrapy, BeautifulSoup and Requests to make the best web scrapers! I hope to work with you!
Hi, I have experience in web scraping and have worked on similar projects. I am confident I can complete this project on time. Let me handle this project. Thanks,
I logged in and followed everything, but after clicking "View" the pages do not show as much detail as you mentioned. If you can provide more details, I have a team, so we will finish it ASAP.