You have to submit a form through an HTTPS request and capture the response. Details will be discussed after you apply. The work must be completed within one day.
Please follow these requirements:
1. The script should be undetectable.
2. Set a real user agent
3. Set other headers
4. Set random intervals between your requests
5. Use a headless browser (advanced)
6. Put some random programmatic sleep calls between requests, add delays after crawling a small number of pages, and use the lowest number of concurrent requests possible (a combined sketch covering points 2 to 4 and 6 follows the delay snippet below)
7. Do not follow the same crawling pattern
Only robots follow the same crawling pattern: unless told otherwise, a programmed bot follows logic that is usually very rigid and specific.
Sites with intelligent anti-crawling mechanisms can easily detect spiders by finding patterns in their actions. Humans generally do not perform repetitive tasks.
Incorporate some random clicks, mouse movements, and other random actions that make a spider look like a human, as in the sketch below.
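A minimal sketch of that kind of human-looking behaviour, assuming Selenium with headless Chrome; the URL, user agent, and offsets are illustrative placeholders, not part of the brief:

import random
import time

from selenium import webdriver
from selenium.webdriver.common.action_chains import ActionChains

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")  # point 5: headless browser
options.add_argument("user-agent=Mozilla/5.0 (Windows NT 10.0; Win64; x64)")  # point 2
driver = webdriver.Chrome(options=options)

driver.get("https://example.com/form-page")  # placeholder URL

# Wander the mouse a little, pausing like a person would
for _ in range(random.randint(2, 5)):
    dx, dy = random.randint(5, 60), random.randint(5, 40)
    ActionChains(driver).move_by_offset(dx, dy).perform()
    time.sleep(random.uniform(0.3, 1.5))

ActionChains(driver).click().perform()  # one click wherever the pointer ended up

driver.quit()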
It's always good to put some delay between requests. I use time.sleep() for that purpose, passing it a value picked from a list of random numbers:

import random
import time

delays = [7, 4, 6, 2, 10, 19]
delay = random.choice(delays)
time.sleep(delay)
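Putting points 2 to 4 and 6 together with that delay idea, a minimal request loop might look like the following; the URL, form fields, and header values are assumptions for illustration, since the real target is only shared after applying:

import random
import time

import requests

session = requests.Session()
session.headers.update({
    # point 2: a real browser user agent (example value)
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36",
    # point 3: other typical browser headers
    "Accept": "text/html,application/xhtml+xml",
    "Accept-Language": "en-US,en;q=0.9",
    "Referer": "https://example.com/",  # placeholder
})

form_data = {"name": "value"}  # placeholder form fields

for _ in range(3):  # one request at a time keeps concurrency at its lowest (point 6)
    response = session.post("https://example.com/submit", data=form_data)  # placeholder URL
    print(response.status_code, len(response.text))
    time.sleep(random.uniform(2, 10))  # point 4: random interval between requests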
47 freelancers are bidding on average $425 for this job
Hi. Let me know more details about this project and we can discuss further. I think we have the best tool for a job like this. Thanks, I'm waiting for the details and hope to collaborate.
Hello. I have extensive experience with this kind of project, and I am sure I can complete it in a day. I will do my best to deliver it perfectly. Thanks.
Hello, I am a Python developer with a lot of experience with HTTP requests. I can do your project. Please contact me to discuss further. Thank you.