Type: Posts; User: Maverick22
I don't really understand the question.
By creating functionality to sleep, you can repeatedly call the same code over and over again and simply configure how much time passes between each call.
So...
SBR doesn't seem to like curly braces, but that is what I mean. The actual scraping code is not set on a timeout; it just executes some tasks, and some code that controls the scraper will designate when...
while(1) {
    doWorkToScrapeData();
    sleep(2000);
}
protected final void sleep() {
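Since the forum mangles the braces, here is a runnable Java sketch of the same idea. `doWorkToScrapeData` is a placeholder for the real scraping work, and the demo runs a fixed number of iterations instead of an infinite `while(true)` loop:

```java
public class ScraperLoop {
    // Placeholder standing in for the real scraping work
    static void doWorkToScrapeData() {
        System.out.println("scraping...");
    }

    // Call the scrape `iterations` times, sleeping between calls;
    // returns how many scrapes actually ran.
    static int run(int iterations, long delayMillis) throws InterruptedException {
        int count = 0;
        for (int i = 0; i < iterations; i++) {
            doWorkToScrapeData();
            count++;
            Thread.sleep(delayMillis); // a real scraper would loop forever with sleep(2000)
        }
        return count;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("scraped " + run(3, 100) + " times");
    }
}
```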
How is setting a timeout/sleep time a challenge?
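For what it's worth, the JDK also has this built in, so you don't have to hand-roll the sleep loop. A sketch using `ScheduledExecutorService` (the task body and the helper method are placeholders for illustration):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ScheduledScraper {
    // Run `task` repeatedly with a fixed delay between runs; stop after
    // `runs` executions and return how many times it actually ran.
    static int runRepeatedly(Runnable task, long delayMillis, int runs)
            throws InterruptedException {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        AtomicInteger count = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(runs);
        scheduler.scheduleWithFixedDelay(() -> {
            task.run();              // the scrape itself
            count.incrementAndGet();
            done.countDown();
        }, 0, delayMillis, TimeUnit.MILLISECONDS);
        done.await();                // a real scraper would just run indefinitely
        scheduler.shutdownNow();
        return count.get();
    }

    public static void main(String[] args) throws InterruptedException {
        int ran = runRepeatedly(() -> System.out.println("scraping..."), 50, 3);
        System.out.println("ran " + ran + " times");
    }
}
```

`scheduleWithFixedDelay` measures the delay from the end of one run to the start of the next, which is usually what you want for a scraper so slow pages don't cause overlapping runs.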
I would have a conversation with each developer and "designer".
After the whole thing is finished, I would try to get a copy of all the source code, including all the database scripts, and...
Plus... a dedicated server running a scraper makes your life easier... not harder.
Sometimes more computers is more complexity... but not in this case. Not in this case at all.
Dude... go to a pawn shop. Find the cheapest computer you can find. Put Linux on it. Deploy all your code there. Then thank us later.
You are paying 3000€ for a website scraper? That only scrapes one site?