0 */12 * * * /usr/bin/python3 /home/main.py >> file1.txt
0 */12 * * * /usr/bin/python3 /home/main2.py >> file2.txt
0 */12 * * * /usr/bin/python3 /home/main3.py >> file3.txt
The crontab above is what I've tried so far.
Explanation: All three scripts are copies of the same web scraper, which pulls data from 14,000 links listed in a Google worksheet; each copy scrapes a third of the links. I can tell when a script is running because it sets a status cell in the worksheet to "On". When the cron jobs fire, all three cells immediately change to "On", but only one section of the worksheet actually updates with new data, which tells me that only one of the three scripts is really working, usually the second one (main2.py). Eventually the second script stops (it doesn't finish) and the first script starts working. None of them output any error information, so I'm struggling to understand the issue. Running the whole task as a single script works fine, but it takes roughly 10 hours to complete.
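For context, each script is shaped roughly like this. This is a simplified sketch from memory, not my exact code; the sheet name, credentials path, status cells, and the way I split the links are placeholders:

import gspread
import requests

# Sketch only: sheet name, status cells, and columns are placeholders.
SHARD = 0                            # 0, 1, or 2 depending on the script (main.py = 0, etc.)
STATUS_CELLS = ["B1", "B2", "B3"]    # one "On"/"Off" cell per script

gc = gspread.service_account(filename="/home/creds.json")
ws = gc.open("scraper-links").sheet1

ws.update_acell(STATUS_CELLS[SHARD], "On")   # this is the cell I watch

# Read all ~14,000 links from column A and take this script's third.
links = ws.col_values(1)
my_links = links[SHARD::3]

for i, url in enumerate(my_links):
    resp = requests.get(url, timeout=30)
    # ... real parsing happens here; writing the status code back is just a stand-in ...
    row = SHARD + 3 * i + 1          # the 1-indexed row this link came from
    ws.update_cell(row, 2, resp.status_code)

ws.update_acell(STATUS_CELLS[SHARD], "Off")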
TLDR: All three scripts "turn on", but only the second one actually does any work.
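One thing I only just realized: the redirects in my crontab only capture stdout, and I believe cron discards stderr (or tries to mail it) unless it's redirected too, which might explain why I see no error information. I assume changing the jobs to something like this would at least log any Python tracebacks:

0 */12 * * * /usr/bin/python3 /home/main.py >> file1.txt 2>&1
0 */12 * * * /usr/bin/python3 /home/main2.py >> file2.txt 2>&1
0 */12 * * * /usr/bin/python3 /home/main3.py >> file3.txt 2>&1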
FYI, I'm doing this on a trial DigitalOcean Ubuntu server. Is it possible that the scripts are too demanding for the server?
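I haven't confirmed this, but if the droplet is running out of RAM, I'd expect the kernel's OOM killer to silently kill processes, which would match the no-error-output symptom. I assume something like this (run while the jobs are going, or right after one dies) would show it:

free -h                              # how much memory is left
dmesg | grep -i "killed process"     # kernel log lines left by the OOM killer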
The only solution I can think of is to create a separate server for each script, but that's probably a bad idea (I don't know a thing about servers).
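If it does turn out to be resource contention, I wonder whether I could just stop the three jobs from running at the same time instead of buying more servers. I assume wrapping each job in flock (which waits on a shared lock file by default) would make them run one after another, roughly like this:

0 */12 * * * flock /tmp/scraper.lock /usr/bin/python3 /home/main.py >> file1.txt 2>&1
0 */12 * * * flock /tmp/scraper.lock /usr/bin/python3 /home/main2.py >> file2.txt 2>&1
0 */12 * * * flock /tmp/scraper.lock /usr/bin/python3 /home/main3.py >> file3.txt 2>&1

Though with each run taking roughly 10 hours on its own, three serialized runs wouldn't fit inside the 12-hour window, so maybe that's the wrong direction.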