In my company, I have to program boards that run on embedded Linux.
We have a test sequence that starts an Apache HTTP server on port 80, which hosts 7 files for the board to download.
The board uses wget to download the files hosted by the server; the connection is established, but there is no response from the server, so wget keeps retrying until it times out.
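For reference, the behaviour can be reproduced on a desktop with the same kind of wget invocation the test sequence uses. Everything below is illustrative (directory, port, file name, and retry/timeout values are placeholders, not the actual ones from the sequence):

```shell
# Serve a directory locally and fetch one file with explicit retry and
# timeout limits, mimicking the board's wget behaviour.
# All names and values here are illustrative.
mkdir -p /tmp/wget-demo && cd /tmp/wget-demo
head -c 1048576 /dev/zero > sample.bin          # 1 MB stand-in file
python3 -m http.server 8000 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
wget --tries=3 --timeout=10 -O downloaded.bin http://127.0.0.1:8000/sample.bin
kill "$SERVER_PID"
```

With `--tries` and `--timeout` set explicitly, a dead server shows up as a bounded number of retries instead of an indefinite hang.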
Here is a screenshot showing the issue (the test sequence monitors the board over a serial connection):
Screen of the serial communication
The first file the board tries to download is 53 MB.
I tried to see if I could download the files manually in a browser: I can download all of them except the 53 MB file, for which Google Chrome shows an "ERR_EMPTY_RESPONSE" error.
I tried rebooting the computer, and this time the board managed to download all the files from the server, including the 53 MB one. I could also download the files manually in Google Chrome.
But sometimes the problem comes back.
I checked the Apache access log: all the GET requests have HTTP status 200 (OK), including the requests made when I can't download the 53 MB file.
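One thing worth noting: in Apache's Common Log Format, the status code can read 200 even when the transfer was cut short, because the last field of each line is the number of body bytes actually sent, and that can be smaller than the file size. A sketch with a made-up log line (the IP, path, timestamp, and byte count are invented for illustration):

```shell
# Common Log Format ends with: "REQUEST" STATUS BYTES-SENT
# Comparing the bytes-sent field against the real file size reveals a
# truncated response even when the status is 200.
printf '%s\n' \
  '192.168.0.20 - - [10/May/2024:10:00:00 +0200] "GET /big.bin HTTP/1.1" 200 12345678' \
  > /tmp/access.sample
awk '{print "status:", $(NF-1), "bytes sent:", $NF}' /tmp/access.sample
```

If the bytes-sent field is well below 53 MB for the failing requests, the server is dropping the response mid-transfer rather than refusing the request.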
Edit: The problem seems to come from a file size limit. I reduced the 53 MB file to 49 MB by removing some lines and managed to download it. Then I undid my changes and could no longer download the file.
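Rather than editing lines out of the real file, the exact threshold can be narrowed down with generated dummy files of known sizes (the directory and file names below are placeholders; the files would then be copied into the Apache DocumentRoot and fetched one by one):

```shell
# Generate dummy files around the suspected ~50 MB boundary.
# Paths are illustrative; copy the files into your DocumentRoot to test.
mkdir -p /tmp/apache-size-test
for mb in 48 50 52 54; do
    head -c "$((mb * 1024 * 1024))" /dev/zero > "/tmp/apache-size-test/test_${mb}MB.bin"
done
ls -l /tmp/apache-size-test
```

Whichever file is the smallest one that fails pins down the limit far more precisely than trimming lines from the original file.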
There are many posts about two options in php.ini: post_max_size and upload_max_filesize.
In my php.ini file:
post_max_size = 8M
upload_max_filesize = 50M
However, I think those options only apply to POST requests, whereas I'm only sending GET requests.
Nevertheless, I set both parameters to 60M and restarted the Apache server, but I still couldn't download the 53 MB file.
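For completeness, this is the change I tried (the php.ini path and the restart command vary by distribution; on Debian-based systems the restart would be something like `sudo systemctl restart apache2`):

```ini
; php.ini — both values raised above the 53 MB file size
post_max_size = 60M
upload_max_filesize = 60M
```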