PHP running out of memory when uploading to S3 bucket - Laravel 5.2
The first thing you'll want to do is upload a very small file, say 100 KB. Then a 1 MB file, and so on, until the application crashes. That tells you the approximate file size at which memory runs out.
I know you'd prefer to use Laravel to upload the files to S3, but at this stage, depending on the size of the files you're uploading, you might want to use a command-line tool like s3cmd instead. The reasoning is that uploading inside the web stack, as you are doing now, has several problems:
1-It does not scale. A 1 MB file might work, but a 1.00001 MB file might crash the application.
2-It requires PHP to consume a lot of memory, so several requests doing the same operation at once can exhaust your web stack and cause downtime. Running the upload in a separate process, ideally through a queue, lets the system scale.
3-If you insist on having Laravel do the upload, do it in such a way that the CLI performs the actual transfer rather than the web stack. Again, use a queue for this; see the sketch after this list.
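As a rough illustration of point 3, here is a minimal sketch of a queued Laravel 5.2 job that does the transfer from a queue worker (a CLI process) instead of the web request. The class name UploadFileToS3, the "s3" disk name, and the paths are assumptions for the example, not something from the original question:

<?php

namespace App\Jobs;

use App\Jobs\Job;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class UploadFileToS3 extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $localPath;
    protected $s3Key;

    public function __construct($localPath, $s3Key)
    {
        $this->localPath = $localPath;
        $this->s3Key = $s3Key;
    }

    public function handle()
    {
        // Open a read stream so the worker never holds the whole file in memory.
        $stream = fopen($this->localPath, 'r');

        // put() accepts a stream resource in Laravel 5.x and hands it to
        // Flysystem's putStream(), which uploads in chunks.
        Storage::disk('s3')->put($this->s3Key, $stream);

        if (is_resource($stream)) {
            fclose($stream);
        }

        // Clean up the local copy once the transfer succeeds.
        @unlink($this->localPath);
    }
}

In the controller you would stash the upload on local disk and then dispatch(new UploadFileToS3($tempPath, 'uploads/video.mp4')); the web request returns immediately and the queue worker, running under the CLI, does the heavy lifting.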
As general advice: when you find yourself raising the memory limit to astronomical values just to make a script work, it's very likely you're doing something wrong, something that could be done far more efficiently another way.
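To make that concrete: the usual mistake in this scenario is reading the whole file into a string before handing it to the filesystem layer. A minimal sketch of the difference, assuming a configured "s3" disk and a $localPath that is just an example placeholder:

<?php

use Illuminate\Support\Facades\Storage;

$localPath = '/tmp/video.mp4'; // assumed path to the file on local disk

// Loads the entire file into PHP's memory first, so memory usage grows
// with file size and eventually hits memory_limit.
Storage::disk('s3')->put('videos/video.mp4', file_get_contents($localPath));

// Passes a stream resource instead; Flysystem uploads it in chunks,
// so memory usage stays roughly flat regardless of file size.
$stream = fopen($localPath, 'r');
Storage::disk('s3')->put('videos/video.mp4', $stream);
fclose($stream);

With the streaming version there is no need to touch memory_limit at all, which is the point of the advice above.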