Amazon S3 - Fatal Error - Allowed memory size exhausted in s3.php

Chris Paschen
On a number of entries where we have files uploaded to Amazon S3, we are getting the following error message:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 76823192 bytes) in /home/.../public_html/components/com_fabrik/libs/amazons3/S3.php on line 1274

Although I thought it might have something to do with the size of the file uploaded to Amazon S3, there are other entries with larger files that are working properly.

This just started happening on entries that were working fine until sometime in the past week (no data has been changed in the Fabrik items, and the files haven't changed on S3).

Also, if I comment out line 1274, so that section of S3.php looks like this:

/**
 * CURL write callback
 *
 * @param resource &$curl CURL resource
 * @param string &$data Data
 * @return integer
 */
private function __responseWriteCallback(&$curl, &$data) {
    if ($this->response->code == 200 && $this->fp !== false)
        return fwrite($this->fp, $data);
    else
        // $this->response->body .= $data;
        return strlen($data);
}

(i.e. commenting out the line $this->response->body .= $data;)

Then things seem to work properly. We can access the item, and the images and video stored on S3 work fine.

S3 integration is way out of my coding experience, so I'm not at all sure what is going on here.

Just wondering if anyone else has seen anything like this.

NOTE: I'd share the link to the page; however, these are restricted access (pay only) files, so I can't share the links :-(
 
This is my first time using Amazon S3, and I'm having the same issue. The fix above works for me as well. If it is a good solution, it sure would be nice to have it incorporated permanently.
 
Looks like I'm running into multiple issues here.

1) The download script isn't working for large files because of the memory exhaustion issue.

2) Since I'm using Amazon S3 to store these files, if I turn off the download script it works fine for large PDF, PPT, etc. files. However, if I want to link to a video file such as an MP4, it pumps it into a video container, and the file src isn't correct in that video container anyway. For instance, it gives this src for the video: http://www.earthtosky.org/http://[b.../videos/Energy-Budget-Lin-Chambers-Part-1.mp4

rather than:
http://[bucket-name].s3.amazonaws.c...68&Signature=q2b0LI8xbE91pErU7+Xj/L0pyXA=">

In the file upload plugin, I have "show media in form" set to "no." That should apply to videos too. But I do understand that's primarily for images.

So, I'm not sure what to do for a solution. I'll continue to try to look for a work around.
 
To solve (2), I found that "custom" folder:

/plugins/fabrik_element/fileupload/element/custom

So, I can add direct links for my video files to the default view. That will work if I can't get the memory exhaustion issue fixed.
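
In case it helps anyone else heading down this path: what I'm dropping in there is basically just a plain link. I'm guessing at the variable name, since I haven't fully mapped out what the custom template actually receives - treat $fileUrl below as a hypothetical placeholder for however your copy exposes the stored S3 URL:

<?php
// /plugins/fabrik_element/fileupload/element/custom/video_link.php
// Minimal sketch: render a plain download link instead of a <video> object.
// $fileUrl is a hypothetical placeholder for the stored/signed S3 URL -
// substitute whatever variable the custom template is actually handed.
echo '<a href="' . htmlspecialchars($fileUrl) . '">Download video</a>';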

(sorry if I'm taking over this thread...seems applicable)
 
I suspect the reason it started failing without you changing anything in Fabrik / S3 is simply that something else has changed in J! that is increasing the memory usage on your page loads, and you've just hit the 128M limit.

My advice would be to just whack your memory_limit up to 256M in php.ini. There isn't going to be any other fix, other than "use smaller files".
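
E.g., just a sketch - where you set it depends on your host (the main php.ini, a local .user.ini, or a quick test in code):

; php.ini or .user.ini
memory_limit = 256M

// or, temporarily, near the top of the page load:
ini_set('memory_limit', '256M');

If that makes the fatal error go away, you know it's purely the memory ceiling.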

BTW, I'm thinking about grabbing the latest s3.php ...

https://github.com/tpyo/amazon-s3-php-class/commits/master

... as the copy we ship is about 5 years old. There hasn't been a huge number of fixes since then, but the author is still maintaining it.

-- hugh
 
I've never tried keeping MP4s on S3, and obviously nobody else has either. :)

I'll take a look, that should be a quick fix.

-- hugh
 
FYI, I tried the new s3.php code and it didn't help my issue, and bumping the memory limit up to 256M didn't help either. No biggie. We're trying to find a way to deal with my client's massive video archive. I think *not* using the download script will work, and that /fileupload/element/custom folder will solve the other issue.

Thanks.
 
OK. But just to be clear, this isn't a "Fabrik issue", it's just an issue. When dealing with huge files that are stored remotely, where your server is acting as "piggy in the middle", it has to read the entire file from S3 in one gulp so it can then serve it up to the browser.
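
If you did want to keep the download script for the really big files, the only realistic route is streaming to disk rather than into the response body - the callback Chris was looking at only buffers in RAM when no file pointer is set. A rough, untested sketch, assuming the bundled copy still follows the upstream tpyo signature S3::getObject($bucket, $uri, $saveTo), with $bucketName / $uri standing in for whatever the element has stored:

// Rough sketch only - assumes the class matches the upstream
// tpyo amazon-s3-php-class API (S3::getObject($bucket, $uri, $saveTo)).
// Passing a file handle sets $this->fp, so __responseWriteCallback()
// writes each chunk to disk instead of appending to $this->response->body.
$tmp = tempnam(sys_get_temp_dir(), 's3_');
$fp  = fopen($tmp, 'wb');
$ok  = S3::getObject($bucketName, $uri, $fp);
fclose($fp);

if ($ok !== false)
{
    header('Content-Type: video/mp4');
    header('Content-Length: ' . filesize($tmp));
    readfile($tmp); // streams to the browser without loading it all into memory
}

unlink($tmp);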

I just committed a fix for that file path thing in videos, btw. Not that it'll help you much, but it needed fixing.
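
(The wrong src in your example looks like the site URL being prepended to a value that is already a full S3 URL; the fix amounts to a guard along these lines - illustrative only, not the exact committed code:

// Illustrative only - not the exact commit.
// Only prefix the site URL when the stored value is a relative path;
// fully qualified (signed) S3 URLs get left alone.
if (!preg_match('#^https?://#i', $src))
{
    $src = COM_FABRIK_LIVESITE . $src;
}

... where COM_FABRIK_LIVESITE is our site root URL constant.)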
 
Correct. Fabrik is doing its job. I'll be linking to the files. The only issue was that the S3-stored videos were being displayed in the video objects with the incorrect filename: it was appending the domain URL to the filename + Amazon S3 location. But you have fixed that. I'll use the custom folder anyway, because I just want links and not a video object.

My client will just not have a download script, which is just fine.

Thanks again.
 
Cool. Glad you discovered the custom model overrides btw - good work! I don't think anyone has ever discovered and figured those out unaided, and found a use for them before.

-- hugh
 