S3 reasonable file size (download) and alternatives

Hi,
Using the DreamFactory REST wrapper, I'm downloading a file stored in an S3 bucket from AngularJS.
I call DreamFactory with an authenticated GET using include_properties=true&content=true, so the file comes back base64-encoded in the JSON response.
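For reference, the call looks roughly like this (a minimal sketch; the host, service name, app name, and file path are placeholders for my actual setup):

```javascript
// Sketch of the current approach inside an Angular controller/service,
// with $http injected by Angular. Host, service name, app name and file
// path below are placeholders.
$http.get('https://example.com/rest/s3/mybucket/bigfile.pdf', {
  params: { include_properties: true, content: true },
  headers: {
    'X-DreamFactory-Application-Name': 'myapp',
    'X-DreamFactory-Session-Token': sessionToken
  }
}).then(function (response) {
  // The file content comes back base64-encoded inside the JSON payload,
  // so the whole file has to sit in memory before it can be decoded.
  var bytes = atob(response.data.content);
  // ... wrap the bytes in a Blob and trigger the download ...
});
```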
Of course, this only works until the file reaches a certain size: you can hardly download a big file this way (roughly 1 MB is fine, but I suppose the limit depends on many factors: browser memory, bandwidth, the server, and the like).
So what is the “right” DreamFactory way of downloading large files? Creating an external service (in whatever language/platform) and then adding it to DreamFactory’s list of services? Something else?
Thanks.

Carlo, have you experienced specific performance degradation, or are you just anticipating the resource hit needed to encode larger files? What size of files are you wanting to serve?

Generally speaking, if there is some functionality you wish to incorporate that the DSP is not built to handle, the best approach is to invoke it through a remote web service. Obviously, that service would have to already exist, or you would need to build it.

Hi Drew,
I don’t have problems getting ~1 MB files, but a 2 MB file crashes Chrome.
Regarding the (new) web service: of course, in this case I’d have to build it, and that’s what I’ve started doing.
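For anyone interested, here’s a minimal sketch of the direction I’m taking, assuming Node.js with Express and the AWS SDK (bucket name, route, and port are placeholders). The point is that the object is streamed from S3 straight to the client instead of being base64-encoded into a JSON payload, so memory use stays flat regardless of file size:

```javascript
// Minimal streaming proxy sketch: Node.js + Express + AWS SDK v2.
// Bucket name, route, and port are placeholders for illustration.
var express = require('express');
var AWS = require('aws-sdk');

var app = express();
var s3 = new AWS.S3(); // credentials picked up from the environment

app.get('/files/:key', function (req, res) {
  var params = { Bucket: 'mybucket', Key: req.params.key };
  // Pipe the S3 object to the response: no base64, no buffering
  // of the whole file in memory.
  s3.getObject(params)
    .createReadStream()
    .on('error', function (err) { res.status(500).end(err.message); })
    .pipe(res);
});

app.listen(3000);
```

Once it works, the plan is to register it in DreamFactory as a remote web service, as Drew suggested.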

Do clients other than Chrome crash when retrieving the file? If the DSP server isn’t responding with a 500 error of some sort, then it doesn’t appear to be the DSP that’s crashing. Can you fetch these large files from a client like cURL?
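For example, something along these lines (the host, app name, and file path are placeholders, and the headers assume a DSP session, so adjust to your setup):

```
curl -v \
  -H "X-DreamFactory-Application-Name: myapp" \
  -H "X-DreamFactory-Session-Token: <session token>" \
  "https://example.com/rest/s3/mybucket/bigfile.pdf?include_properties=true&content=true" \
  -o response.json
```

If cURL retrieves the whole payload cleanly, that would point at the browser running out of memory while decoding the base64, not at the DSP.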