Video uploads

My little website lets users upload videos of themselves (no, it's nothing dirty; it's educational, in fact). I use a streaming distribution from CloudFront so people can view their videos. My webserver is co-located in NorCal, and the code uploads videos to the webserver before sending them out to the CDN.

Things have been going well enough that I now have users uploading videos from Europe, Asia, etc. Which is the problem. A lot of them seem to go away frustrated because upload times are so loooong.

I'm thinking of sticking an SSD in my webserver for the video upload folder (sequential writes are faster than spinning disk, at least until the cells wear out). It's not clear this would make a difference if network latency is the issue, though.

I was wondering if others had run into this issue and for suggestions in general.

An SSD won't help if it's not a server-side disk problem (I'd expect complaints from US users as well if that were the case).

Assuming they are not bandwidth constrained on their upstream, you are probably running into buffer size problems (specifically the receive window on the server and the send buffer on the client). As latency increases, you need larger buffers to allow for more data to be in flight, particularly with higher-bandwidth connections. You only have control over the receive window on the server, so you might not be able to solve it with tuning alone; but if the server's buffers are the constraining factor, you can potentially increase them.
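To see why latency forces bigger buffers: the bandwidth-delay product is the amount of data that must be in flight (sent but not yet acknowledged) to keep the pipe full. A quick sketch with made-up numbers (a 10 Mbit/s client uplink and a 150 ms round trip to the server):

```shell
# Bandwidth-delay product = bandwidth (bytes/sec) * round-trip time (sec).
# Numbers below are hypothetical, purely to illustrate the scale involved.
BW_BITS_PER_SEC=$((10 * 1000 * 1000))   # 10 Mbit/s uplink
RTT_MS=150                              # 150 ms round trip (e.g. Asia -> NorCal)
BDP_BYTES=$(( BW_BITS_PER_SEC / 8 * RTT_MS / 1000 ))
echo "$BDP_BYTES"   # 187500 bytes (~183 KB)
```

If the effective window is smaller than that (~183 KB here), the sender stalls waiting for ACKs and throughput drops no matter how fast either end's disk is.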

What OS and software are you using for the server (that terminates the TCP connections) - including versions?

You could also use small EC2 servers regionally to relay the files, which would reduce the latency and work around the problem more effectively, but that carries more cost with it.

Thanks Patrick, the buffer explanation makes sense.

My webserver is running Linux kernel 2.6.18-8.el5 (hey, it works), Apache 2.2 (but I want to upgrade to 2.4 ASAP), and PHP 5.3.6.

I like the EC2 idea. But I'm going to need a lot of memory since some of these video files are quite large, which could drive up costs. I'll take a look at the cost angle anyway.


You can tune the receive buffer size on Linux:
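For example, something like the following (the key names are the standard Linux sysctls; the sizes are illustrative, not recommendations for your workload):

```shell
# net.core.rmem_max caps the largest receive buffer any socket may get.
# net.ipv4.tcp_rmem is "min default max" -- the kernel autotunes each TCP
# connection's receive buffer within this range.
# Example values: allow up to 16 MB for high bandwidth-delay connections.
sysctl -w net.core.rmem_max=16777216
sysctl -w net.ipv4.tcp_rmem="4096 87380 16777216"

# To persist across reboots, put the same keys in /etc/sysctl.conf.
```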

You can also specify it per-server in Apache, but it's better to do it at the Linux level and make sure window scaling is enabled:
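A sketch of both knobs (window scaling is required for windows larger than 64 KB per RFC 1323, and is on by default in kernels of this vintage; the Apache directive shown is `ReceiveBufferSize`, which overrides the OS default for httpd's sockets):

```shell
# Confirm/enable TCP window scaling -- without it, large buffers are wasted
# because the advertised window is capped at 64 KB.
sysctl -w net.ipv4.tcp_window_scaling=1

# Apache alternative (in httpd.conf) -- sets the TCP receive buffer for
# httpd's listening sockets only; 0 means "use the OS default":
#   ReceiveBufferSize 16777216
```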

That should buy you some performance for large uploads over high latency (though you will still be subject to the client’s send buffer which you don’t control).

That’s an awesome site. I’ll play with the values on my dev server first.

I’m curious to know how it would affect the other connections that are serviced by my Webserver. This should be fun.

OK, I spent some time going over the kernel tuning parameters and picked and chose which ones to apply to my Webserver. It definitely helped non-US visitors because I’m seeing videos from Europe and Asia now.

On another note, is there a way to simulate what happens when I get a lot of visitors at the same time? WebPagetest is great at pulling up single pages. I ran ab from home against my website before and after making these changes, and was quite dismayed that before the changes I was getting huge latencies with ~10 simultaneous connections.

It would be great to be able to run something like webpagetest from various locations around the globe simultaneously. There would be huge potential for abuse though I think.

What you are looking for is usually called “load testing” (and it is usually something you have to pay for). Browser Mob has a pretty good low-cost offering in the space that uses real browsers.

If you just need to generate a lot of load and then separately measure how the server is doing, Apache Bench is pretty popular:
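A typical invocation looks something like this (the URL and numbers are placeholders; `-n` is the total request count and `-c` is the concurrency level):

```shell
# Fire 1000 requests at the page, 10 concurrently, and report latency
# percentiles and requests/sec when done.
ab -n 1000 -c 10 http://www.example.com/index.html
```

Note that ab hammers a single URL from a single machine, so it measures server capacity more than it reproduces geographically distributed, high-latency clients.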