Hi guys,
If anyone here has experience with Varnish / Nginx, I could use some help.
I need to set them up (emphasis on Varnish, but Nginx help would also be very welcome) to cache and serve all static content, including video (H.264 .mp4) streaming.
I am running Varnish 3.0.2-Streaming2.
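The idea is that Varnish sits in front and Nginx sits behind it actually serving the files, so the backend definition would be along these lines (the address and port here are placeholders for illustration only, not necessarily my real values):
Code:
# Nginx origin behind Varnish that serves the static files and the video
# (placeholder address and port for illustration)
backend nginx_static {
    .host = "127.0.0.1";
    .port = "8080";
}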
My problem is that if I change the return for the video stream from pipe (which is a straight passthrough) to lookup, the video arrives more slowly than through pipe and is broken (frozen images and, I think, some audio problems as well).
My VCL looks like this:
Code:
backend default {
    .host = "mysite.com";
    .port = "80";
}

sub vcl_recv {
    if (req.http.Accept-Encoding) {
        if (req.url ~ "\.(jpg|png|gif|gz|tgz|bz2|tbz|mp3|ogg|mp4)$") {
            # No point in compressing these
            remove req.http.Accept-Encoding;
        } else if (req.http.Accept-Encoding ~ "gzip") {
            set req.http.Accept-Encoding = "gzip";
        } else if (req.http.Accept-Encoding ~ "deflate") {
            set req.http.Accept-Encoding = "deflate";
        } else {
            # unknown algorithm
            remove req.http.Accept-Encoding;
        }
    }

    set req.backend = default;

    # video host is currently piped straight through to the backend
    if (req.http.host == "videostream.mysite.com") {
        return (pipe);
    }

    if (req.url ~ "\.(jpeg|jpg|png|gif|ico|swf|js|css|txt|gz|zip|rar|bz2|tgz|tbz|html|htm|pdf|mp4)$") {
        unset req.http.cookie;
        set req.grace = 5m;
        return(lookup);
    }

    return(pass);
}

sub vcl_fetch {
    set beresp.do_stream = true;
}
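For reference, this is roughly the direction I have been experimenting with instead of the pipe: let the .mp4 requests on the video host go through lookup, strip the cookies so they are cacheable, and only enable streaming delivery in vcl_fetch for the video responses. The hostname, grace and TTL values here are just placeholders, and I am not sure this is the right approach at all:
Code:
sub vcl_recv {
    # Cache the video objects instead of piping them straight to the backend
    if (req.http.host == "videostream.mysite.com" && req.url ~ "\.mp4$") {
        unset req.http.Cookie;
        set req.grace = 5m;
        return (lookup);
    }
}

sub vcl_fetch {
    # Only stream the video responses while they are being fetched;
    # everything else is fetched in full before delivery
    if (req.url ~ "\.mp4$") {
        set beresp.do_stream = true;
        set beresp.ttl = 1h;
    }
    return (deliver);
}
I cannot tell whether limiting do_stream to the video objects like this would make a difference, or whether something else entirely is causing the slow and broken playback when I switch from pipe to lookup.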