Difference between pages "Package:Varnish" and "Package talk:Nginx"

{{Ebuild
|Summary=Varnish is a state-of-the-art, high-performance HTTP accelerator
|CatPkg=www-servers/varnish
|Maintainer=
|Homepage=https://www.varnish-cache.org/
}}
'''Varnish''' is a web cache and HTTP accelerator. Varnish either serves cached content or retrieves content from the backend server and caches it, reducing I/O pressure on web servers.


==Install==
===Emerge===


Install {{Package|www-servers/varnish}}:
<console>###i## emerge www-servers/varnish</console>


==Configuration==
{{PageNeedsUpdates}}


{{note|Because Varnish runs as a local proxy in front of your web server, the backend must be made aware that it is behind a proxy and be configured for [http://en.wikipedia.org/wiki/X-Forwarded-For X-Forwarded-For] (or similar), so that web applications see real client IP addresses instead of 127.0.0.1 or localhost.}}
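Varnish 4 appends the client address to the X-Forwarded-For header on its own; how the backend consumes that header depends on the software. As a rough sketch, if the backend happens to be nginx built with the realip module, something along these lines restores the real client address (the vhost file path and trusted proxy address are assumptions):
{{file|name=/etc/nginx/sites-enabled/localhost|desc=example realip settings (sketch)|body=
# Only trust X-Forwarded-For when the request comes from the local Varnish instance.
set_real_ip_from 127.0.0.1;
real_ip_header   X-Forwarded-For;
}}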


Configuration is controlled by /etc/varnish/default.vcl and /etc/conf.d/varnishd:
{{file|name=/etc/varnish/default.vcl|desc=varnish configuration file|body=
vcl 4.0;
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
}}
 
{{file|name=/etc/conf.d/varnishd|desc=varnish configuration file|body=
VARNISHD="/usr/sbin/varnishd"
VARNISHADM="/usr/bin/varnishadm"
CONFIGFILE="/etc/varnish/default.vcl"
VARNISHD_OPTS="-a 127.0.0.1:80"
VARNISHD_OPTS="${VARNISHD_OPTS} -u varnish -g varnish"
}}
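After editing the VCL, you can ask varnishd to compile it, which doubles as a syntax check (it dumps the generated C source on success and reports the offending line on error):
<console>###i## varnishd -C -f /etc/varnish/default.vcl</console>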
 
Varnish will fetch data from the backend on localhost:8080 and serve accelerated (cached) content on localhost:80.
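A quick way to confirm both listeners (assuming the backend is already serving on port 8080) is to request just the status line from each port:
<console>###i## curl -sI http://127.0.0.1:8080/ | head -n 1</console>
<console>###i## curl -sI http://127.0.0.1/ | head -n 1</console>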
 
=== [https://www.varnish-cache.org/docs/4.0/users-guide/increasing-your-hitrate.html Achieving a high hit rate] ===
 
=== c10k ===
A few configuration settings control how much concurrency Varnish will handle, whether or not you actually need 10,000 simultaneous connections. If Varnish serves the outside world directly, dial concurrency back to, say, 50 or 100 connections per IP, keeping in mind that corporations and universities can legitimately open several connections from a single address. The c10k-style settings below are most useful when Varnish sits behind a load balancer such as pound, nginx, or tengine and all requests are internal and local.
 
{{file|name=/etc/conf.d/varnishd|desc=varnish concurrency settings|body=
VARNISHD_OPTS="-a 127.0.0.1:80 -p thread_pool_min=20 -p thread_pool_max=1000 -p thread_pool_add_delay=2 -s malloc,700M"
}}
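If varnishadm can reach the running daemon's management interface (set up by the init script or a -T option), you can confirm that the new thread parameters were picked up:
<console>###i## varnishadm param.show thread_pool_min</console>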
 
{{file|name=/etc/varnish/default.vcl|desc=varnish concurrency settings|body=
backend default {
    .host = "127.0.0.1";
    .port = "8080";
    .connect_timeout = 600s;
    .first_byte_timeout = 600s;
    .between_bytes_timeout = 600s;
    .max_connections = 10000;
    # .port = "80" led to issues with competing for the port with apache.
}
}}
 
== Boot Service ==
 
To start varnish immediately:
<console>###i## rc-service varnishd start</console>
 
To start varnish at boot:
<console>###i## rc-update add varnishd default</console>
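To check that the service is running under OpenRC:
<console>###i## rc-service varnishd status</console>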
 
== Verification ==
To verify that traffic is going through Varnish, and to see whether you are getting hits or misses, inspect the response headers: Varnish adds Via and X-Varnish headers, and X-Varnish carries two request IDs on a cache hit but only one on a miss:
<console>$##i## curl -I http://www.funtoo.org/Welcome</console>
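Hit and miss counters can also be watched locally on the server itself with varnishstat (counter names as in Varnish 4.x):
<console>###i## varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss</console>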
 
== Benchmarking ==
The ab (Apache Benchmark) tool from {{package|app-admin/apache-tools}} can show what Varnish buys you. The examples below run 500 requests at a concurrency of 100.
 
 
ab against a 3-worker cluster-mode Puma server, accessed directly:
<console>###i## ab -n 500 -c 100 http://127.0.0.1:3000/index.html | grep Request</console>
Requests per second:    110.92 [#/sec] (mean)
 
 
ab against the same server, served through Varnish:
<console>###i## ab -n 500 -c 100 http://127.0.0.1/index.html | grep Request</console>
Requests per second:    10268.42 [#/sec] (mean)
 
== SSL support ==
Varnish does not support SSL/TLS itself. The usual approach is to terminate SSL in a separate daemon in front of Varnish and proxy to it over plain HTTP; these packages can do that (see the sketch after this list):
* {{package|net-misc/stunnel}}
* {{package|www-servers/pound}}
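As a rough sketch, pound could terminate HTTPS on port 443 and hand the decrypted requests to Varnish on port 80; the config file location and certificate path below are assumptions, and pound adds an X-Forwarded-For header for the backend:
{{file|name=/etc/pound.cfg|desc=example SSL termination in front of varnish (sketch)|body=
ListenHTTPS
    Address 0.0.0.0
    Port    443
    Cert    "/etc/ssl/pound/server.pem"

    Service
        BackEnd
            # Hand decrypted traffic to the local Varnish listener.
            Address 127.0.0.1
            Port    80
        End
    End
End
}}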
 
== Media ==
{{#widget:YouTube16x9|id=JEF6_XC-2ZU}}
 
{{EbuildFooter}}

Latest revision of [[Package talk:Nginx]] as of 07:29, December 19, 2014:

things I would like to know:
*vhosts (with subdomain examples)
*ssl
*ssl termination for varnish + X-Forwarded-For settings so web apps see client IP addresses instead of 127.0.0.1
*email: pop3 + smtp + imap
*probably wsgi for python people
*cgi for perl zombies

change the webrick line to include 'or puma' on port 3000, and point to [[Package:Ruby#puma]]

* explore the webapp firewall security flag: nginx_modules_http_security
* explore geoip: nginx_modules_http_geoip
* explore fancy index: nginx_modules_http_fancyindex

[[User:Threesixes|Threesixes]] ([[User talk:Threesixes|talk]]) 22:33, October 23, 2014 (UTC)

add expires headers for caching static content:
 /etc/nginx/sites-enabled/localhost
 expires    24h;
 ...

add cache control headers:
 /etc/nginx/sites-enabled/localhost
 add_header Cache-Control "public";
 ...
[[User:Threesixes|Threesixes]] ([[User talk:Threesixes|talk]])
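A minimal sketch of how those two directives might sit together in an nginx server block, assuming nginx is the backend behind Varnish on port 8080 (the file path, document root, and /static/ location are assumptions):
{{file|name=/etc/nginx/sites-enabled/localhost|desc=example static asset caching headers (sketch)|body=
server {
    listen      127.0.0.1:8080;
    server_name localhost;
    root        /var/www/localhost/htdocs;

    # Cache static assets for a day and mark them publicly cacheable.
    location /static/ {
        expires    24h;
        add_header Cache-Control "public";
    }
}
}}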