Using Open Source Logging and Monitoring Tools

Many of the daemons that work together to serve your website produce logs and can easily be monitored to produce metrics. In a production setup, you may have many web servers and multiple database servers. Shipping logs and metrics from multiple servers and parsing them into a searchable format can help you and your team keep an eye on what’s happening and infer trends without needing to hop onto each server and work with tools like awk and sed.

A number of open-source tools can help you easily collect metrics, ship and parse logs, and present them all in a format that is usable by mere mortals. During the session, we’ll discuss how collectd can be used to collect system-level metrics and ship them to Graphite for analysis, and how to use Logstash to ship and parse logs from all the common daemons in a Drupal hosting stack. We’ll also touch on how you can add extra information so that, for example, you get millisecond request durations in all your Apache, nginx, and Varnish logs, and can log Varnish cache hits and misses.
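As a rough configuration sketch (directive names vary by version, and the format name combined_with_time is just an illustrative label): Apache’s %D field logs the request duration in microseconds, nginx’s $request_time is in seconds with millisecond resolution, and varnishncsa can emit the time to first byte along with whether the request was a cache hit or miss.

    # Apache (httpd.conf) - append %D (microseconds) to the combined format:
    LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %D" combined_with_time

    # nginx (nginx.conf) - $request_time is seconds with millisecond resolution:
    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" $request_time';

    # Varnish - varnishncsa can log time to first byte and hit/miss:
    varnishncsa -F '%h %l %u %t "%r" %s %b %{Varnish:time_firstbyte}x %{Varnish:hitmiss}x'

Logstash filters can then pull those extra fields out into numeric values so they are searchable and graphable.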

We’ll also discuss Kibana and Grafana, two great applications that allow you to display and query ElasticSearch and Graphite data, respectively. We’ll spend a little time talking about how you can use Logstash to ship trend data from your logs to statsd and then into Graphite, so you can get better snapshots of your data without needing to search.
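As a minimal sketch of that statsd hand-off (the response and request_time fields are assumed to have been extracted by an earlier grok filter, and the host is hypothetical), a Logstash output block might look like:

    output {
      statsd {
        host      => "statsd.example.com"   # hypothetical statsd host
        namespace => "logstash"
        # count responses per HTTP status code
        increment => "apache.response.%{response}"
        # record request duration as a statsd timer
        timing    => { "apache.request_time" => "%{request_time}" }
      }
    }

statsd aggregates those counters and timers and flushes them into Graphite, where Grafana can graph them alongside the collectd metrics.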

Finally, you’ll also learn how we use the ELK (ElasticSearch, Logstash, and Kibana) stack on drupal.org to manage billions of logs a month, and how it has helped us track down and fix errors from the drupal.org Drupal 7 upgrade.

Concrete takeaways:

  • Learn how the logging infrastructure of drupal.org runs entirely on free and open-source software, and how you can apply the same tooling to your own sites
  • Get in-depth information on how you can use Logstash, ElasticSearch, and Kibana to ship, parse, and search logs to look for trends and anomalies
  • Learn how Kibana can be used to graph, sort, search, and trend non-Logstash data
  • Learn how to ship and parse Drupal watchdog logs in order to keep an eye on PHP errors and Drupal module messages (see the sketch after this list)
  • Learn about collectd, Graphite, Grafana, and statsd, and how they can be used to track system stats over time
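
As a point of reference for the watchdog takeaway above, Drupal’s syslog module writes each watchdog entry as a single pipe-delimited line (base URL, timestamp, type, IP, request URI, referrer, user ID, link, message). A minimal Logstash filter sketch, assuming the default Drupal 7 syslog format and that your input tags these events with type "drupal", could split the fields like this:

    filter {
      if [type] == "drupal" {
        # Default Drupal 7 syslog format:
        # base_url|timestamp|type|ip|request_uri|referer|uid|link|message
        csv {
          separator => "|"
          columns   => [ "drupal_base_url", "drupal_timestamp", "drupal_type",
                         "drupal_ip", "drupal_request_uri", "drupal_referer",
                         "drupal_uid", "drupal_link", "drupal_message" ]
        }
      }
    }

Once the watchdog type and message are separate fields, Kibana searches and dashboards for PHP errors or a particular module’s messages become straightforward.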
Schedule info
Track: DevOps
Experience level: Intermediate
Drupal Version: N/A
Time slot: Wednesday · 14:15-15:15
Room: G103 · Rackspace
