Logstash is a great tool for centralizing logs in your environment. For example, we have several Drupal webheads that write their logs to syslog. It would be really nice to see those logs in one central place to monitor the system's health and debug potential problems.
In this article I would like to show how easy it is to start using Logstash for local development.
First of all, to run Logstash, follow the instructions at http://logstash.net/docs/1.4.2/tutorials/getting-started-with-logstash.
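For reference, the tutorial essentially boils down to downloading and unpacking a tarball; the URL below is the one the 1.4.2 docs used, so double-check it against the tutorial before copying:

# Download and unpack Logstash 1.4.2 (URL taken from the 1.4.2 docs).
curl -O https://download.elasticsearch.org/logstash/logstash/logstash-1.4.2.tar.gz
tar zxvf logstash-1.4.2.tar.gz
cd logstash-1.4.2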
Logstash is built around three concepts: inputs (where events come from, such as syslog or stdin), filters (how events are parsed and transformed), and outputs (where events are shipped, such as Elasticsearch).
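To make those stages concrete, here is a minimal pipeline sketch; the grok pattern is just a hypothetical example of a filter, not something the rest of this guide depends on:

input {
  stdin { }                      # read events typed into the terminal
}
filter {
  grok {
    # Hypothetical pattern: split "LEVEL message" lines into two fields.
    match => [ "message", "%{WORD:level} %{GREEDYDATA:text}" ]
  }
}
output {
  stdout { codec => rubydebug }  # pretty-print the resulting event
}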
The tricky part comes when you need to install Elasticsearch to store your logs and Kibana to view them. There is a very nice shortcut for development purposes: use an already-built Docker image.
I have found it very handy to use the https://registry.hub.docker.com/u/sebp/elk/ image.
So you need Docker installed (http://docs.docker.com/installation/ubuntulinux/). Then you pull the image and run it.
sudo docker pull sebp/elk
sudo docker run -p 5601:5601 -p 9200:9200 -p 5000:5000 -it --name elk sebp/elk
Now the container is running, with its ports forwarded to localhost: 5601 for Kibana, 9200 for Elasticsearch, and 5000 for Logstash input.
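A quick sanity check, assuming the container came up cleanly, is to hit Elasticsearch on the forwarded port; it answers with a small JSON document describing the node:

# Elasticsearch should respond with its name and version info.
curl http://localhost:9200/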
To send events from Logstash to Elasticsearch, use the elasticsearch output. Here is an example Logstash configuration file that can be used for testing.
input {
  stdin { }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    host => "localhost"
    port => "9200"
    protocol => "http"
  }
}
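Assuming the configuration above is saved as logstash-test.conf (the file name is arbitrary), start Logstash against it with the -f flag:

bin/logstash -f logstash-test.conf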
Now when you run Logstash and enter a couple of messages, they will be fed into Elasticsearch. Open http://localhost:5601/ to see Kibana in action.
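If you would rather verify from the command line first, you can query the logstash-* indices that the elasticsearch output writes to by default:

# Search across all logstash-* indices for every document.
curl 'http://localhost:9200/logstash-*/_search?q=*&pretty'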
The next step would be to set up your own rules for extracting Drupal (or any other) logs and pushing them to Elasticsearch. But that is a very individual task and out of the scope of this guide.