This is my final article about SEO Dashboard, in which I show you how to use Kibana with PaaS Logs (OVH). I have added a bonus at the end.
1/ Install Kibana
Get the latest Kibana 4.5.x for Elasticsearch 2.x here: https://www.elastic.co/downloads/kibana (4.5.1 at the time of writing)
Unzip the archive anywhere on your machine.
2/ Configure Kibana
To configure Kibana, edit config/kibana.yml and set the following properties:
– elasticsearch_url: https://laas.runabove.com:9200
– kibana_index: kibana-ra-logs-XXXXX
– kibana_elasticsearch_username: your login
– kibana_elasticsearch_password: your password
# Kibana is served by a back end server. This controls which port to use.
port: 5601

# The host to bind the server to.
host: "0.0.0.0"

# The Elasticsearch instance to use for all your queries.
elasticsearch_url: "https://laas.runabove.com:9200"

# preserve_elasticsearch_host true will send the hostname specified in `elasticsearch`.
# If you set it to false, then the host you use to connect to *this* Kibana instance will be sent.
elasticsearch_preserve_host: true

# Kibana uses an index in Elasticsearch to store saved searches, visualizations
# and dashboards. It will create a new index if it doesn't already exist.
kibana_index: "kibana-ra-logs-XXXXX"

# If your Elasticsearch is protected with basic auth, these are the user credentials
# used by the Kibana server to perform maintenance on the kibana_index at startup.
# Your Kibana users will still need to authenticate with Elasticsearch (which is
# proxied through the Kibana server).
kibana_elasticsearch_username: "ra-logs-XXXXX"
kibana_elasticsearch_password: "XXXXXXXXXXXXXXXX"
Run Kibana from the install directory: bin/kibana (Linux/MacOSX) or bin\kibana.bat (Windows).
3/ Add your index
Kibana is now running on port 5601.
Before you can start using Kibana, you need to tell it which Elasticsearch indices you want to explore. The first time you access Kibana, you are prompted to define an index pattern that matches the name of one or more of your indices.
If you have forgotten your index, you can find it here in the Paas Logs interface:
4/ Import my SEO dashboard
The first step is to download my JSON file.
You need to open this file with a text editor and replace my index (here: logsDataSEOfr) with yours.
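If you prefer the command line, the replacement can be scripted. This is a minimal sketch: seo-dashboard.json and my-own-index are placeholder names, so use your actual export file and your own Paas Logs index.

```shell
# Stand-in for the exported dashboard JSON (your real export is much larger).
echo '{"index": "logsDataSEOfr"}' > seo-dashboard.json

# Swap the example index for your own; a .bak backup of the original is kept.
sed -i.bak 's/logsDataSEOfr/my-own-index/g' seo-dashboard.json

cat seo-dashboard.json
```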
Go to “Settings” > “Objects”. Click on the “Import” button and choose the JSON file. After a few seconds, the SEO dashboards are ready.
You can now use the SEO dashboards, and modify and improve them as you like. To see a dashboard, click on the Dashboard tab, then click on this icon to open one.
I have listed some of the functionalities, but you can personalize them as you wish:
- Metric: Crawled Urls
- Data table: Top crawled urls, Bottom crawled urls
- Vertical bar chart: Content length by pagetype segments
- Metric: Crawled urls, Compliant urls, Not compliant urls, Average depth
- Data table: Top not compliant urls
- Pie chart:
- Vertical bar chart: Compliant urls by depth, Compliant urls by pagetype segments, Urls by depth and pagetype segments
- Metric: Urls crawled by Google, Active Urls crawled by Google, Crawled Urls
- Pie chart: IP Googlebot
- Vertical bar chart: GoogleBot crawls by day, Googlebot by pagetype segments, Active urls on Google by depth, Compliant urls on Google by depth
Dashboard HTML Tags
- Metric: Crawled urls, Compliant urls, Compliant urls with bad title
- Data table: Top Bad Title
- Vertical bar chart: Title performance by day, Title performance by pagetype segments, H1 performance by day, H1 performance by pagetype segments, Description performance by day, Description performance by pagetype segments
Dashboard HTML Codes
- Metric: 2XX Urls, 3XX Urls, 4XX Urls, 5XX Urls
- Data table: Top 4XX, Top 5XX
- Vertical bar chart: Urls by pagetype segments and http status code, Urls by depth and page type segments
- Metric: Crawled urls, Fast Urls, Medium Urls, Slow Urls
- Pie chart: Load Time Distribution
- Vertical bar chart: Load Time Performance by depth, Load Time Performance by Pagetype Segments, Load Time Performance by status code
I have created a project that auto-installs Kibana and Filebeat on Windows, so you can send your log files and CSV file to Paas Logs directly from your computer.
1/ Install or clone my SEO-Dashboard project, keeping the directory “C:\Projects\SEO-Dashboard” so that you do not have to change the configuration files
2/ Launch build.bat to auto-install Kibana and Filebeat in the “dist” directory
3/ Change these three lines in dist/kibana/conf/kibana.yml
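My guess, based on the Kibana configuration from step 2 above, is that the three lines to change are the index and the credentials; the values below are placeholders, so compare with your own kibana.yml:

```yaml
kibana_index: "kibana-ra-logs-XXXXX"
kibana_elasticsearch_username: "ra-logs-XXXXX"
kibana_elasticsearch_password: "XXXXXXXXXXXXXXXX"
```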
4/ Change this line in dist/filebeat/filebeat.yml
This line is given in the Input section of your Paas Logs stream
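As a sketch, with Filebeat 1.x the line in question is the Logstash output host. The endpoint below is a placeholder; copy the real one from the Input section of Paas Logs:

```yaml
output:
  logstash:
    # Replace with the endpoint shown in the Input section of Paas Logs.
    hosts: ["your-input-endpoint.laas.runabove.com:5044"]
```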
5/ Now you just need to launch three files:
- run-filebeat.bat: launches Filebeat and monitors two directories, filebeat-csv and filebeat-logs. If you copy files into them, all the data will be sent to Paas Logs. I advise you to copy your CSV file first.
- run-kibana.bat: launches Kibana; then go to http://localhost:5601
- run-r.bat: generates your CSV file from your Screaming Frog crawl, without RStudio.
If you don’t want to modify my R file, you need to copy two files to /websites/data-seo:
- segments.csv: your URL patterns (one per line)
- internal_html.xlsx: your crawl from Screaming Frog (see my first article about R).
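As an illustration, a minimal segments.csv could look like this; the patterns are hypothetical, so use the URL patterns of your own site:

```csv
/blog/
/product/
/category/
```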
This is my last article about SEO Dashboard. It gives you all the technical pieces to build your own log file analyser with R, Screaming Frog, Kibana and Filebeat.
I worked with R and Screaming Frog, but you can do the same thing with another crawler and your preferred language. Here is my Git repository if you want to test it: https://github.com/voltek62/SEO-Dashboard
Do not hesitate to use the comments to ask for my help, and to fork my project to improve it.
– Pierre de Paeppe, Tech Lead at OVH, who helped me.
– Thibault Willerval (SEO Consultant at Rouge Interactif: @thibaultwillerv) and Nicolas Augé (SEO Consultant at Resoneo: @nseauge), who tested my log file analyser.
– Julien Deneuville and Nicolas Chollet (SEO agency: @0neclick), who gave me valuable advice.
– Sylvain Peyronnet (@speyronnet) and the QueduWeb team, who helped me present this new project.