Security breaches happen. And when they do, your server logs may be your best line of defense. Hadoop takes server-log analysis to the next level by speeding and improving security forensics and providing a low cost platform to show compliance.
In this demo, we show how an enterprise security breach analysis and response might be performed.
This Hadoop tutorial can be performed with the Hortonworks Sandbox – a single-node Hadoop cluster running in a virtual machine. Download the Sandbox to run this and other tutorials in the series. The tutorial presented here is for Sandbox v2.0.
Server logs are computer-generated log files that capture network and server operations data. They are useful for managing network operations, especially for security and regulatory compliance.
IT organizations use server log analysis to answer questions about:
In this tutorial, we will focus on a network security use case. Specifically, we will look at how Apache Hadoop can help the administrator of a large enterprise network diagnose and respond to a distributed denial-of-service attack.
Apache NiFi is a secure, integrated platform for real-time data collection, simple event processing, and transport and delivery from source to storage. It is useful for moving distributed data to and from your Hadoop cluster. NiFi offers substantial distributed processing capability to help reduce processing cost and derive real-time insights from many different data sources across many large systems, and it can aggregate that data into a single destination or many different ones.
NiFi lets users get the most value from their data. Specifically NiFi allows users to:
How NiFi Works. NiFi’s high-level architecture is focused on delivering a streamlined interface that is easy to use and easy to set up. A little terminology is integral to understanding how NiFi works.
A FlowFile can originate from a processor in NiFi. Processors can also receive FlowFiles and transmit them to many other processors. These processors can then drop the data in the FlowFile into various destinations, depending on the function of the processor.
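The FlowFile-and-processor model described above can be sketched in plain Python. This is a conceptual illustration only: the names `generate`, `annotate`, and `deposit`, and the sample log line, are hypothetical and not part of the NiFi API.

```python
# Conceptual sketch of NiFi's FlowFile/processor model (NOT the NiFi API).
# A FlowFile is content plus attributes; processors take FlowFiles in and
# pass FlowFiles on to the next processor in the flow.

def generate():
    """Source processor: originates a FlowFile."""
    return {"attributes": {"source": "server-log"},
            "content": "2014-01-01 00:00:00|10.0.0.1|USA|true"}

def annotate(flowfile):
    """Mid-flow processor: receives a FlowFile, adds an attribute, passes it on."""
    flowfile["attributes"]["fields"] = len(flowfile["content"].split("|"))
    return flowfile

def deposit(flowfile, store):
    """Sink processor: drops the FlowFile's content into a destination."""
    store.append(flowfile["content"])

store = []
deposit(annotate(generate()), store)
print(store)  # the log line ends up in the destination
```

In real NiFi, each of these roles corresponds to a processor box you drag onto the canvas and wire to the next one.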
To refine and visualize server log data, we will:
We’ll be using a Python script to generate the server log data. SSH into the sandbox with the command
ssh root@localhost -p 2222
Default Sandbox Login
Or you can choose to use the Sandbox’s built-in Web-based SSH terminal Shell-In-A-Box which can be accessed at http://sandbox.hortonworks.com:4200
Remember, the username is
root and the password is the one configured for your Sandbox.
After you log in, the command prompt will appear with the prefix
An example of the output of these commands is shown below.
The first thing you’ll need to do is make sure you’ve downloaded the gzipped version of Hortonworks DataFlow (HDF).
Once you’ve downloaded HDF, let’s get it onto the sandbox. If you’re on a Mac or Unix system with the scp command available in your terminal, you can simply run
scp -P 2222 $HDF_DOWNLOAD root@localhost:/root/
If you’re on a Windows system, you can use the program WinSCP to transfer files to the Sandbox.
After sending the HDF file to the Sandbox make sure you SSH into the Sandbox using the instructions from step 1.
Now that we have SSH’d into the sandbox we can run the following set of commands to set up and install HDF.
You can copy and paste the commands below; just make sure to first set the
HDF_FILE and HDF_VERSION environment variables to match the version of HDF that you downloaded.
export HDF_FILE=HDF-220.127.116.11-91.tar.gz
export HDF_VERSION=HDF-18.104.22.168
cd /root
mkdir hdf
mv $HDF_FILE ./hdf
cd hdf
tar -xvf $HDF_FILE
cd $HDF_VERSION/nifi
sed -i s/nifi.web.http.port=8080/nifi.web.http.port=6434/g conf/nifi.properties
cd bin/
sh nifi.sh install
cd ~
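The sed line in the block above is what moves NiFi off its default port of 8080 and onto 6434. If you want to see the substitution in isolation before touching the real configuration, you can try it on a throwaway file (the temp-file workflow here is just for demonstration):

```shell
# Demonstrate the port substitution on a throwaway copy of the property line
tmpfile=$(mktemp)
echo "nifi.web.http.port=8080" > "$tmpfile"
sed -i s/nifi.web.http.port=8080/nifi.web.http.port=6434/g "$tmpfile"
cat "$tmpfile"   # nifi.web.http.port=6434
rm "$tmpfile"
```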
Great! HDF is now set up for our needs. You can now start NiFi with the following command:
service nifi start
First, we’ll need to open up the NiFi interface in our web browser. During installation we set the port that NiFi listens on to
6434. You’ll need to forward this port in the virtual machine settings.
For a guide on forwarding a port on your VM please see the guide in this tutorial
After forwarding port
6434 for NiFi you should be able to access the interface at https://localhost:6434/nifi
It should look something like below:
We’re going to import a pre-made data flow from a template which you can download here.
Use the NiFi interface to upload the flow, and then drag it onto your workspace.
Once you’ve uploaded the template into NiFi you can instantiate it by dragging the template icon onto the screen. It will ask you to select your template’s name and the flow will appear as in the image below.
Now that you’ve imported the data flow and everything is set up, simply click the Run button at the top of the screen. (Make sure you haven’t selected a specific processor, or else only that processor will start.)
Now that everything is running, we can check HDFS to see where the data is being deposited.
Log into the Ambari interface which can be found at http://localhost:8080
Open up the HDFS Files view, and then navigate to
/tmp/server-logs/. Files should start appearing a few seconds after you start the flow. You can click on them to view the content.
Open the Ambari UI and head to the views dropdown list. Select Hive and then paste the following query.
CREATE TABLE FIREWALL_LOGS(time STRING, ip STRING, country STRING, success BOOLEAN) ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' LOCATION '/tmp/server-logs';
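The table above expects pipe-delimited lines with four fields: time, ip, country, and success. The Python script used earlier produces lines in this shape; a minimal sketch of such a generator is below. The country codes, timestamp format, and function names are illustrative assumptions, not the actual script shipped with the tutorial.

```python
import random
from datetime import datetime, timedelta

# Minimal sketch of a pipe-delimited firewall-log generator matching the
# FIREWALL_LOGS schema: time | ip | country | success
COUNTRIES = ["USA", "CHN", "RUS", "IND", "BRA"]  # illustrative sample

def log_line(ts):
    """Emit one pipe-delimited log record for timestamp ts."""
    ip = ".".join(str(random.randint(1, 254)) for _ in range(4))
    success = random.random() > 0.5
    return "|".join([ts.strftime("%Y-%m-%d %H:%M:%S"),
                     ip,
                     random.choice(COUNTRIES),
                     str(success).lower()])  # "true"/"false" parses as Hive BOOLEAN

start = datetime(2014, 1, 1)
for i in range(5):
    print(log_line(start + timedelta(seconds=i)))
```

Writing these lines into files under /tmp/server-logs/ gives Hive rows to return as soon as the table is created, since the table's LOCATION points at that directory.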
Note: if the query doesn’t run successfully due to a permissions error, you might need to update the permissions on the directory. Run the following commands over SSH on the Sandbox:
sudo -u hdfs hadoop fs -chmod -R 777 /tmp
sudo -u hdfs hadoop fs -chown -R admin /tmp
When the table has been created, you should be able to query it using a statement like
SELECT * FROM FIREWALL_LOGS LIMIT 100;
In this section, we will use Excel Professional Plus 2013 to access the generated server log data. Note: if you do not have Excel 2013, you can still bring the data into other versions of Excel and explore the data through other charts. The screens may be slightly different in your version, but the actions are the same. You can complete Step 5 and then explore the data on your own.
On the Choose Data Source pop-up, select the Hortonworks ODBC data source you installed previously, then click OK.
The Hortonworks ODBC driver enables you to access Hortonworks data with Excel and other Business Intelligence (BI) applications that support ODBC.
Now that we have successfully imported Hortonworks Sandbox data into Microsoft Excel, we can use the Excel Power View feature to analyze and visualize the data.
Data visualization can help you analyze network data and determine effective responses to network issues. In this section, we will analyze data for a denial-of-service attack:
We’ll start by reviewing the network traffic by country.
The Power View Fields area appears on the right side of the window, with the data table displayed on the left.
Drag the handles or click the Pop Out icon to maximize the size of the data table, and close the Filters area.
It's obvious that this is a coordinated attack, originating from many countries. Now we can use Excel to generate a list of the unauthorized IP addresses.
Use the tabs at the bottom of the Excel window to navigate back to the Excel worksheet containing the imported data.
Click the arrow next to the status column header. Clear the Select all check box, select the ERROR check box, then click OK.
We’ve shown how the Hortonworks Data Platform can help system administrators capture, store, and analyze server log data. With real-time access to massive amounts of data on the Hortonworks Data Platform, we were able to block unauthorized access and restore VPN access to authorized users.
With log data flowing continuously into the Hortonworks Data Platform “data lake,” we can protect the company network from similar attacks in the future. The data can be refreshed frequently and accessed to respond to security threats, or to prepare for compliance audits.
First, make sure that the Apache Zeppelin service is started in Ambari. Then use the Views Dropdown Menu and select the Zeppelin View.
You should be greeted by the following screen where you can choose to view notes, or create a new one.
You can choose to import the note from this tutorial using the following URL:
Once you’ve opened the note you can use the following commands to generate charts to visualize the data
%hive select country from firewall_logs
%hive select time, country from firewall_logs
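Beyond the two raw selects above, an aggregate query makes the coordinated-attack pattern from the earlier Excel analysis visible directly in Zeppelin. Assuming the FIREWALL_LOGS schema created earlier (where success is a BOOLEAN), a query along these lines, charted as a bar graph, shows failed requests per country:

```sql
%hive
select country, count(*) as failed_requests
from firewall_logs
where success = false
group by country
order by failed_requests desc
```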