
Network Monitoring and Programming Using Python

Python scripts and APIs can be tailor-made into effective network monitoring and forensics tools. Their versatility makes them ideal for assorted applications, including cyber security, data mining, the Internet of Things, cloud simulation, grid implementation and more.

Network monitoring and digital forensics are some of the prominent areas in the domain of cyber security. There are a number of software products and tools available in the technology market to guard network infrastructure and confidential data against cyber threats and attacks.

For a long time, the monitoring of servers and the forensic analysis of network infrastructure have been done using packet capture (PCAP) tools and intrusion detection systems (IDS), which are available in the market as both open source software and commercial products.

Despite the number of tools available for packet capturing and monitoring, professional programmers often prefer software they have developed themselves, since self-developed code offers far more flexibility in customising a tool. Many organisations concerned about security, confidentiality and integrity choose not to use any third party software at all; rather, they develop their own tools in efficient and highly effective programming languages, including Python, Java, Perl, PHP and many others.
Python, in particular, is widely used for writing special-purpose scripts for packet capture, classification and machine learning.

It should be mentioned that a great deal of network monitoring and logging software has been developed in Python. Shinken and Zenoss are common tools for monitoring hosts, collecting network data, and sending alerts and messages, with many active and passive monitoring methods. Shinken, in particular, is an open source monitoring framework based on Python that can perform a large set of operations related to digital forensics and logging.

Python scripts and libraries for network forensics

Here is a list of tools built with Python for network monitoring, logging, high-security credential management and performance evaluation.

Eddie

The features of this tool include the following (a minimal sketch of one such check appears after the list):

  • System monitoring checks
  • File system checking
  • HTTP checks
  • POP3 tests
  • SNMP queries
  • RADIUS authentication tests
  • Monitoring of processes
  • System load
  • Network configuration
  • Ping checks
  • Customised TCP port checks
  • Watching files for changes
  • Scanning of log files
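
Eddie is configured through its own directives rather than called as a library, but the kind of check it runs is easy to express in plain Python. Below is a minimal sketch of a customised TCP port check using only the standard socket module; the host and port are illustrative placeholders, not part of Eddie itself.

import socket

def tcp_port_open(host, port, timeout=3.0):
    # Try to open a TCP connection; a monitor such as Eddie raises
    # an alert when a check like this fails.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative check: is the local web server accepting connections?
print(tcp_port_open("127.0.0.1", 80))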

Pypcap (Packet Capture Library)

Pypcap is an object-oriented Python wrapper for libpcap. It can be installed easily:

$ pip install pypcap

Using pypcap, packets can be captured with the following few lines of code:

>>> import pcap
>>> for ts, pkt in pcap.pcap():
...     print(ts, pkt)
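
In practice, a capture is usually narrowed with a filter before iterating. As a minimal sketch, assuming the pypcap API in which a capture object accepts a Berkeley Packet Filter expression via setfilter(), traffic can be restricted to a single service:

import pcap

pc = pcap.pcap()  # open a live capture on the default interface
pc.setfilter('tcp port 80')  # BPF filter: keep only traffic to or from port 80
for ts, pkt in pc:
    # ts is the capture timestamp; pkt holds the raw bytes of the frame
    print(ts, len(pkt))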

LinkChecker

Recursive and deep checking of server pages can be done using the LinkChecker library in Python. Its support for regular expressions and filtering makes site crawling easy. The output can be generated in multiple formats, including HTML, XML, CSV, SQL or simply a sitemap graph.
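
LinkChecker is usually driven from the command line. The invocation below is a sketch with a placeholder URL; the -r (recursion depth) and -o (output format) options should be confirmed against the installed version's --help output:

$ pip install linkchecker
$ linkchecker -r 2 -o csv http://example.com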

Web scraping

Python is used by researchers and practitioners for collecting live data for research and development. For example, we can fetch live records of the stock market, the price of any product from e-commerce websites, etc. Data collected in this way forms the foundation of Big Data analytics. If a researcher is working on Big Data analysis, the live data can be fetched using a Python script and can be processed based on the research objectives.
Here is a code snippet written in Python to fetch live stock exchange data from the Times of India website:

from bs4 import BeautifulSoup
import urllib.request
from time import sleep
from datetime import datetime

def getnews():
    # Fetch the business page and pull out the quoted index value
    url = "http://timesofindia.indiatimes.com/business"
    req = urllib.request.urlopen(url)
    page = req.read()
    scraping = BeautifulSoup(page, "html.parser")
    price = scraping.findAll("span", attrs={"class": "red14px"})[0].text
    return price

# Sample the value roughly once a second and log it with a timestamp
with open("bseindex.out", "w") as f:
    for x in range(2, 100):
        sNow = datetime.now().strftime("%I:%M:%S%p")
        f.write("{0}, {1}\n".format(sNow, getnews()))
        sleep(1)

Fetching live data from social media

In the same way, live Twitter feeds can be fetched using Python APIs. With a Twitter developer account, a new app can be created, and a Python script can then be linked to that app as follows:

from tweepy import Stream
from tweepy import OAuthHandler
from tweepy.streaming import StreamListener

# setting up the keys
consumer_key = 'XXXXXXXXXXXXXXXXXXXX'
consumer_secret = 'XXXXXXXXXXXXXXXXXXXX'
access_token = 'XXXXXXXXXXXXXXXXXXXX'
access_secret = 'XXXXXXXXXXXXXXXXXXXX'

class TweetListener(StreamListener):
    # A listener handles tweets that are received from the stream.
    # This is a basic listener that just prints received tweets
    # to the standard output.

    def on_data(self, data):
        print(data)
        return True

    def on_error(self, status):
        print(status)

# printing all the tweets to the standard output
auth = OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_secret)

stream = Stream(auth, TweetListener())
stream.filter(track=['research'])

Using this Python code, tweets containing the keyword 'research' are pulled from Twitter, and the output is delivered in JSON format. JSON (JavaScript Object Notation) is a file format used by many NoSQL and unstructured-data handling engines. Once the JSON is obtained, the required knowledge can be extracted using OpenRefine or any other tool, and further predictions can be made.
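
Each message delivered by the stream is a self-contained JSON document, so the standard json module is enough to start mining it. The sketch below assumes the standard Twitter streaming payload, in which the tweet body sits in the 'text' field and the author under 'user':

import json

def extract_tweet(line):
    # Parse one JSON document from the stream and keep only the
    # fields of interest for further analysis.
    tweet = json.loads(line)
    return {
        'user': tweet.get('user', {}).get('screen_name'),
        'text': tweet.get('text'),
    }

# Illustrative use with one captured line of stream output
sample = '{"user": {"screen_name": "example"}, "text": "New research published"}'
print(extract_tweet(sample))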
