Please Read before you bid:
********** If you do not understand the contents of a "packet", packet decoding, Snort, or Hadoop, please do not submit a bid **********
1. Snort – packet decodes – packet captures – PCAPs – Wireshark – tcpdump – raw packets
2. Java / Pig / Pig Latin / JavaScript / Python
3. Cloudera/Apache Hadoop / MapReduce / HDFS / Amazon EC2 / EMR / Oozie / Hive / HBase
This is a three-phase project. You should bid on Phase One only. The award for each phase is independent of the other phases; if the winning programmer provides exactly what is requested, they will be given preferential treatment for the next two phases of this project.
Brief: A graphical packet-capture analyzer tool with functionality like Xplico (a network forensic analysis tool) that also utilizes a Snort back end to identify threats. It must have all the functionality and capability of Xplico, Snort, and several other packet analyzers combined.
Details: Design and build a web front-end graphical interface with widgets and graphics, resizable and zoomable (initially 20 pages), wired to various Java, PHP, and Pig scripts. It must run against and interact with a Hadoop HDFS cluster and an Amazon EC2 cluster, using specific scripts to analyze data stored in packet captures: decoding PCAPs (raw packet captures), parsing out specific parts of the data, correlating data, and displaying specific components in a specific manner (a drawing of each page is included). It must decode packet captures and support grouping of pcaps; a minimal decoding sketch follows.
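For clarity, here is a minimal sketch of the per-packet decode/parse step in Python, assuming the scapy library; the file name sample.pcap and the chosen fields are illustrative assumptions, not project specifics:

    # Minimal decode sketch, assuming scapy; sample.pcap is a placeholder name.
    from scapy.all import rdpcap
    from scapy.layers.inet import IP, TCP

    packets = rdpcap("sample.pcap")   # loads the whole capture into memory

    for pkt in packets:
        if IP not in pkt:
            continue
        # Parse out a few specific fields from each packet.
        src, dst, proto = pkt[IP].src, pkt[IP].dst, pkt[IP].proto
        sport = pkt[TCP].sport if TCP in pkt else None
        dport = pkt[TCP].dport if TCP in pkt else None
        print(src, dst, proto, sport, dport)

Note that rdpcap reads the whole file into memory; for captures toward the 100 TB end of the stated range, the real scripts would need to stream (e.g. scapy's PcapReader) or run as MapReduce jobs over HDFS.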
All widgets on the web pages will operate scripts that run against various open-source projects such as Snort, Suricata, p0f, GraphGL, EMR, Trigram cube, and a few others (see the Snort invocation sketch below). The programmer will be responsible for completing all aspects of Phase I before payment is made. The programmer must also sign an NDA and provide all source code, scripts, and works to me on or before completion of the project. I require an initial Skype conversation prior to awarding this project.
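As an example of a widget-driven back-end call, here is a hedged sketch of kicking off a Snort run over a single pcap from Python; the config path, log directory, and function name are assumptions, not project specifics:

    # Hypothetical widget handler: run Snort offline over one capture.
    import subprocess

    def run_snort(pcap_path,
                  conf="/etc/snort/snort.conf",   # assumed config location
                  logdir="/tmp/snort-logs"):      # assumed alert/log directory
        """Read a pcap with Snort in offline mode and write fast-style alerts."""
        cmd = ["snort", "-r", pcap_path, "-c", conf, "-l", logdir, "-A", "fast"]
        return subprocess.run(cmd, capture_output=True, text=True)

    result = run_snort("sample.pcap")
    print(result.stdout)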
1. Data is defined as packet captures: raw PCAPs (these will be provided to you for testing), 100 MB to 100 TB in size.
2. Set up a Hadoop cluster: a Cloudera 4 (CDH4) cluster (4-6 nodes) on VM systems that will be provided.
3. Install and configure the following: Suricata or Snort, p0f, DNS Ubigraph, unigram, choropleth, Pig, Apache, and various other packages and tools (as defined; details provided) as part of the package installation. Specific packages will be provided by name/source, with exact requirements for proper execution and operation.
4. Advanced web front end: start with 20 pages wired to 20+ Pig/Java scripts that run against data residing on a Hadoop HDFS cluster. These scripts will run via MapReduce, Hive, HBase, Oozie, or whatever other function is required to interact with the data stored on a large Hadoop cluster.
5. Rewrite several (20, provided) open-source Python, Pig, PHP, and Java scripts to perform additional functionality and features (as defined). Each must present data from specific pcaps in a specific manner on its web page; all page outlines will be provided, and quality, functionality, flow, features, and usability must match them exactly.
6. Create 10 new Pig/Java/PHP scripts to perform specific functionality and interactions between the Hadoop PCAP data and the data output displayed on the web pages: a script to add data (pcaps) to the Hadoop cluster (see the ingestion sketch after this list), and a tracking script to search and display certain fields from each packet capture, each consisting of thousands of packets. Build traffic analytics to identify trends and patterns within the pcaps, including source/destination, IPs, ports, MACs, protocols, and streams (see the analytics sketch below), and the system must be able to extract all files located in a packet capture for reassembly (graphics, scripts, web pages).
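A minimal sketch of the ingestion script from item 6, assuming the stock hadoop fs CLI is on the PATH; the HDFS target directory /data/pcaps and the local source directory are hypothetical choices:

    # Copy local pcaps into HDFS so the Pig/MapReduce jobs can reach them.
    import subprocess
    from pathlib import Path

    HDFS_DIR = "/data/pcaps"   # hypothetical HDFS target directory

    def ingest(local_dir):
        for pcap in sorted(Path(local_dir).glob("*.pcap")):
            subprocess.run(["hadoop", "fs", "-put", str(pcap), HDFS_DIR],
                           check=True)   # fail loudly if a copy fails

    ingest("/var/captures")   # placeholder source directory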
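And a hedged sketch of the tallying side of the traffic analytics from item 6 (top talkers and top destination ports), again assuming scapy; stream reassembly and file carving would build on the same loop:

    # Stream the capture so arbitrarily large files never sit in memory.
    from collections import Counter
    from scapy.all import PcapReader
    from scapy.layers.inet import IP, TCP, UDP

    talkers, ports = Counter(), Counter()

    with PcapReader("sample.pcap") as reader:
        for pkt in reader:
            if IP in pkt:
                talkers[(pkt[IP].src, pkt[IP].dst)] += 1   # src/dst pair tally
            for layer in (TCP, UDP):
                if layer in pkt:
                    ports[pkt[layer].dport] += 1           # destination ports

    print(talkers.most_common(10))   # busiest source/destination pairs
    print(ports.most_common(10))     # most-hit destination ports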
The awarded winner will be provided with the graphical pages, the initial installation details, the initial scripts, and some specific sites (showing examples) of exactly how this system should function and feel.