Snort mailing list archives
Re: HELP ON SNORT
From: Dustin Webber <dustin.webber () gmail com>
Date: Mon, 30 Jan 2012 10:48:55 -0500
@beenph Not sure I fully understand; are you suggesting that everyone should purchase products because open source software will never compare? Sorry, I just don't fully understand.

@martin I need to dig into ELSA a bit more, but from what I saw you were using MySQL for the backend datastore. I don't want to come off as skeptical of the benchmarks, but MySQL was not really designed for this type of data or for speed- and write-heavy situations. What kind of optimizations are you doing on the DB side? How are you doing these complex searches in so little time without map/reduce, Bloom filters, or an optimized data store that releases memory rapidly for index storage?

I do a lot of `big data` research in my spare time, and honestly I don't think I could hit that mark unless I did master/slave replication from a write box to a read-only box (and that's assuming the backend is MySQL with an EPIC amount of memory). How are you building time-based metrics on 1 million records in less than a second?

This all sounds pretty intense; I'll pull the source and give it a go.

Dustin W. Webber
Dustin.Webber () gmail com

On Mon, Jan 30, 2012 at 10:17 AM, Martin Holste <mcholste () gmail com> wrote:
> This is a fantastic thread; I've really appreciated everyone's responses. Listen, if someone can figure out how to scale this DB schema to that amount of raw data and still build time-based metrics, you should be working for the Google R&D team, because you obviously figured out how to do quantum storage / processing.

You've hit a key point here that I want to delve into a bit. The current schema is built so that a lot of packet details are stored in the database, such as TCP flags. The vast majority of the time, the analyst is not concerned with any of these things. In fact, I'd wager that the analyst is almost always more concerned with context information, such as the URL accessed or the email subject line.

For that reason, I built ELSA, and we use it as our IDS console, processing more than 1 million events per minute (with time-based metrics!). Since we can handle any amount of alerts and our searches finish in milliseconds, we spend very little time tuning. We only parse out the basic Snort fields (msg, proto, IPs, ports, classification, sid) for reporting, but we log every URL into the same system. That way, when you do a search or drill-down, the URL shows up next to the Snort alert. To see the contents, we use StreamDB, which hooks directly into ELSA and shows the entire conversation in plain text in one click with no waiting.

Yes, I'm shamelessly plugging my projects, but I'm doing it here to make some very important points. The current stock schema:

- Is designed for those not doing full packet capture.
- Devotes many resources to details not typically useful in incident response.
- Lacks any kind of flexibility.
- Has no concept of workflow.
- Requires table joins, which become performance bottlenecks.

The best possible console with the least amount of work would be to have Snort log syslog to Splunk personal edition, and have the HTTP log that Snort creates also log there. Send whatever logs you want there; it's all linear benefit.
If the personal edition becomes too small, buy a license or download ELSA, which is faster but isn't as shiny or as easy to install. If you're just trying to see what alerts you're getting and do ad-hoc reporting and alerting, there is absolutely no better way than Splunk personal edition. If you absolutely have to have the raw packet viewable right underneath the text of the alert, then you are probably "advanced" and you should really invest the time in installing Snorby via SecOnion, because you're eventually going to want to use the other tools in SecOnion. If you're not "advanced," then you probably don't know what those packet details mean anyway, but you'll probably know whether you've got a real incident based on the URL involved, so why not focus on getting that info as close to your alerts as possible?
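Martin's point about parsing out only the basic Snort fields (msg, proto, IPs, ports, classification, sid) can be sketched in a few lines. This is a minimal illustration, not ELSA's actual parser: the regex and the sample line assume a "fast"-style alert as it might arrive over syslog, and the exact layout depends on your Snort output configuration.

```python
import re

# Assumed "fast"-style alert layout; real output varies with Snort's
# configured output plugins, so treat this pattern as illustrative.
ALERT_RE = re.compile(
    r"\[(?P<gid>\d+):(?P<sid>\d+):(?P<rev>\d+)\]\s+"
    r"(?P<msg>.*?)\s+"
    r"\[Classification:\s+(?P<classification>[^\]]+)\]\s+"
    r"\[Priority:\s+\d+\]\s+"
    r"\{(?P<proto>\w+)\}\s+"
    r"(?P<src_ip>[\d.]+):(?P<src_port>\d+)\s+->\s+"
    r"(?P<dst_ip>[\d.]+):(?P<dst_port>\d+)"
)

def parse_alert(line):
    """Return a dict of the basic Snort fields, or None if no match."""
    m = ALERT_RE.search(line)
    return m.groupdict() if m else None

# Hypothetical sample line for demonstration only.
sample = ("[1:2000419:18] ET POLICY PE EXE or DLL Windows file download "
          "[Classification: Potential Corporate Privacy Violation] "
          "[Priority: 1] {TCP} 192.168.1.5:1033 -> 10.0.0.2:80")
fields = parse_alert(sample)
```

Parsing only these few fields while storing the rest of the line as raw text is what keeps the ingest path write-friendly, in contrast to the stock schema's per-packet-detail columns.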
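As for Dustin's question about time-based metrics on a million records: the core operation is just bucketing event timestamps into fixed intervals, which is a single linear pass. The sketch below is a toy illustration of that idea, not how ELSA does it (a real engine would presumably answer from pre-built indexes rather than scanning); the epoch timestamps are made-up sample data.

```python
from collections import Counter

def per_minute_counts(epoch_timestamps):
    """Count events per minute bucket (epoch seconds floored to 60s)."""
    return Counter(ts - (ts % 60) for ts in epoch_timestamps)

# Made-up epoch timestamps spanning three one-minute buckets.
events = [1327934935, 1327934950, 1327934999, 1327935001]
buckets = per_minute_counts(events)
```

A single pass like this is fast even over millions of rows, but sub-second response at that scale on every ad-hoc query is where indexing strategy, not raw scan speed, makes the difference.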
Current thread:
- Re: HELP ON SNORT, (continued)
- Re: HELP ON SNORT Paul Halliday (Jan 30)
- Re: HELP ON SNORT beenph (Jan 30)
- Re: HELP ON SNORT Jefferson, Shawn (Jan 30)
- Re: HELP ON SNORT Lay, James (Jan 30)
- Re: HELP ON SNORT Jeremy Hoel (Jan 30)
- Re: HELP ON SNORT Dustin Webber (Jan 30)
- Re: HELP ON SNORT beenph (Jan 29)
- Re: HELP ON SNORT Dustin Webber (Jan 30)
- Re: HELP ON SNORT beenph (Jan 30)
- Re: HELP ON SNORT Martin Holste (Jan 30)
- Re: HELP ON SNORT Dustin Webber (Jan 30)
- Re: HELP ON SNORT beenph (Jan 30)
- Re: HELP ON SNORT Martin Holste (Jan 30)
- Re: HELP ON SNORT Dustin Webber (Jan 30)
- Re: HELP ON SNORT Carney, Megan (Jan 30)
- Re: HELP ON SNORT Rich Graves (Jan 31)
- Re: HELP ON SNORT Jeremy Hoel (Jan 29)
- Re: HELP ON SNORT Scott Runnels (Jan 29)
- Re: HELP ON SNORT Jeremy Hoel (Jan 29)
- Re: HELP ON SNORT Heine Lysemose (Jan 29)
- Re: HELP ON SNORT Eric G (Jan 31)