Snort mailing list archives

Re: [Snort-users] Perfmonitor Issue


From: "Guillaume Daleux" <guillaume.daleux () abovesecurity com>
Date: Thu, 17 May 2012 13:11:33 -0400

Hi Abdel,

You need to change your compilation options and disable linux-smp-stats:

--enable-dynamicplugin --enable-perfprofiling --enable-targetbased
--enable-ipv6 --enable-ppm --enable-gre --enable-static-daq=no
--enable-64bit-gcc=no 

Regards,

Guillaume DALEUX

From: Abdelmonaim Mokadem [mailto:abdelmonaim.mokadem () abovesecurity com]

Sent: Wednesday, May 16, 2012 2:11 PM
To: snort-users () lists sourceforge net; snort-devel () lists sourceforge net
Subject: [Snort-users] Perfmonitor Issue

Hi all,

I have an issue using the perfmonitor preprocessor for Snort inline to
provide the "Max performance snort stats" with the following parameters:

  preprocessor perfmonitor: time 300 pktcnt 5000 events max console

Here are the options used to launch Snort:

        -A none \
        --dynamic-engine-lib "${SNORT_ENG}" \
        --dynamic-preprocessor-lib-dir "${SNORT_DYNPPDIR}" \
        --dynamic-detection-lib-dir "${SNORT_DYNRULDIR}" \
        --daq-dir "${DAQ_DIR}" \
        -i "${INTERFACE}" \
        -c "${SNORT_CONF}" \
        --perfmon-file "${LOG_DIR}/snort.stats" \
        -l "${LOG_DIR}" \
        -Q

Since I'm using the "max" and "console" parameters, my console should
display the results, based on the following code:

if (iFlags & MAX_PERF_STATS)
{
    ...

    LogMessage("uSeconds/Pkt\n");
    LogMessage("----------------\n");
    LogMessage("Snort:        %.3f\n", sfBaseStats->usecs_per_packet.usertime);
    LogMessage("Sniffing:     %.3f\n", sfBaseStats->usecs_per_packet.systemtime);
    LogMessage("Combined:     %.3f\n\n", sfBaseStats->usecs_per_packet.totaltime);

    ...
}

But it doesn't...

It doesn't print the Snort max-performance stats at all.

The usecs_per_packet structure is filled when "GetuSecondsPerPacket" is
called, but it seems we never enter the "if" clause. When I debug with
gdb, I can see that "iFlags" is always equal to 0 for an unknown reason,
and since "MAX_PERF_STATS" is equal to 1, the "if" test fails.

FYI, here are the options used to compile Snort:

--enable-dynamicplugin --enable-perfprofiling --enable-linux-smp-stats
--enable-targetbased --enable-ipv6 --enable-ppm --enable-gre
--enable-static-daq=no --enable-64bit-gcc=no 

If someone has an idea about the origin of this problem...

Regards,

Abdelmonaim Mokadem.

_______________________________________________
Snort-devel mailing list
Snort-devel () lists sourceforge net
https://lists.sourceforge.net/lists/listinfo/snort-devel
