Re: [Theory] Time for a new FWTK? (long)


From: Rick_Giering_at_mpg003 () ccmailgw mcgawpark baxter com
Date: Mon, 1 Dec 1997 14:54:35 -0600

     With all of the comments on a new FWTK and such, I couldn't hold back 
     my two cents. This is pretty long and borders on a rant. For those who 
     don't want to read it, sorry for wasting bandwidth. For those who do, 
     your comments are welcome.
     
     "Marcus J. Ranum" <mjr () nfr net> at Internet said:
     
     >I'm pretty much done with firewalls. :) The problem is that I don't 
     >know *HOW* to build the next generation of firewalls, and I don't 
     >want to build another of the previous generation. "been there, done 
     >that" repeatedly...
     
     This is an issue I've thought a lot about over the past 2.5 years. It 
     started when I became responsible for a firewall and two web servers 
     and had to deal with end users, application developers, their 
     management, and my own. After much thought, gnashing of teeth, and high 
     blood pressure, I've come to the conclusion that the "next generation" 
     can't be built (or at least not without major changes to other things 
     outside the IT/Internet arena).
     
     [snip]
     >FWTK was good while it lasted; time to move on.
     
     I agree and would like to thank you for all of your work. I shudder to 
     think where things would be without a benchmark like the FWTK. It, at 
     least, got people thinking along the right track!
     
     <"Next generation Firewall" discussion start>
     
     To me, a firewall is supposed to:
        1) protect against private information flowing out
        2) protect against malicious applets flowing in
        3) control what content internal users can access
        4) protect against malicious users gaining access inside
        5) protect against DOS attacks on machines available for public use
        6) add your own "supposed to" here...
     
     #1, #4, and #5 are some of the original reasons the firewall concept 
     was created. 
     
     #2 above includes program and macro viruses and is independent of the 
     transport used. For example, it covers viruses delivered via email, web 
     and ftp downloads, etc.
     
     #3 above is the standard "don't view porno on the company asset" 
     argument.
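     
     To make #2 concrete: the check has to live on the payload, not the 
     transport. Here's a rough sketch in Python (mine, purely illustrative; 
     the signature strings and helper names are made up, not taken from any 
     real scanner):
     
         # Minimal sketch of goal #2: the test is on the payload, not the
         # transport it arrived over. Signatures here are hypothetical.
         MACRO_VIRUS_SIGNATURES = [b"AutoOpen", b"Document_Open"]
     
         def looks_malicious(payload: bytes) -> bool:
             """Same test whether the bytes came in by SMTP, FTP, or HTTP."""
             return any(sig in payload for sig in MACRO_VIRUS_SIGNATURES)
     
         def scan_mail_attachment(data: bytes) -> bool:
             return looks_malicious(data)
     
         def scan_ftp_download(data: bytes) -> bool:
             return looks_malicious(data)
     
         def scan_http_response(data: bytes) -> bool:
             return looks_malicious(data)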
     
     I'm willing to be shown wrong, but I believe the current generation of 
     firewalls is focused on tracking and understanding the application-level 
     protocols and the data that flows through them. Good examples 
     are SMTP, FTP, and HTTP. This even covers "stateful firewalls." 
     Failing this, the only option is exclusion by IP address. I believe 
     MJR touched on this in one of his posts. 
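     
     To make that concrete, here's a rough sketch (mine, not FWTK code) of 
     the two positions a firewall can be in. The verb list and the address 
     list below are assumptions for illustration only:
     
         # If the firewall has been taught the protocol, it can judge each
         # command; if it hasn't, all it can do is allow/deny by address.
         ALLOWED_SMTP_VERBS = {"HELO", "EHLO", "MAIL", "RCPT", "DATA",
                               "RSET", "NOOP", "QUIT"}
         BLOCKED_SOURCES = {"10.0.0.66"}   # hypothetical bad-host list
     
         def filter_smtp_line(line: str) -> bool:
             """Pass a client command only if its verb is in the known set."""
             words = line.split()
             return bool(words) and words[0].upper() in ALLOWED_SMTP_VERBS
     
         def filter_unknown_protocol(src_ip: str) -> bool:
             """No protocol knowledge: exclusion by address is all that's left."""
             return src_ip not in BLOCKED_SOURCES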
     
     This might have worked at one time, when there were a few "standard" 
     protocols that were fairly simple, but not today. And this approach 
     won't work in the future as more and more developers use RPC 
     technology instead of a simple ASCII conversation-style protocol. I'm 
     dreading the day when CIFS (i.e. MS file sharing) is a "standard" and 
     people will expect firewalls to protect them.
     
     I've gotten a step ahead of myself. Let's talk about the future 
     environment the next generation of firewalls will be expected to protect 
     and deal with. Here are my guesses (... pulls crystal ball from under 
     the desk ...):
     
        1) Users and their management will continue to ignore security just 
     like they ignore power, water, air conditioning, and other "facility" 
     kinds of things. Therefore, it'll continue to be underfunded and an 
     "add-on" function to someone's duties.
        2) Business Management (higher than IT management!) will continue 
     to view data security as an IT concern instead of a true business 
     concern. This view will flow down through middle management to the 
     "troops."
        3) Users will continue to want "cool" and useful applications/applets 
     without having to worry about security issues. If security does get in 
     the way, it'll be sacrificed in order to get the applet to work.
        4) Developers will continue to focus on providing "cool" and useful 
     applications without (or with very little) regard to security issues.
        5) Most/all applications/applets, including email, will have 
     "scripting" ability embedded! Yes, the "Good Times" hoax (i.e. just 
     reading a message will cause a virus attack) will then be possible.
        6) Using RPC means the end of "well-known ports" for most 
     applications and the rise in importance of the "port mapper" port (see 
     the sketch after this list). The other alternative is the DNS SRV record 
     model ("_smtp._tcp.domain"), which just changes the port mapper function 
     into a DNS/directory function. Note: this DNS/directory function will 
     quickly have to handle different versions like the port mapper model does!
        7) Every little app will have its own set of RPC programs, 
     requiring an automated and transparent method of upgrading both the client 
     and server sides of an application (a la Pointcast!).
        8) Sandboxes for applets will have so many holes punched in them to 
     make them user-friendly and useful that the sandbox concept will be 
     useless, except for specialized uses such as consumer-type hardware that 
     doesn't interface with much.
        9) IPv6 and IPSec will only make the firewall's job tougher as 
     developers use their security features as an excuse to "go around the 
     firewall."
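     
     To show why the "port mapper" port in #6 matters so much, here's a rough 
     sketch of an ONC RPC GETPORT query in Python. It's mine, not production 
     code: it assumes AUTH_NONE credentials, assumes the reply carries an 
     empty verifier, and does no error handling.
     
         import socket
         import struct
     
         PMAP_PROG, PMAP_VERS, PMAP_GETPORT = 100000, 2, 3
     
         def rpc_getport(host, prog, vers, proto=17):
             """Ask host's portmapper (UDP/111) where 'prog' is listening.
             The only fixed port is the portmapper's own; everything else
             is handed out at runtime, so a static filter can't know it."""
             call = struct.pack(
                 ">IIIIII II II IIII",
                 0x1234abcd, 0,              # xid, msg_type = CALL
                 2,                          # RPC protocol version
                 PMAP_PROG, PMAP_VERS, PMAP_GETPORT,
                 0, 0,                       # cred: AUTH_NONE, length 0
                 0, 0,                       # verf: AUTH_NONE, length 0
                 prog, vers, proto, 0)       # GETPORT arguments
             with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
                 s.settimeout(3)
                 s.sendto(call, (host, 111))
                 reply, _ = s.recvfrom(1024)
             # xid, REPLY, MSG_ACCEPTED, empty verf, SUCCESS, then the port.
             return struct.unpack(">I", reply[24:28])[0]
     
         # e.g. rpc_getport("fileserver.example.com", 100003, 2)  # NFS v2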
     
     The following is more of a timeline than a set of static predictions...
        A) Developers will continue to use existing protocols as transports 
     for their applications just like Pointcast. These will deliver both 
     content and applets (macros, Java, JScript, VBScript, ActiveX, etc.).
        B) Firewalls will try to recognize these uses and add controls for 
     them (revoking Java, detecting macro viruses, etc.).
        C) Developers will move to encryption/compression both to protect their 
     content and applets and as a way to defeat these controls. (Has anyone 
     tried to detect stuff coming through an SSL pipe? I don't think so. See 
     the sketch after this list.)
        D) Once Microsoft makes it brain-dead easy to develop client/server 
     apps using RPC (probably using COM/DCOM), developers will move 
     to it very quickly. The result will be many holes punched through a 
     firewall, one for each application/version.
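     
     Point C is easy to demonstrate. Once the SSL handshake is done, all a 
     pass-through relay ever sees is an "application data" record: a 5-byte 
     header and then ciphertext. A rough sketch (mine, assuming the standard 
     SSLv3/TLS record layout):
     
         def classify_tls_record(record: bytes) -> str:
             """Everything a relay can learn from a post-handshake record."""
             if len(record) >= 5 and record[0] == 0x17:   # 0x17 = application data
                 length = int.from_bytes(record[3:5], "big")
                 return "application data, %d ciphertext bytes, contents unknown" % length
             return "not an application-data record"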
     
     Has anyone thought seriously about trying to firewall/proxy RPC with 
     an eye toward analysis and the "firewall supposed to's" above? I 
     don't think so. If they did, I think they'd either quickly get a 
     headache or their brain would explode!
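     
     For anyone who does want the headache, here's a sketch of the 
     bookkeeping such a proxy would need: one XDR decoder per (program, 
     version, procedure), which is exactly the knowledge that will be 
     proprietary and fast-changing. The program/procedure numbers below are 
     the real ONC RPC assignments for NFS and mountd; the handler names are 
     hypothetical.
     
         # Hypothetical dispatch table for an RPC-aware proxy. Every
         # program/version/procedure it's expected to analyze needs its
         # own decoder; anything else is an opaque blob to pass or drop.
         DECODERS = {
             (100003, 2, 4): "decode_nfs_lookup",   # NFS v2 LOOKUP
             (100003, 2, 6): "decode_nfs_read",     # NFS v2 READ
             (100005, 1, 1): "decode_mount_mnt",    # mountd v1 MNT
         }
     
         def can_analyze(prog, vers, proc):
             return (prog, vers, proc) in DECODERS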
     
     Not a very pretty picture. I don't think a next generation of 
     firewall can be built, because the knowledge of the protocols used 
     (mainly RPC-based) will be both proprietary and quickly changing. The 
     use of automatic upgrades will eliminate most compatibility problems 
     but will leave huge holes in a company's security environment.
     
     Regarding neural nets and fractal analysis: these are good ideas, but I 
     suspect they won't prove useful. I think their major failing will be 
     determining exactly what's wrong in a timely enough manner to act. I 
     have only limited knowledge in these areas and am open to discussion.
     
     This is already too long, so I'll cut it short. I do have a concept 
     that addresses these issues, but it's pretty radical and I doubt I'll 
     see it for a great many years. It involves users' and developers' 
     mindsets; network, host, and application security; and how data exists 
     in and out of an OS. If anyone is interested, we can discuss it.
     
     <"Next generation Firewall" discussion end>
     
     Comments anyone?
     
     Rick Giering
     Note: These opinions are my own and have nothing to do with my 
     employer.
     
     
     
     
     


