Educause Security Discussion mailing list archives

Re: Effective Practice / Question/Expertise needed


From: jeff murphy <jcmurphy () BUFFALO EDU>
Date: Tue, 2 Jun 2009 17:32:15 -0400

I read his message as:

- access to sensitive data is very often accomplished via a browser these days
- browsers have a lot of bells and whistles developed by pretty much anyone
- should that worry him?

My answer is "yes".

For the "router" piece, yes, it's true we don't do a code review, but we have a company behind that product and a maintenance contract, and we trust that the company is somewhat sane in its development practices (given its profit motive and competitive orientation). We also recognize that not just anyone can write and release router code (or plugins for routers). So statistically speaking (and what is risk, after all, but statistics), I don't think router vs. browser is a fair comparison.

For the "browser" piece, it's a different beast. The browser code is accessible enough (the plugins in particular) that it deserves closer scrutiny. Fortunately, you can approach risk control by mechanisms other than code reviews (which require specialized skills and, given the diversity of plugins, probably aren't scalable).

My opinion is that browser instances that access sensitive information should not be the same as the browser instances you use for YouTube, CNN, etc. There are ways to make this simpler for the user, for example giving them an icon like "click here for PeopleSoft" which launches a separate browser for them to use. That browser, without them knowing it, would be running in something like Neocleus or VMware View, and would be carefully pruned/controlled so as to limit unnecessary plugins. You can also limit that browser to only your PeopleSoft service so that they can't type in any ol' URL.
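
To make that concrete, here is a minimal launcher sketch (Python). It assumes a current Firefox with enterprise policy support; the profile name, the PeopleSoft URL, and the policy path are placeholders, and the Neocleus/VMware View virtualization layer is left out entirely.

# Rough sketch of a "click here for PeopleSoft" launcher. It starts a
# dedicated Firefox profile (separate cookies, cache, and add-ons from the
# everyday browser) and, optionally, writes an enterprise policies.json that
# blocks navigation to anything except the PeopleSoft host. The URL, profile
# name, and policy path are illustrative assumptions, not real values.
import json
import pathlib
import subprocess

PEOPLESOFT_URL = "https://peoplesoft.example.edu/"   # placeholder URL
PROFILE_NAME = "peoplesoft-only"                     # placeholder profile name,
                                                     # created once beforehand with:
                                                     # firefox -CreateProfile peoplesoft-only

POLICIES = {
    "policies": {
        # Allow only the PeopleSoft host; block everything else.
        "WebsiteFilter": {
            "Block": ["<all_urls>"],
            "Exceptions": ["https://peoplesoft.example.edu/*"],
        },
        # Keep the profile pruned: no user-installed extensions.
        "ExtensionSettings": {"*": {"installation_mode": "blocked"}},
    }
}

def write_policies(policy_dir: pathlib.Path) -> None:
    # Firefox reads policies.json from a "distribution" folder next to the
    # binary, or from /etc/firefox/policies/ on many Linux packages -- adjust.
    policy_dir.mkdir(parents=True, exist_ok=True)
    (policy_dir / "policies.json").write_text(json.dumps(POLICIES, indent=2))

def launch() -> None:
    # -no-remote keeps this instance from joining the user's normal session;
    # -P points it at the dedicated, locked-down profile.
    subprocess.run(
        ["firefox", "-no-remote", "-P", PROFILE_NAME, PEOPLESOFT_URL],
        check=False,
    )

if __name__ == "__main__":
    # write_policies(pathlib.Path("/etc/firefox/policies"))  # run once, as admin
    launch()

Even without the virtualization layer, the separate profile keeps the sensitive session's cookies and plugins out of the browser people use for everything else.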

jeff


On Jun 2, 2009, at 5:07 PM, William Forte wrote:

I don't suppose you do code reviews on all the routing equipment between your host and the servers either, or on the encryption algorithms utilized, or on the implementation of the protocols being utilized, or on the OS kernel. I doubt you check the wire between your keyboard and computer tower for hardware keyloggers every time you sit down. If you worry about every single link in the chain then you're going to go insane, but there are a few layers of security that you can generally utilize to mitigate those risks.

Firstly, you probably shouldn't be putting sensitive or confidential information into Google's system at all, since you just don't know what level of access to, or use of, that information their employees or processes are going to have. You can PGP sign and encrypt your emails to reduce the risk to sensitive communications, treating the entire email exchange as transit through a non-trusted system. No matter what, you can't personally engineer or review every piece of code involved in the exchange, but some risks are simply stupid risks to take, while others are acceptable risks that just need to be evaluated.
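
As a small illustration of that last point, here is a sketch (Python, using the python-gnupg package) that signs and encrypts a message before it ever touches the mail system. It assumes GnuPG and an existing keypair are already set up; the key IDs, keyring path, and passphrase handling are placeholders.

# Treat the mail system as untrusted transit: signing proves who sent the
# message, encryption keeps the content unreadable on servers you don't
# control. All identifiers below are placeholders.
import gnupg

gpg = gnupg.GPG(gnupghome="/home/jim/.gnupg")   # placeholder keyring path

SENDER_KEY = "jim@example.edu"      # placeholder: key used to sign
RECIPIENT_KEY = "iso@example.edu"   # placeholder: recipient's public key

body = "Sensitive incident details go here."

# Sign with the sender's key and encrypt to the recipient in one step.
result = gpg.encrypt(body, RECIPIENT_KEY, sign=SENDER_KEY,
                     passphrase="placeholder-passphrase")

if result.ok:
    print(str(result))   # ASCII-armored output; paste or attach to the email
else:
    raise RuntimeError("encryption failed: " + result.status)

Whether the provider can read the content then depends on key management, not on how much of their stack you can review.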

Respectfully,
William Forte
Information Security Specialist
Information Security Office - University of Rhode Island


On Tue, 2 Jun 2009 16:49:57 -0400, James Moore <jhmiso () RIT EDU> wrote:

Sorry for the cross-post. I posted this to the REN-ISAC discussion list. I only got one response, and that was asking if anyone had responded to me off-list. That is when I thought that maybe the question needed more visibility. It has to do with browser security, and plugins, helper objects, controls, and widgets.

I accidentally logged into my iGoogle page that I normally reserve for home. I meant to log in to Gmail, to check my alerts for form spam on campus.

But I got to wondering about the way that I was using iGoogle. It is very handy at organizing information. But I don't know how to code review its widgets. Then I was forced to admit to myself that I use Firefox plug-ins that I don't do code reviews on either. I tend to manage risk by using reputation, recommendations (often from people that I don't know), and popularity/number of downloads.
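
One hypothetical way to make those informal signals explicit, which anticipates the "more quantitative process" question below, is to weight them and produce a comparable number per plug-in. The factors, weights, and thresholds in this Python sketch are invented for illustration, not a vetted risk model.

# Deliberately crude scoring: turn "reputation, recommendations, popularity"
# into a number that can be compared across plug-ins and revisited over time.
from dataclasses import dataclass

@dataclass
class PluginFacts:
    name: str
    downloads: int             # e.g. addons.mozilla.org download count
    vendor_known: bool         # named, reachable developer or company?
    recommended_by_peers: int  # colleagues/ISAC members who vouch for it
    days_since_update: int     # stale code tends to accumulate unpatched bugs

def risk_score(p: PluginFacts) -> float:
    """Higher score = higher risk. The weights are arbitrary starting points."""
    score = 10.0
    if p.downloads > 1_000_000:
        score -= 3            # very popular: more eyes, faster bug reports
    elif p.downloads > 100_000:
        score -= 2
    if p.vendor_known:
        score -= 2
    score -= min(p.recommended_by_peers, 3)   # cap the credit for word of mouth
    if p.days_since_update > 365:
        score += 2            # unmaintained code is a standing liability
    return max(score, 0.0)

if __name__ == "__main__":
    example = PluginFacts("hypothetical-toolbar", 45_000, False, 0, 500)
    print(example.name, risk_score(example))   # high score -> review or block

Even a crude score like this forces the inputs to be written down, which is the step that reputation-by-feel skips.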

I was wondering if anyone had a more quantitative process for managing risk in these areas. The browser is at the crossroads of so much sensitive data. Certifying or controlling extensions seems to be prudent. At the same time, I haven't found many tools that inventory or analyze plug-ins, accelerators, browser helper objects, etc. And the effectiveness of CWSandbox and Norman Sandbox on these types of objects is not known.
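
On the inventory point, even a short script that dumps what a profile has installed gives you something to review or score. This sketch reads Firefox's per-profile add-on metadata; the extensions.json file name and its fields are what current Firefox versions use (older versions used install.rdf), and the profile path is a placeholder.

# Bare-bones plug-in inventory for a Firefox profile.
import json
import pathlib

def list_addons(profile_dir: str) -> list[dict]:
    """Return id/name/version/enabled for each add-on recorded in the profile."""
    meta = pathlib.Path(profile_dir).expanduser() / "extensions.json"
    if not meta.exists():
        return []
    data = json.loads(meta.read_text(encoding="utf-8"))
    inventory = []
    for addon in data.get("addons", []):
        inventory.append({
            "id": addon.get("id"),
            "name": (addon.get("defaultLocale") or {}).get("name"),
            "version": addon.get("version"),
            "enabled": addon.get("active"),
            "type": addon.get("type"),   # "extension", "theme", ...
        })
    return inventory

if __name__ == "__main__":
    # Placeholder path -- point this at a real profile directory.
    for addon in list_addons("~/.mozilla/firefox/example.default"):
        print(addon)

Browser helper objects on the Internet Explorer side are registered in the Windows registry rather than in a profile file, so they would need a separate collector.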

Then I wondered if anyone had reduced a more quantitative risk management process to layman's terms (i.e., a policy and an end users' guide to what you need to know about browser plugins).

I am also looking for a cost/benefit analysis of using browser plug-ins, accelerators, browser helper objects, iGoogle widgets, etc.

Thanks,

Jim

- - - -

Jim Moore, CISSP, IAM
Information Security Officer
Rochester Institute of Technology
151 Lomb Memorial Drive
Rochester, NY 14623-5603
(585) 475-5406 (office)
(585) 255-0809 (Cell - Incident Reporting & Emergencies)
(585) 475-7920 (fax)

"If you consciously try to thwart opponents, you are already late." Miyamoto Musashi, Japanese philosopher/samurai, 1645

"If we do not, on a national scale, attack organized criminals with weapons and techniques as effective as their own, they will destroy us." Robert F. Kennedy, 1960

Confidentiality Notice: Do the right thing. If this has the words "Confidential" or "Private" in the subject line, or similar language in the email body, or as a label on any attachment, then think. Do you know me? Did you expect to receive this? Do you recognize and work with the other addressees? If not, then you probably received this in error. Please, be respectful and courteous, and delete it immediately. Please don't forward it to anyone.

Now, wasn't that simple? Just ask yourself: if you had made an error in a sensitive email, and I received it, what would you want me to do with it?