Archive for February, 2008

America’s Next “Security Idol”

February 7th, 2008

If you haven’t got enough of Nir Zuk talking, how about his gangsta rap?

I present you with "Security Idol" featuring contestants: Junne Ipper, Chuck Point and Paolo Alto.

Personally, I think Paula’s kinda hot in this video…

Ya gotta love marketing…if you don’t figure it out by the end of the video, this is a viral effort by Palo Alto Networks.  Funny.

If you’ve got scripting disabled, here’s the link to the video.

Categories: General Rants & Raves

Security Today == Shooting Arrows Through Sunroofs of Cars?

February 7th, 2008

In this Dark Reading post, Peter Tippett, described as the inventor of what is now Norton Anti-virus, suggests that the bulk of InfoSec practices are "…outmoded or outdated concepts that don’t apply to today’s computing…"

As I read through this piece, I found myself flip-flopping between violent agreement and incredulous eye-rolling from one paragraph to the next, caused somewhat by the overuse of hyperbole in some of his analogies.  This was disappointing, but overall, I enjoyed the piece.

Let’s take a look at Peter’s comments:

For example, today’s security industry focuses way too much time
on vulnerability research, testing, and patching, Tippett suggested.
"Only 3 percent of the vulnerabilities that are discovered are ever
exploited," he said. "Yet there is huge amount of attention given to
vulnerability disclosure, patch management, and so forth."

I’d agree that the "industry" certainly focuses its efforts on these activities, but that’s exactly the mission of the "industry" he helped create.  We, as consumers of security kit, have perpetuated a supply-driven security economy.

There’s a huge amount of attention paid to vulnerabilities, patching and prevention that doesn’t prevent because at this point, that’s all we’ve got.  Until we start focusing on the root cause rather than the symptoms, this is a cycle we won’t break.  See my post titled "Sacred Cows, Meatloaf, and Solving the Wrong Problems" for an example of what I mean.

Tippett compared vulnerability research with automobile safety
research. "If I sat up in a window of a building, I might find that I
could shoot an arrow through the sunroof of a Ford and kill the
driver," he said. "It isn’t very likely, but it’s possible.

"If I disclose that vulnerability, shouldn’t the automaker put in
some sort of arrow deflection device to patch the problem? And then
other researchers may find similar vulnerabilities in other makes and
models," Tippett continued. "And because it’s potentially fatal to the
driver, I rate it as ‘critical.’ There’s a lot of attention and effort
there, but it isn’t really helping auto safety very much."

What this really means, though Peter never quite states it, is that mitigating vulnerabilities in the absence of threat, impact or probability is a bad thing.  This is why I make such a fuss about managing risk instead of mitigating vulnerabilities.  If there were millions of malicious archers firing arrows through the sunroofs of unsuspecting Ford Escort drivers, then the ‘critical’ rating would be relevant given the probability and impact of all those slings and arrows of thine enemies…
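
The point can be made concrete with the classic expected-loss formulation, risk = probability × impact. The numbers below are purely hypothetical illustrations, not data from Tippett's article:

```python
def risk(probability: float, impact: float) -> float:
    """Classic expected-loss approximation: risk = probability x impact."""
    return probability * impact

# An arrow through a sunroof: catastrophic impact, negligible probability.
arrow_attack = risk(probability=0.000001, impact=1_000_000)   # -> 1.0

# A mundane rear-end collision: modest impact, far higher probability.
fender_bender = risk(probability=0.05, impact=5_000)          # -> 250.0

# Rating on impact alone ("potentially fatal, therefore critical")
# inverts the ordering that probability-weighted risk produces.
assert fender_bender > arrow_attack
```

Rating a vulnerability "critical" on impact alone ignores the probability term entirely, which is exactly the trap the arrow analogy describes.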

Tippett also suggested that many security pros waste time trying
to buy or invent defenses that are 100 percent secure. "If a product
can be cracked, it’s sometimes thrown out and considered useless," he
observed. "But automobile seatbelts only prevent fatalities about 50
percent of the time. Are they worthless? Security products don’t have
to be perfect to be helpful in your defense."

I like his analogy and the point he’s trying to underscore.  What I find in many cases is that the binary evaluation of security efficacy — in products and programs — still exists.  In the absence of measuring the actual impact something has on one’s risk posture, people revert to a non-gradient scale: 0% or 100%, insecure or secure.  Is being "secure" really important, or is managing to a level of risk that is acceptable — with or without losses — the truly relevant measure of success?

This concept also applies to security processes, Tippett said.
"There’s a notion out there that if I do certain processes flawlessly,
such as vulnerability patching or updating my antivirus software, that
my organization will be more secure. But studies have shown that there
isn’t necessarily a direct correlation between doing these processes
well and the frequency or infrequency of security incidents.

"You can’t always improve the security of something by doing it
better," Tippett said. "If we made seatbelts out of titanium instead of
nylon, they’d be a lot stronger. But there’s no evidence to suggest
that they’d really help improve passenger safety."

I would like to see these studies.  I think that companies who have rigorous, mature and transparent processes that they execute "flawlessly" may not be more "secure" (a measurement I’d love to see quantified), but they are in a much better position to respond and recover when (not if) an event occurs.  Since we’ve already established that we can’t be 100% "secure" in the first place, we know we’re going to have incidents.

Being able to recover from them or continue to operate while under duress is more realistic and important in my view.  That’s the point of information survivability.

Security teams need to rethink the way they spend their time,
focusing on efforts that could potentially pay higher security
dividends, Tippett suggested. "For example, only 8 percent of companies
have enabled their routers to do ‘default deny’ on inbound traffic," he
said. "Even fewer do it on outbound traffic. That’s an example of a
simple effort that could pay high dividends if more companies took the
time to do it."

I agree.  Focusing on efforts that eliminate entire classes of problems based upon reducing risk is a more appropriate use of time, money and resources.
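
"Default deny" eliminates a whole class of problems because anything not explicitly allowed is dropped, including services nobody remembered to think about. A minimal sketch of the idea (a hypothetical rule format, not any particular firewall's syntax):

```python
# Each allow rule is (protocol, port); everything else is denied by default.
ALLOW_INBOUND = [("tcp", 443), ("tcp", 25)]

def inbound_verdict(protocol: str, port: int) -> str:
    """Default-deny: only explicitly whitelisted traffic gets through."""
    if (protocol, port) in ALLOW_INBOUND:
        return "allow"
    return "deny"  # the default catches every service you forgot about

assert inbound_verdict("tcp", 443) == "allow"   # explicitly allowed
assert inbound_verdict("tcp", 3389) == "deny"   # never whitelisted
```

Contrast this with default-allow, where safety depends on enumerating every bad thing, an inherently unwinnable game.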

Security awareness programs also offer a high
rate of return, Tippett said. "Employee training sometimes gets a bad
rap because it doesn’t alter the behavior of every employee who takes
it," he said. "But if I can reduce the number of security incidents by
30 percent through a $10,000 security awareness program, doesn’t that
make more sense than spending $1 million on an antivirus upgrade that
only reduces incidents by 2 percent?"
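
Tippett's cost-effectiveness arithmetic is easy to check. Using his spend and reduction figures, with a hypothetical baseline of 100 incidents per year (the baseline is my assumption, not his):

```python
def cost_per_incident_avoided(spend: float, baseline_incidents: float,
                              reduction_pct: float) -> float:
    """Dollars spent per incident prevented."""
    return spend / (baseline_incidents * reduction_pct)

BASELINE = 100  # hypothetical incidents per year

awareness = cost_per_incident_avoided(10_000, BASELINE, 0.30)      # ~$333 each
antivirus = cost_per_incident_avoided(1_000_000, BASELINE, 0.02)   # $500,000 each

assert awareness < antivirus  # awareness wins by three orders of magnitude
```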

Nod.  That was the point of the portfolio evaluation process I gave in my disruptive innovation presentation:

24. Provide Transparency in portfolio effectiveness

I didn’t invent this graph, but it’s one of my favorite ways of
visualizing my investment portfolio by measuring in three dimensions:
business impact, security impact and monetized investment.  All of
these definitions are subjective within your organization (as well as
how you might measure them.)

The Y-axis represents the "security impact" that the solution
provides.  The X-axis represents the "business impact" that the
solution provides while the size of the dot represents the capex/opex
investment made in the solution.

Each of the dots represents a specific solution in the portfolio.

If you have a solution that shows up as a large dot toward the bottom-left of the graph, you have to question the reason for continued investment, since it provides little in the way of perceived security and business value at high cost.  On the flip side, if a solution is represented by a small dot in the upper-right, the bang for the buck is high, as is the impact it has on the organization.

The goal would be to get as many of the investments in your portfolio as possible from the bottom-left to the top-right with the smallest dots possible.
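
The quadrant logic described above can be sketched without the chart itself. The 0-to-1 scales and the 0.5 thresholds here are hypothetical; as noted, the definitions and measurements are subjective within your organization:

```python
def portfolio_verdict(security_impact: float, business_impact: float,
                      investment: float) -> str:
    """Rough triage of one portfolio 'dot': all inputs on a 0-1 scale.
    investment stands in for the dot size (normalized capex/opex)."""
    high_value = security_impact >= 0.5 and business_impact >= 0.5
    if high_value and investment <= 0.5:
        return "keep: high bang for the buck"    # small dot, upper-right
    if not high_value and investment > 0.5:
        return "question continued investment"   # large dot, bottom-left
    return "review"

assert portfolio_verdict(0.8, 0.9, 0.2) == "keep: high bang for the buck"
assert portfolio_verdict(0.1, 0.2, 0.8) == "question continued investment"
```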

This transparency, and the process by which the portfolio is assessed, is delivered as an output of the strategic innovation framework, which is part art and part science.

All in all, a good read from someone who helped create the monster and is now calling it ugly…


The Best Defense is Often, Well, The Best Defense…

February 6th, 2008

As it goes in football, so it goes in life…

I delivered the closing presentation of the InfoWorld Executive Virtualization Forum in San Francisco on Monday.  The title of my presentation, which I will upload soon, was "Addressing Security Concerns in Virtual Environments."

The conference was a good mix of panels and presentations giving some excellent perspective to senior-level managers and executives on virtualization and its impact.

The night before was obviously the Super Bowl and InfoWorld hosted a get-together complete with beer, snacks and a big screen for us to watch the Big Game.  Most of the InfoWorld staff are out of the MA area, so except for a few Giants fans, it was a room packed with Pats fanatics. 

Ultimately, sad, depressed, and shocked Pats fanatics…

So the next day, after having to listen to David Reilly, Head of Technology Infrastructure Services at Credit Suisse — an Irishman who grew up in England and now lives in New York — deliver a fantastic keynote while bleating on about "his beloved Giants," I thought it only appropriate to take one last stab at regaining my pride.

So, when it was my turn to speak, I slipped a borrowed Randy Moss jersey over my silk shirt and took the stage to stares of bewilderment and confusion.

I explained my costume and expressed my disappointment with the team’s performance in one fell swoop:

You may be wondering why I’m up here presenting in my beloved Patriots uniform.  Well, this *is* a security presentation, so I thought I could give you no more spectacular illustration of what happens when you fail to execute on a defensive strategy than this (pointing to the jersey).

Further, I find it completely amusing and apropos to be standing here in a virtualization conference talking about security *last* in the order of things because that’s exactly the problem I want to talk about…

The crowd seemed to enjoy those couple of opening shots and the rest went quite well — I try to make stabs at involving the audience.  I always gauge the success of a show by how many people come up and talk to me at the podium and afterwards.  By all accounts, it rocked since I spent the next 45 minutes talking to the 30+ folks that engaged me between the podium and the beer stand.

Adrian Lane was kind enough to blog about my performance here…

I very much enjoyed the conversation that ensued with some really interesting people.

Looking forward to the next one in NY in the November timeframe.

Hope to see you there.


Travel: Off to Northern California (SF, San Jose, Sunnyvale) for the Week…

February 2nd, 2008

I’ll be in San Francisco starting Sunday (2/3/08) around midday.  I’ll be in the bar @ the Hotel Nikko watching the Pats annihilate the Giants…

I’ll be presenting at the InfoWorld Executive Virtualization Forum on Monday and then visiting various folks throughout the Sunnyvale, San Jose and Los Altos area.  I’m leaving on Thursday (2/7/08).

Most of my appointments are during the day, so if you’re around and want to grab a drink/dinner, send me an email (choff [@] or call my call router (it will find me) @ 978.631.0302


Categories: Travel

OMG, Availability Trumps Security! Oh, the Horror!

February 1st, 2008

Michael Farnum’s making me shake my head today in confusion based upon a post wherein he’s shocked that some businesses may favor availability over (ahem) "security." 

Classically we’ve come to know and love (a)vailability as a component of security — part of the holy triumvirate paired with (c)onfidentiality and (i)ntegrity — but somehow it’s now unthinkable that one of these concerns can matter more to a business than the others.

If one measures business impact against an asset, are you telling me, Mike, that all three are always equal?  Of course not…

Depending upon what’s required to maintain operations as a going concern, or what the business decides is most critical, being available even under degraded service levels may be more important than preserving or enforcing confidentiality and integrity.  To some, it may not.

The reality is that this isn’t an issue of absolutes.  The measured output of the investments in C, I and A isn’t binary — you’re not either 0% or 100% secure.  There are shades of gray.  Decisions are often made such that one of the elements of C, I and A is deemed more relevant or more important.

Businesses often decide to manage risk by trading off one leg of the stool for another.  You may very well end up with a wobbly seat, but there’s a difference between what we see in textbooks and the realities in the field.

Deal with it.  Sometimes businesses make calculated bets that straddle the fine line of acceptable loss and readiness, and decide to invest in certain things versus others.  Banks do this all the time.  Their goal is to be right more often than they are wrong.  They manage their risk.  They generally do it well.  Depending upon the element in question, sometimes A wins.  Sometimes it doesn’t.

Here’s a test.  Go turn off your Internet firewall and tell everyone you’re perfectly secure now.  Will everyone high-five you for a job well done? 

Firewall’s down.  Business stops.  Not for "security’s sake."  Pushed the wrong button…

Compensating controls can help offset effects against C and I, but if an asset or service is not A(vailable) what good is it?  Again, this depends on the type of asset/service and YMMV.  Sometimes C or I win.

Thanks to the glut of security band-aids we’ve thrown at tackling "security" problems these days, availability has become — quite literally — a function of security.  As we see the trend move from managing "security" toward managing "risk," we’ll see more of this heresy (read: common sense) appear as mainstream thinking.

Since we can’t seem to express (for the most part) how things like firewalls translate to a healthier bottom line, better productivity or efficiency, it’s no wonder businesses are starting to look to actionable risk management strategies that focus on operational business impact instead.

Measuring availability (at the macro level or transactionally) is easy.  IT knows how to do this.  Either something is available or it isn’t.  How do you measure confidentiality or integrity as a repeatable metric?
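
Availability really is the easy one precisely because it reduces to a ratio over a measurement window. A trivial sketch (the 30-day window and downtime figure are illustrative assumptions):

```python
def availability(uptime_seconds: float, total_seconds: float) -> float:
    """Availability as a percentage of the measurement window."""
    return 100.0 * uptime_seconds / total_seconds

# A 30-day month with 43 minutes of downtime is roughly "three nines".
month = 30 * 24 * 3600
downtime = 43 * 60
assert round(availability(month - downtime, month), 2) == 99.9
```

There is no comparably crisp, repeatable ratio for "how confidential" or "how intact" an asset was this month, which is the point.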

In my comment to Michael (and Kurt Wismer) I note:

It’s funny how allergic you and Wismer are toward the notion that managing risk may mean that “security” (namely C and I) isn’t always the priority.  Basic risk assessment process shows us that in many cases “availability” trumps "security." 

This can’t be a surprise to either of you.

Depending upon your BCP/DR/Incident Response capabilities, the notion of a breakdown in C or I can be overcome by resilience that also has the derivative effect of preserving A.

Risk Management != Security. 

However, good Security helps to reinforce and enforce those things which lend themselves toward making better decisions on how to manage risk.

What’s so hard to understand about that?

Sounds perfectly reasonable to me.

Security’s in the eye of the beholder.  Stop sticking your thumb in yours 😉

Speaking of which Twitter’s down. Damn!  Unavailability strikes again!