Archive for November, 2007

Security and Disruptive Innovation Part IV: Embracing Disruptive Innovation by Mapping to a Strategic Innovation Framework

November 29th, 2007 4 comments

This is the last of the series on the topic of "Security and Disruptive Innovation."

In Part I we talked about the definition of innovation, cited some examples of general technology innovation/disruption, discussed technology taxonomies and lifecycles, and looked at what initiatives and technologies CIOs are investing in.

In Parts II and III we started to drill down and highlight some very specific disruptive technologies that were impacting Information Security.

In this last part, we will explore how to take these and future examples of emerging disruptive innovation and map them to a framework which will allow you to begin embracing them rather than reacting to disruptive innovation after the fact.

21. So How Can We Embrace Disruptive Technology?
Most folks in an InfoSec role find themselves overwhelmed juggling the day-to-day operational requirements of the job against the onslaught of evolving technology, business, culture, and economic "progress"  thrown their way.

In most cases this means that they’re rather busy mitigating the latest threats and remediating vulnerabilities in a tactical fashion and find it difficult to think strategically and across the horizon.

What’s missing in many cases is the element of business impact: in conjunction with those threats and vulnerabilities, the resultant impact should drive the decision of what to focus on and how to prioritize actions based on whether they actually matter to your most important assets.

Rather than managing threats and vulnerabilities without context and blindly deploying more technology, we need to find a way to better manage risk.

We’ll talk about getting closer to assessing and managing risk in a short while, but if we look at what managing threats and vulnerabilities as described above entails, we usually end up in a discussion focused on technology.  Accepting this common practice today, we need a way to effectively leverage our investment in that technology to get the best bang for our buck.

That means we need to actively invest in and manage a strategic security portfolio — like an investor might buy/sell stocks.  Some items you identify and invest in for the short term and others for the long term.  Accordingly, the taxonomy of those investments would also align to the "foundational, commoditizing, distinguished" model previously discussed so that the diversity of the solution sets can be associated, timed and managed across the continuum of investment.

This means that we need to understand how technology, business, culture and economics intersect to affect the behavior of adopters of disruptive innovation so we can understand where, when, how and if to invest.

If this is done rationally, we will be able to demonstrate how a formalized innovation lifecycle management process delivers transparency and provides an RROI (reduction of risk on investment) over the life of the investment strategy.

It means we will have a much more leveraged ability to proactively invest in the necessary people, process and technology ahead of the mainstream emergence of the disruptor by building a business case to do so.

Let’s see how we can do that…

22. Understand Technology Adoption Lifecycle

This model is what we use to map the classical adoption cycle of disruptive innovation/technology and align it to a formalized strategic innovation lifecycle management process.

If you look at the model on the top/right, it shows how innovators initially adopt "bleeding edge" technologies/products which through uptake ultimately drive early adopters to pay attention.

It’s at this point within the strategic innovation framework that we identify and prioritize investment in these technologies as they begin to evolve and mature.  As business opportunities avail themselves and these identified and screened disruptive technologies are vetted, certain of them are incubated and seeded as they become emerging solutions which add value and merit further investment.

As they mature and "cross the chasm," the early majority begins to adopt them and these technologies become part of the portfolio development process.  Some of these solutions will, over time, go away due to natural product and market behaviors, while others traverse the entire area under the curve and are managed accordingly.

Pairing the appetite of the "consumer" against the maturity of the product/technology is a really important point.  Constantly reassessing the value brought to the mat by the solution and whether a better, faster, cheaper mousetrap may be present already on your radar is critical.

This isn’t rocket science, but it does take discipline and a formal process.  Understanding how the dynamics of culture, economy, technology and business are changing will only make your decisions more informed and accurate and your investments more appropriately aligned to the business needs.

23. Manage Your Innovation Pipeline

This slide is another example of the various mechanisms for managing your innovation pipeline.  It is a representation of how one might classify and describe a technology over time as it matures into a portfolio solution:

     * Sensing
     * Screening
     * Developing
     * Commercializing

In a non-commercial setting, the last stage might be described as "blessed" or something along those lines.

The inputs to this pipeline are just as important as the outputs; taking cues from customers and from internal and external market elements is critical for a rounded decision fabric.  This is where that intersection of forces comes into play again.  Looking at all the elements and formally evaluating your efforts, the portfolio and the business needs yields a really interesting by-product: transparency…
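To make the pipeline a bit more concrete, here is a minimal sketch (my own illustration with hypothetical candidates and sponsors; only the four stage names come from the slide) of how one might track technologies as they move from sensing through commercialization:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Stage(Enum):
    """Innovation pipeline stages from the slide above."""
    SENSING = auto()
    SCREENING = auto()
    DEVELOPING = auto()
    COMMERCIALIZING = auto()   # or "blessed" in a non-commercial setting


@dataclass
class Candidate:
    name: str
    stage: Stage
    sponsor: str          # who inside the business is asking for it

    def advance(self) -> None:
        """Move the candidate to the next stage of the pipeline."""
        stages = list(Stage)
        idx = stages.index(self.stage)
        if idx < len(stages) - 1:
            self.stage = stages[idx + 1]


# A toy pipeline: the inputs (customer cues, market signals) drive what gets sensed.
pipeline = [
    Candidate("Server virtualization", Stage.DEVELOPING, sponsor="IT Ops"),
    Candidate("SaaS e-mail filtering", Stage.SCREENING, sponsor="Messaging"),
    Candidate("Consumer smartphones", Stage.SENSING, sponsor="Sales"),
]

for c in pipeline:
    print(f"{c.name:28s} {c.stage.name:16s} sponsor={c.sponsor}")
```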

24. Provide Transparency in Portfolio Effectiveness

I didn’t invent this graph, but it’s one of my favorite ways of visualizing my investment portfolio by measuring in three dimensions: business impact, security impact and monetized investment.  All of these definitions (as well as how you might measure them) are subjective within your organization.

The Y-axis represents the "security impact" that the solution provides.  The X-axis represents the "business impact" that the  solution provides while the size of the dot represents the capex/opex investment made in the solution.

Each of the dots represents a specific solution in the portfolio.

If you have a solution that is a large dot toward the bottom-left of the graph, one has to question the reason for continued investment since it provides little in the way of perceived security and business value with high cost.   On the flipside, if a solution is represented by a small dot in the upper-right, the bang for the buck is high as is the impact it has on the organization.

The goal would be to get as many of your investments in your portfolio from the bottom-left to the top-right with the smallest dots possible.
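As a rough sketch of how one might actually draw the graph (the portfolio entries, scores and spend figures below are entirely made up; score them however your organization measures impact), a simple bubble chart does the job:

```python
import matplotlib.pyplot as plt

# Hypothetical portfolio entries: (name, business impact, security impact, annual capex+opex in $K)
portfolio = [
    ("Managed firewall",     7, 6, 250),
    ("Legacy NIDS",          2, 3, 400),
    ("Vulnerability mgmt",   5, 7, 150),
    ("Full-disk encryption", 6, 8, 120),
    ("DLP pilot",            3, 5, 300),
]

fig, ax = plt.subplots()
for name, biz, sec, spend in portfolio:
    # Dot size scaled from spend; a bigger dot means a bigger investment.
    ax.scatter(biz, sec, s=spend, alpha=0.5)
    ax.annotate(name, (biz, sec), fontsize=8)

ax.set_xlabel("Business impact (subjective 0-10)")
ax.set_ylabel("Security impact (subjective 0-10)")
ax.set_xlim(0, 10)
ax.set_ylim(0, 10)
ax.set_title("Security portfolio: impact vs. investment")
plt.show()
```

The big dots crowding the lower left are the conversation starters.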

This transparency and the process by which the portfolio is assessed are delivered as outputs of the strategic innovation framework, which is really part art and part science.

25. Balancing Art and Science

Andy Jaquith, champion of all things measured (now at the Yankee Group, previously at security consultancy @Stake), wrote a very interesting paper that suggested we might learn quite a bit about managing a security portfolio from the investment community on Wall Street.

Andy suggested, as I alluded to above, that this portfolio management concept — while not exactly aligned — is indeed as much art as it is science, and elegantly argued that using a framework to define a security strategy over time is enabled by a mature process:

"While the analogy is imperfect, security managers should be able to use the tools of unique and systematic management to create more-balanced security strategies."

I couldn’t agree more 😉

26. How Are You Doing?


If your CEO/CIO/CFO came to you today and put in front of you this list of disruptive innovation/technology and asked how these might impact your existing security strategy and what you were doing about it, what would your answer be?

Again, many of the security practitioners I have spoken to can articulate in some form how their existing technology investments might be able to absorb some impact this disruption delivers, but many have no formalized process to describe why or how.

Luck?  Serendipity?  Good choices?  Common sense?

Unfortunately, without a formalized process that provides the transparency described above, it becomes very difficult to credibly demonstrate that the appropriate amount of long-term strategic planning has been done, and that will likely cause angst and concern in the next budget cycle when money for new technology is requested.

27. Ranum for President
At a minimum, what the business wants to know is whether, given the investment made, they are more or less at risk than they were before the investment was made (see here for what they really want to know.)

That’s a heady question and without transparency and process, one most folks would — without relying purely on instinct — have a difficult time answering.  "I guess" doesn’t count.

To make matters worse, people often confuse being "secure" with being less at risk, and I’m not sure that’s always a good thing.  You can be very secure but unfortunately make it very difficult for the business to conduct business.  This elevates risk, which is bad.

What we really seek to do is balance information sharing with the need to manage risk to an acceptable level.  So when folks ask if the future will be more "secure," I love to refer them to Marcus Ranum’s quote in the slide above: "…it will be just as insecure as it possibly can, while still continuing to function.  Just like it is today."

What this really means is that if we’re doing our job in the world of security, we’ll use the lens that a strategic innovation framework provides and pair it with the needs of the business to deliver a "security supply chain" that is just-in-time and provides no less and no more than what is needed to manage risk to an acceptable level.

I do hope that this presentation gives you some ideas as to how you might take a longer term approach to delivering a strategic service even in the face of disruptive innovation/technology.

/Hoff

Categories: Disruptive Innovation Tags:

Take5 (Episode #7) – Five Questions for Nir Zuk, Founder & CTO Palo Alto Networks

November 26th, 2007 7 comments

It’s been a while since I’ve done a Take5, and this seventh episode interviews Nir Zuk, Founder & CTO of upstart "next-generation firewall" company Palo Alto Networks.

There’s been quite a bit of hubbub lately about PAN and I thought I’d see what all the frothing was about.  I reached out to Nir and sent him a couple of questions via email which he was kind enough to answer.  PAN is sending me a box to play with so we’ll see how well it holds up on the Rack.  I’m interested in seeing how this approach addresses the current and the next generation network security concerns.

Despite my soapbox antics regarding technology in the security space, having spent the last two years at a network security startup put me at the cutting edge of some of the most unique security hardware and software in the business, and the PAN solution has some very interesting technology and some very interesting people at its core.

If you’ve used market-leading security kit in your day, you’ve probably appreciated some of Nir’s handiwork:

First a little background on the victim:


Nir Zuk brings a wealth of network security expertise and industry experience to Palo Alto Networks.

Prior to co-founding Palo Alto Networks, Nir was CTO at NetScreen Technologies, which was acquired by Juniper Networks in 2004.

Prior to NetScreen, Nir was co-founder and CTO at OneSecure, a pioneer in intrusion prevention and detection appliances.  Nir was also a principal engineer at Check Point Software Technologies and was one of the developers of stateful inspection technology.

Just to reiterate the Take5 ground-rules: I have zero interest in any of the companies who are represented by the folks I interview, except for curiosity.  I send the questions via email and what I get back, I post.  There are no clarifying attempts at messaging or do-overs.  It’s sort of like live radio, but without sound…

Questions:

1) Your background in the security space is well known and as we take a look out at the security industry and the breadth of technologies and products balanced against the needs of the enterprise and service providers, why did you choose to build another firewall product?

Don't we have a mature set of competitors in this space?  What need is Palo Alto Networks fulfilling?  Isn't this just UTM?

The reason I have decided to build a new firewall product is quite similar to the reasons Check Point (one of my previous employers) decided to build a new firewall product back in the early 90's when people were using packet filters embedded in routers - that reason being that existing firewalls are ineffective. Throughout the years, application developers have learnt how to bypass existing firewalls using various techniques such as port hopping, tunneling and encryption. Retrofitting existing firewalls, which use ports to classify traffic, turned out to be impossible, hence a new product had to be developed from the ground up.

2) As consolidation of security technologies into fewer boxes continues to heat up, vendors in the security space add more and more functionality to their appliances so as not to be replaced as the box-sprinkling madness continues.  Who do you see as a competitive threat and who do you see your box replacing/consolidating in the long term?

I think that a more important trend in network security today is the move from port-centric to application-centric classification technologies. This will make most of the existing products obsolete, similar to the way stateful inspection has made its predecessors disappear from the world... As for device consolidation, I think that existing firewall architectures are too old to support real consolidation, which today is limited to bolting multiple segregated products on the same device with minimal integration. A new architecture, which allows multiple network security technologies to share the same engines, has to emerge before real consolidation happens. The Palo Alto Networks PA-4000 series is, I believe, the first device to offer this kind of architecture.

3) The PA-4000 Series uses some really cutting-edge technologies. Can you tell us more about some of them and how the appliance is differentiated from multi-core x86-based COTS appliances?  Why did you go down the proprietary hardware route instead of just using standard Intel reference designs and focusing on software?

Intel CPUs are very good at crunching numbers, running Excel spreadsheets and playing high-end 3D games. They are not so good at handling packets. For example, the newest quad core Intel CPU can handle, maybe, 1,500,000 packets per second which amounts to about 1 Gbps with small packets. A single network processor, such as one of the many that we have in the PA-4000 series, can handle 10 times that - 15,000,000 packets per second. Vendors that claim 10 Gbps throughput with Intel CPUs do so with large packet sizes which do not represent the real world.
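As a quick back-of-the-envelope sanity check on those numbers (my arithmetic, not Nir's): a 64-byte minimum Ethernet frame occupies 84 bytes on the wire once you add the preamble and inter-frame gap, so roughly 1.5 million packets per second is indeed about what it takes to fill a 1 Gbps link with small packets.

```python
# Back-of-the-envelope: small-packet line rate on Gigabit Ethernet.
# 64-byte minimum frame + 8-byte preamble + 12-byte inter-frame gap = 84 bytes on the wire.
WIRE_BYTES_PER_FRAME = 64 + 8 + 12

pps = 1_500_000                                   # the figure quoted above
gbps = pps * WIRE_BYTES_PER_FRAME * 8 / 1e9
print(f"{pps:,} pps of 64-byte frames ~= {gbps:.2f} Gbps on the wire")

# And 10x that (the quoted network-processor figure) is roughly 10GbE small-packet line rate.
print(f"{pps * 10:,} pps ~= {gbps * 10:.1f} Gbps")
```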


4) Your technology focuses on providing extreme levels of application granularity to be able to identify and control the use of specific applications.

Application specificity is important as more and more applications use well-known ports (such as port 80), encryption or other methods to obfuscate themselves to bypass firewalls.  Is this going deep enough?  Don't you need to inspect and enact dispositions at the content level; after all, it's the information that's being transmitted that is important.

Inspection needs to happen at two levels. The first one is used to identify the application. This, usually, does not require going into the information that's being transmitted but rather merely looking at the enclosing protocol. Once the application is identified, it needs to be controlled and secured, both of which require much deeper inspection into the information itself. Note that simply blocking the application is not enough - applications need to be controlled - some are always allowed, some are always blocked but most require granular policy. The PA-4000 products perform both inspections, on two different purpose-built hardware engines.
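To illustrate the general two-pass idea in the abstract (purely a toy sketch of the concept, not a description of how the PA-4000 actually implements it; the application markers and policy below are invented):

```python
# Toy illustration of two-level inspection: (1) classify the flow to an
# application regardless of port, (2) apply a per-application policy that may
# require deeper content inspection. Markers and policy are hypothetical.

APP_MARKERS = {                      # crude payload markers, port-agnostic
    b"BitTorrent protocol": "bittorrent",
    b"SSH-2.0":             "ssh",
    b"GET /":               "http",
}

POLICY = {
    "bittorrent": "block",
    "ssh":        "allow",
    "http":       "inspect",         # allowed, but content gets scanned
    "unknown":    "block",
}


def classify(payload: bytes) -> str:
    """Pass 1: identify the application from the enclosing protocol."""
    for marker, app in APP_MARKERS.items():
        if marker in payload[:64]:
            return app
    return "unknown"


def disposition(payload: bytes) -> str:
    """Pass 2: apply per-application policy, inspecting content if required."""
    app = classify(payload)
    action = POLICY[app]
    if action == "inspect":
        # Placeholder for deeper content-level checks (threats, data leakage).
        if b"<script>" in payload:
            return f"{app}: blocked by content inspection"
        return f"{app}: allowed after content inspection"
    return f"{app}: {action}ed"


print(disposition(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n"))
print(disposition(b"\x13BitTorrent protocol..."))
```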

5)  You've architected the PA-4000 Series to depend upon signatures and you don't use behavioral analysis or behavioral anomaly detection in the decision fabric to determine how to enact a disposition.  Given the noise associated with poorly constructed expressions based upon signatures in products like IDS/IPS systems that don't use context as a decision point, are you losing anything by relying just on signatures?

The PA-4000 is not limited to signature-based classification of applications. It is using other techniques as well. As for false-positive issues, these are usually not associated with traffic classification but rather with attack detection. Generally, traffic classification is a very deterministic process that does not suffer from false positives. As for the IDS/IPS functionality in the PA-4000 product line, it provides full context for the IDS/IPS signatures for better accuracy. But the most important reason the PA-4000 products have better accuracy is that Palo Alto Networks is not a pure IPS vendor and therefore does not need to play the "who has more signatures" game, which leads to competing products having thousands of useless signatures that only create false positives.

BONUS QUESTION:

6)  The current version of the software really positions your solution as a client-facing, forward proxy that inspects outbound traffic from an end-user perspective.

Given this positioning, which one would imagine is deployed mostly at a "perimeter" choke point, can you elaborate on adding features like DLP or NAC?  Also, if you're at the "perimeter," what about reverse proxy functionality to inspect inbound traffic to servers on a DMZ?

The current shipping version of PAN-OS provides NAC-like functionality with seamless integration with Active Directory and domain controllers. DLP is not currently a function that our product provides even though the product architecture does not preclude it. We are evaluating adding reverse proxy functionality in one of our upcoming software releases.

Categories: Take5 Tags:

Answering A Very Difficult Value Question Regarding Information Security

November 24th, 2007 12 comments

Earlier this week I was in Nice, France speaking on the topic of the impact that the consumerization of IT has on security and vice versa.

We had a really diverse set of speakers and customers in attendance.

When you can pool the input and output from very large financial institutions to small law firms against the presentations from business innovation experts, security folk, workforce futurists, industry analysts and practitioners, you’re bound to have some really interesting conversation.

One of the attendees really capped off the first day’s discussion for me whilst at the bar by asking a seemingly innocuous (but completely flammable) question regarding the value that Information Security brings to the table against its ability to provide service and not stifle agility, innovation and general business practice.

This really smart person leads the innovation efforts at a very large financial institution in the UK and was quite frankly fed up with the "No Department" (InfoSec group) at his company.  He was rightfully sick of the strong-arming speedbumps that simply got in the way and cost money.

The overtly simplified question he posited was this:

Why can’t you InfoSec folks quite simply come to your constituent customers — the business — and tell them that your efforts will make me x% more or less profitable?

In his organization — which is really good at making decisions based upon risk — he maintained that every business decision had assessed against it an acceptable loss figure.  Sometimes those figures totaled in the billions.

He suggested then that things like firewalls, IPSs, AV, etc. had a near zero-sum impact when measured in cost against these acceptable losses.  Instead of the old axiom regarding not spending $100,000 to protect a $1,000 asset, he was actually arguing about not spending $100,000 to offset an acceptable loss of $1,000,000,000…

Interesting. 

I smiled as I tried to rationalize why I thought for the most part, nobody I knew could easily demonstrate the answer to his question.  Right, wrong or indifferent, I agreed that this was really a fundamentally crappy topic to bring up without something stronger than wine. 😉

It turned into quite an interesting conversation, during which I often found myself putting on various hats (architecture, security, operations, risk management) in an attempt to explain — but not justify — the status quo.

I offered what I thought were some interesting counter-questions but for the most part found it increasingly uncomfortable each time we ended up back at his initial question.  The more complex the answers became, the more they diverged from the concept he was focused on.

Imagine if you were the CSO and were being asked this question by your CIO/CFO/CEO as the basis for the on-going funding of your organization: "We can comfortably sustain losses in the hundreds of millions.  Why should I invest in security when you can’t demonstrate that you enable my business to achieve its business goals in a way which can make us more profitable or offset my acceptable losses?"
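For what it's worth, one common (and admittedly crude) way to begin framing a quantitative answer is the return-on-security-investment arithmetic: estimate the annualized loss expectancy with and without a control and compare the difference to what the control costs. The numbers below are purely illustrative:

```python
# Crude ROSI illustration with made-up numbers.
# ALE = single loss expectancy x annualized rate of occurrence.

sle = 2_000_000        # expected cost of one incident ($)
aro_before = 0.50      # incidents per year without the control
aro_after = 0.10       # incidents per year with the control
control_cost = 300_000 # annual cost of the control ($)

ale_before = sle * aro_before
ale_after = sle * aro_after
loss_reduction = ale_before - ale_after

rosi = (loss_reduction - control_cost) / control_cost
print(f"ALE before: ${ale_before:,.0f}  ALE after: ${ale_after:,.0f}")
print(f"Loss reduction: ${loss_reduction:,.0f} vs. control cost ${control_cost:,.0f}")
print(f"ROSI: {rosi:.0%}")   # positive means the control pays for itself on paper
```

The hard part, of course, is defending the SLE and ARO estimates; most shops can't, which is a big part of why the question stings.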

That inability to answer is why businesses exercise any option available to swerve around the speedbumps that IT/Security are perceived to be.

Categories: General Rants & Raves Tags:

Security and Disruptive Innovation Part III: Examples of Disruptive Innovation/Technology in the Security Space

November 24th, 2007 2 comments

Continuing on from my last post titled Security and Disruptive Innovation Part II: Examples of Disruptive Innovation/Technology in the Security Space, we’re going to finish up the tour of some security-specific examples, reflecting upon security practices, movements and methodologies and how disruptors, market pressures and technology are impacting what we do and how.

16.  Software as a Service (SaaS)
SaaS is a really interesting disruptive element to the traditional approach of deploying applications and services; so much so that in many cases, the business has the potential to realize an opportunity to sidestep IT and Security altogether by being able to spin up a new offering without involving either group.

There’s no complex infrastructure to buy and install, no obstruction to the business process.  Point, click, deploy.  The rationalizations of reduced time to market, competitive advantage and low costs are very, very sexy concepts.

On the one hand, we have the agility, flexibility and innovation that SaaS brings, but we also need to recognize how SaaS intersects with the lifecycle management of applications.  The natural commoditization of software functionality that is yielded as a by-product of the "webification" of many of the older applications makes SaaS even more attractive, as it offers a more cost-effective alternative.  Take WebEx, Microsoft Live, Salesforce.com and Google Apps as examples.

There are a number of other interesting collision spaces that impact information security.  Besides issues surrounding the application of general security controls in a hosted model, since the application and data are hosted offsite, understanding how and where data is stored, backed up, consumed, re-used and secured throughout is very important.  As security practitioners, we lose quite a bit of visibility from an operational perspective in the SaaS model.

Furthermore, one of the most important issues surrounding data security and SaaS is the issue of portability; can you take the data and transpose its use from one service to another?  Who owns it?  What format is it in?  If the investment wager in service from a SaaS company does not pay off, what happens to the information?

SaaS is one of the elements in combination with virtualization and utility/grid computing that will have a profound impact on the way in which we secure our assets.  See the section on next generation centers of data and information centricity below.

17.  Virtualization
Virtualization is a game-changing technology enabler that provides economic, operational and resilience benefits to the business.  The innovations delivered by this disruptor are plainly visible.

Server virtualization today allows us to realize the first of many foundational building blocks of future operating system architectures and next-generation computing platforms, such as the promises offered by grid and utility computing models.

While many of the technology advancements related to these "sidelined futures" have been in the works for many years, most have failed to gain mainstream adoption because, despite being technically feasible, they were not economically viable.  This is changing.  Grid and utility computing are starting to really take hold thanks to low-cost compute stacks, high-speed I/O, and distributed processing/virtualization capabilities.

Virtualization is not constrained to simply the physical consolidation of server iron; it extends to all elements of the computing experience: desktops, data, networks, applications, storage, provisioning, deployment and security.

It’s very clear that, like most emerging technologies, we are in the position of playing catch-up with securing the utility that virtualization delivers.  We’re seeing wholesale shifts in the operationalization of IT resources, and it will continue to radically impact the way in which we think about how to secure the assets most important to us.

In many cases, those who were primarily responsible for the visibility and security of information across well-defined boundaries of trust, classification, and distribution, will find themselves in need of new methods, tools and skillsets when virtualization is adopted in their enterprise.

To generally argue whether virtualization provides "more" or "less" security as compared to non-virtualized environments is an interesting debate, but one that offers little in the way of relevant assistance to those faced with securing virtualized environments today.

Any emerging technology yields new attack surfaces, exposes vulnerabilities and provides new opportunities related to managing risk when threats arise.  However, how "more" or "less" secure one is when implementing virtualization is just as subjective a measurement, dependent upon business impact, how one provisions, administers and deploys solutions, and how one ultimately applies security controls to the environment.

Realistically, if your security is not up to par in non-virtualized, physically-isolated infrastructure, you will be comforted by the lack of change when deploying virtualization; it will be equally as good…

There are numerous resources available now discussing the "security" things we should think about when deploying virtualization.  You can find many on my blog here.

18.  De-/Re-Perimeterization
This topic is near and dear to my heart and inspires some very passionate discussion when raised amongst our community. 

Some of the reasons for heated commentary come from the poor marketing of the underlying message as well as the name of the concept. 

Whether you call it de-perimeterization, re-perimeterization or radical externalization, this concept argues that the way in which security is practiced today is outdated and outmoded and requires a new model: one that banishes the notion that the inside and outside of our companies are in any way distinguishable today, and holds that our existing solutions are therefore ineffective at defending them.

De-/Re-perimeterization does not mean that you should scrap your security program or controls in favor of a new-fangled dogma and set of technology.  It doesn’t mean that one should throw away the firewalls so abundantly prevalent at the "perimeter" borders of the network.

It does, however, suggest you should redefine the notion of the perimeter.  The perimeter, despite its many holes, is like a colander — filtering out the big chunks at the edge.  However, the problem doesn’t lie with an arbitrary line in the sand; it permeates the computing paradigm and access modalities we’ve adopted to provide access to our most important assets.

Trying to draw a "perimeter" box around an amorphous and dynamic abstraction of our intellectual property in any form is a losing proposition.

However, the perimeter isn’t disappearing.  In fact, I maintain that it’s multiplying, but the diameter is collapsing.

Every element in the network is becoming its own "micro-perimeter" and we have to think about how we can manage and secure hundreds or thousands of these micro-perimeters by re-thinking how we focus on solving the problems we face today and what those problems actually are without being held hostage by vendors who constantly push the equivalent of vinyl siding when the foundations of our houses are silently rotting away in the name of "defense in depth."

"Defense in depth" has really become "defense in width."  As we deploy more and more security "solutions" all wishing to be in-line with one another and do not interoperate, intercommunicate or integrate, we’re not actually solving the problem, we’re treating the symptoms.

We really need endpoints that can survive on their own in assuredly hostile environments, using mutual authentication and encryption of data that can self-describe the security controls needed to protect it.  This is the notion of information survivability versus information security.

This is very much about driving progress through pressure on developers and vendors to produce more secure operating systems, applications and protocols.  It will require — in the long term — wholesale changes to our infrastructure and architecture.

The reality is that these changes are arriving in the form of things like virtualization, SaaS, and even the adoption of consumer technologies as they force us to examine what, how and why we do what we do.

Progress is being made and will require continued effort to realize the benefits that are to come.

19.  Information Centricity
Building off the themes of SaaS and the de-/re-perimeterization concepts, the notion of what and how we protect our information really comes to light in the topic of information centricity.

You may have heard the term "data-centric" security, but I despise this term because quite frankly, most individuals and companies are overloaded; we’re data rich and information poor.

What we need to do is allow ourselves not to be overwhelmed by the sheer mountains of "data" but rather determine what "information" matters to us most and organize our efforts around protecting it in context.

Today we have networks which cannot provide context and hosts that cannot be trusted to report their status, so it’s no wonder we’re in a heap of trouble.

We need to look at the tenets described in the de-/re-perimeterization topics above and recognize the wisdom of the notion that "…access to data should be controlled by the security attributes of the data itself." 

If we think of controlling the flow or "routing" of information by putting in place classification systems that work (content in context…), we have a fighting chance of ensuring that the right data gets to only the right people at the right time.

Without blurring the discussion with the taglines of ERM/DRM, controlling information flow and becoming information centric rather than host or network centric is critically important, especially when you consider the fact that your data is not where you think it is…
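A minimal sketch of what "access to data should be controlled by the security attributes of the data itself" might look like in practice (the labels, attributes and policy below are hypothetical): tag the information object with its classification and handling attributes, and make the access decision against those attributes and the requester's context rather than against where the data happens to sit on the network.

```python
from dataclasses import dataclass, field

CLEARANCE_ORDER = ["public", "internal", "confidential", "restricted"]


@dataclass
class InfoObject:
    """An information object that carries its own security attributes."""
    name: str
    classification: str                  # e.g. "confidential"
    allowed_purposes: set = field(default_factory=set)


@dataclass
class Requester:
    name: str
    clearance: str
    purpose: str


def access_allowed(obj: InfoObject, who: Requester) -> bool:
    """Decide based on the data's attributes and the requester's context,
    not on which network segment either of them lives in."""
    clearance_ok = (CLEARANCE_ORDER.index(who.clearance)
                    >= CLEARANCE_ORDER.index(obj.classification))
    purpose_ok = who.purpose in obj.allowed_purposes
    return clearance_ok and purpose_ok


doc = InfoObject("Q3 M&A model", "restricted", {"deal-team-review"})
print(access_allowed(doc, Requester("alice", "restricted", "deal-team-review")))  # True
print(access_allowed(doc, Requester("bob", "internal", "deal-team-review")))      # False
```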

20.  Next Generation Centers of Data
This concept is clear and concise.

Today the notion of a "data center" is a place where servers go to die.

A "center of data" on the other hand, is an abstraction that points to anywhere where data is created, processed, stored, secured and consumed.  That doesn’t mean a monolithic building with a keypad out front and a chunk of A/C and battery backup.

In short, thanks to innovation such as virtualization, grid/utility services, SaaS, de-/re-perimeterization and the consumerization of IT, can you honestly tell me that you know where your data is and why?  No.

The next generation centers of data really become the steam that feeds the "data pumps" powering information flow.  Even if the compute stacks become physically consolidated in one sense, the processing and information flow become more distributed.

Processing architectures and operational realities are starting to provide radically different approaches to the traditional data center.  Take Sun’s Project Blackbox or Google’s distributed processing clusters, for example.  Combined with grid/utility computing models, instead of fixed resource affinity, one looks at pooled sets of resources and distributed computing capacity which are not constrained by the physical brick and mortar wallspaces of today.

If applications, information, processes, storage, backup, and presentation are all distributed across these pools of resources, how can the security of today provide what we need to ensure even the very basic constructs of confidentiality, integrity and availability?

Next we will explore how to take these and future examples of emerging disruptive innovation and map them to a framework which will allow you to begin embracing them rather than reacting to them after the fact.

Categories: Disruptive Innovation Tags:

Travel: Off to Scotland, UK and France Until 11/21

November 16th, 2007 No comments

I’ll be off today to Scotland, the UK and ultimately France for almost a week. 

There’s a really interesting conference taking place at our center in St. Paul de Vence (France) regarding the Consumerization of IT. 

You’ll recall that this is one of the topics covered in my "Embracing Disruptive Innovation" deck.

In fact, that’s what I’m going to be speaking about; the ramifications and implications that the consumerization of IT is having on enterprise security.

Back the night before Turkey Day so that the missus isn’t too wound up! 😉

Have a great Thanksgiving, everyone.

/Hoff

Categories: Travel Tags:

BeanSec! Wednesday, November 21st – 6PM to ?

November 16th, 2007 1 comment

This month’s BeanSec! will be even more informal than usual given its proximity to Turkey Day.  We didn’t want to cancel or move it, so those of you who want to show up are welcome to do so.

It will likely be a light turn-out.

Please be aware that I will not be there and as such, food and drinks will not be paid for as they usually are.


Yo!  BeanSec! is once again upon us.  Wednesday, November 21st, 2007.

BeanSec! is an informal meetup of information security professionals, researchers and academics in the Greater Boston area that meets the third Wednesday of each month.

I say again, BeanSec! is hosted the third Wednesday of every month.  Add it to your calendar.

Come get your grub on.  Lots of good people show up.  Really.

Unlike other meetings, you will not be expected to pay dues, “join up”, present a zero-day exploit, or defend your dissertation to attend.  Map to the Enormous Room in Cambridge.

Enormous Room: 567 Mass Ave, Cambridge 02139.  Look for the Elephant on the left door next to the Central Kitchen entrance.  Come upstairs.  We sit on the left hand side…

Don’t worry about being "late" because most people just show up when they can.  6:30 is a good time to aim for.  We’ll try and save you a seat.  There is a parking garage across the street and one block down, or you can try the streets (or take the T).

In case you’re wondering, we’re getting about 30-40 people on average per BeanSec!  Weld, 0Day and I have been at this for just over a year and without actually *doing* anything, it’s turned out swell.

We’ve had some really interesting people of note attend lately (I’m not going to tell you who…you’ll just have to come and find out.)  At around 9:00pm or so, the DJ shows up…as do the rather nice looking people from the Cambridge area, so if that’s your scene, you can geek out first and then get your thang on.

The food selection is basically high-end finger-food appetizers and the drinks are really good; an attentive staff and eclectic clientèle make the joint fun for people watching.  I’ll generally annoy you into participating somehow, even if it’s just fetching napkins. 😉

See you there.

/Hoff

Categories: BeanSec! Tags:

Hypervisors Are Becoming a Commodity…Virtualization Is a Feature?

November 14th, 2007 No comments

A couple of weeks ago I penned a blog entry titled "The Battle for the HyperVisor Heats Up" in which I highlighted an announcement from Phoenix Technologies detailing their entry into the virtualization space with their BIOS-enabled VMM/Hypervisor offering called HyperCore.

This drew immediate parallels (no pun intended) to VMware and Xen’s plans to embed virtualization capabilities into hardware.

The marketing continues this week with interesting announcements from Microsoft, Oracle and VMware:

  1. VMware offers VMware Server 2 as a free virtualization product to do battle against…
  2. Oracle offers "Oracle VM" for free (with paid support if you like), which claims to be three times as efficient as VMware — based on Xen.
  3. Microsoft officially re-badged its server virtualization technology as Hyper-V (née Viridian), detailing both a stand-alone Hyper-V Server as well as technology integrated into W2K8 Server.
It seems that everyone and their mother is introducing a virtualization platform, and the commonality of basic functionality across them demonstrates how the underlying virtualization enabler — the VMM/Hypervisor — is becoming a commodity.

We are sure to see fatter, thinner, faster, "more secure" or more open Hypervisors, but this will be an area with less and less differentiation.  Table stakes.  Everything’s becoming virtualized, so a VMM/Hypervisor will be the underlying "OS" enabling that transformation.

To illustrate the commoditization trend as well as a rather fractured landscape of strategies, one need only look at the diversity in existing and emerging VMM/Hypervisor solutions.   Virtualization strategies are beginning to revolve around a set of distinct approaches where virtualization is:

  1. Provided for and/or enhanced in hardware (Intel, AMD, Phoenix)
  2. A function of the operating system (Linux, Unix, Microsoft)
  3. Delivered by means of an enabling software layer (nee platform) that is deployed across your entire infrastructure (VMware, Oracle)
  4. Integrated into the larger Data Center "Fabric" or Data Center OS (Cisco)
  5. Transformed into a Grid/Utility Computing model for service delivery

The challenge for a customer is making the decision on whom to invest in now.  Given the fact that there is not a widely-adopted common format for VM standardization, the choice today of a virtualization vendor (or vendors) could profoundly affect one’s business in the future, since we’re talking about a fundamental shift in how your "centers of data" manifest.

What is so very interesting is that if we accept virtualization as a feature, defined as an abstracted platform isolating software from hardware, then the next major shift is the extensibility, manageability and flexibility of the solution offering, as well as how partnerships shake out between the "platform" providers and the purveyors of toolsets.

It’s clear that VMware’s lead in the virtualization market is right in line with how I described the need for differentiation and extensibility both internally and via partnerships.

VMotion is a classic example; it’s clearly an internally-generated killer app that the other players do not currently have and really speaks to being able to integrate virtualization as a "feature" into the combined fabric of the data center.  Binding networking, storage and computing together is critical.  VMware has a slew of partnerships (and potential acquisitions) that enable even greater utility from their products.

Cisco has already invested in VMware and a recent demo I got of Cisco’s VFrame solution shows they are serious about being able to design, provision, deploy, secure and manage virtualized infrastructure up and down the stack, including servers, networking, storage, business process and logic.

In the next 12 months or so, you’ll be able to buy a Dell or HP server using Intel or AMD virtualization-enabled chipsets pre-loaded with multiple VMM/Hypervisors in either flash or BIOS.  How you manage, integrate and secure it with the rest of your infrastructure — well, that’s the fun part, isn’t it?

I’ll bet we’ll see more and more "free" commoditized virtualization platforms with the wallet ding coming from the support and licenses to enable third party feature integration and toolsets.

/Hoff

One Man’s Threats Are Another Man’s Opportunities (Embracing Disruptive Technology)

November 12th, 2007 2 comments

Last week, Jim Rapoza from the ZD Enterprise’s Emerging Technology blog wrote an article that caught my eye titled "Emerging Security Threats."

I popped on over to get what I suspected would be my weekly fill of Botnets gone wild and other malware-laden horror stories, only to be surprised to find that the top emerging security threats were actually many of the same strategic technologies that CIOs reported to Gartner as those "…with the potential for significant impact on the enterprise in the next three years."  Go figure.

Jim summarized the intent of his post thusly:

Emerging technologies can bring a whole host of benefits, often improving productivity, changing the way businesses interact and enhancing the lives of people all over the world.

And whenever a new technology comes out and gets a lot of hype, there is a lot of enthusiasm about the many benefits and new capabilities that this technology provides.

But, also without fail, there is one key thing that almost no one ever talks about. What is this hidden factor? It’s security.

Over the years I’ve gone to lots of conferences and seminars dedicated to emerging technologies, from Web 2.0 to virtualization to virtual worlds. And the one thing that pretty much never gets covered (or even mentioned) in these conferences is security.

Of course, this is understandable. New technologies are just introducing themselves to the world. It’s sort of like a first date. When you go on a first date, you probably don’t start out talking about all of your illnesses and insecurities. The same goes for emerging technologies. Their creators just want to promote their good points.

But for users of these technologies, ignoring the potential security threats that these emerging technologies introduce can lead to big problems, including data theft, system compromises and the spread of malware.

I think that Jim’s analogies are basically good ones; security has historically been an afterthought.  But in the context of my last couple of posts, attempting to draw attention to the disruptive effect these technologies have, and to their generally under-capitalized security investment, in the manner in which he does in effect sensationalizes an already flammable scenario.

The reality-based analog that is suitable for contrast here is the old cliche: "guns don’t kill people…people kill people."  As corny and over-played as that is, technology doesn’t cause threats to materialize magically; the poor implementation of the technology does.

Rather than work to rationally discuss security in context and consider these disruptive technological innovations as opportunities to leverage, they are ultimately painted here as evil.  This is exactly the sort of "security is a speed bump" persona we need to shed!

Check out the purported horror show of "emerging threats" below and compare them to Gartner’s Top 10 Strategic Technologies for 2008-2011 to the right.  These technologies possess factors that denote significant impact, including "a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt."

  1. Ajax
  2. Google Apps
  3. Mobile Devices & Applications
  4. RFID
  5. Rich Internet Applications
  6. RSS
  7. Social Networks
  8. Virtual Worlds
  9. Virtualization
  10. VoIP

How many items from either of the Top-Ten lists above are you dealing with today?

Check out the slideshow.  Lovely artwork, but abrasive and vague at best.  Rather than paint a balanced portrait of pros and cons as his introduction alludes to or suggest how these technologies can be deployed securely, we instead get soundbites like this:

VOIP – VOIP systems have greatly broadened the telecom options for businesses, not only freeing them from traditional phones but making it possible to easily tie voice into other enterprise applications. But VOIP systems can be easily tapped by anyone and have become an attractive target for hackers.

The reality is that any new technology has the potential to allow "bad stuff to happen."  I think we all know that already.  What would be really useful is a way of managing this process.  I think there’s a better way of communicating without relying on fear.

/Hoff

Categories: Disruptive Innovation Tags:

Security and Disruptive Innovation Part II: Examples of Disruptive Innovation/Technology in the Security Space

November 12th, 2007 3 comments

Continuing on from my last post titled Security and Disruptive Innovation Part I: The Setup, we’re going to take the general examples of innovative technological industry disruptors in slide 3 and highlight some security-specific examples to bring the point a little closer to home.

In this case, we’re going to reflect upon security practices, movements and methodologies and how disruptors, market pressures and technology are impacting what we do and how.  The point of this is to discuss a framework for how to embrace and manage the process of evaluating emerging technologies and disruption, and to manage it proactively.

13.  Examples of Disruptive Innovation in Security

As we demonstrated previously in slide 3, the impact that disruptors in the right-hand column caused against those who enjoyed market dominance in the left-hand column was profound.  In many cases, the incumbents never saw it coming.

Some of these shifts were incremental and some were radically game-changing.  Some took quite a while to catch on, while others benefited from the viral "sneezers" (as Seth Godin is fond of saying.)

Here we see a list  on the left featuring established thought leadership, generally observed practices and methodologies and what some might describe as the status quo within the security industry.   

The corresponding list on the right represents emerging disruptive innovation and technology.  Most of you should be familiar with these issues.  To some, they are merely background noise — glacially eroding the landscape while the day-to-day priorities are dispatched —  while to others they represent pressing business concerns and abrasive friction, threatening the manner in which security programs are executed and competing for attention at every turn.

Let’s take a look at each of these samples in more detail; the slides are just talking points, so I’ll add color in the accompanying text.  This will be split into a couple of posts.

14. The Outsourcing of Security

In my experience, outsourcing in general provokes a visceral response no matter which side of the fence one may choose to sit on.  Pro or con, outsourcing of services is a matter of course in today’s world.

Whether the motivation is taking cost out of the business, focusing on competencies, the transference of risk or improving operational efficiency, if you haven’t felt some impact from the outsourcing movement already, you surely will at some point shortly.

If one starts poking around the notion of outsourcing "security" functions to resources outside of an InfoSec shop’s internal corps, it’s often bound to generate sparks.

In general, my observations have been that InfoSec staffers become incredibly defensive about the feasibility and perception of security when discussing outsourcing elements of a security program.  Many of these arguments are instinctual and not business-driven but are autonomic and reflexive.  It’s really hard to let go of the fact that the value we purport to provide the business is, in many cases, becoming a feature set of a larger operational machine.

In many cases I have personally witnessed, the arguments against outsourcing security are supported with knee-jerk comments citing "possible exposure," "unacceptable risk," or "regulatory issues" but rarely have any hard data (read: quantifiable metrics) to back them up.  Neither hope nor FUD is a very good strategy.

The reality is that in many cases, mature operational functions represent excellent opportunities for outsourcing.  Many of these have capital and operating expenses that can be reduced or altogether eliminated and allow for the "security" team to focus on more important things.

Common examples of outsourced low-hanging fruit security functions today include:

  • Managed firewall
  • Managed Intrusion Detection/Prevention
  • Anti-Spam
  • Vulnerability Assessment/Management
  • Secure Messaging

Combined with operational models such as Software as a Service (SaaS) which we’re going to talk about shortly, we’re even seeing examples of outsourced application and code analysis, complete application outsourcing, etc.

Obviously this all comes down to the type of business you’re in and the risk associated with letting some other party operationalize elements of your business processes, but it’s happening in a big way and will continue to do so.

I’ve personally witnessed examples of Fortune 500 companies dissolving their entire operational administrative and security teams and selling their data center hard assets to a management services company.  That company then leases back the management of the IT and Security operations as a service, allowing the security team to act as architects and focus on more pressing, relevant business issues instead of firefighting.  They become much more strategic and integrated with the business.

The disruptive argument for outsourcing revolves around the issue of spending time and money paying legions of administrators and security folk to perform tasks which are oftentimes not critical and do not add business value, and that can be obtained elsewhere faster and cheaper at competent (or perhaps higher) levels of quality.

How would you take the cost savings/avoidance benefits of outsourcing and describe how you might invest it elsewhere in your security spend to demonstrate better alignment to the business?

15. The Consumerization of IT

A good number of security professionals are also masterful consumers and collectors of toys of one kind or another.  As aficionados of all things tech, you’ll often find even the most conservative security wonks lining up to buy the latest kit with the newest features on release day.

Rationalizing why we might need to upgrade to a phone with video playback, camera, massive storage, WiFi, web browsing and open APIs is easy: flexibility, agility, efficiency, connectivity…it lets one do what one wants/needs/likes to do faster, better, easier, and cheaper, right?  At least that’s what we tell our wives 😉

In what can only be described as a case of clinical schizophrenia, the same iPhone-toting CISO might also be the first to rail against the introduction of these new technologies within the enterprise despite the exact claims and justifications being made by the business.

New technology is often introduced into the organization and championed under the same banners of enhanced efficiency, agility or customer experience, and these initiatives are often critical elements that a business invests in so as to secure a competitive business advantage against the competition.

Strangely, the business value for the adoption of many of these consumer-based technologies entering the enterprise (even if it’s merely "good will") is often times ignored and cast aside in the name of "security" with the overriding inflexibility chalked up to "implied" risk, undisclosed (invisible?) vulnerabilities and simply bad "juju" — all grouped under the iron-clad containment of the almighty "security policy."

Now, there are also many very reasonable arguments as to why allowing employees to use consumer technologies within the enterprise is a difficult proposition: support, confidentiality, privacy, regulatory requirements.  These are valid issues to be dealt with, and the business’s awareness of the impact of its decision to allow this sort of technology to be used is really important.

There are two dirty little secrets that must be accounted for when discussing the consumerization of IT within the enterprise and your business constituents:

  1. It’s not Security’s place, birthright, charter or problem to be the judge, jury and executioner as to what is allowed or not allowed.  It *is* Security’s job to advise the business and allow them to make a (gasp!) business decision on the matter.
  2. They’re doing it anyway and will continue to do so. 

If a technology or innovation allows an employee who actually contributes to the bottom line to do his/her job better, more efficiently and at lower cost, and helps drive revenue that contributes to your budget (read: paycheck), why is this a bad thing!?

If you’re doing your job, the business will take your advice seriously and will make a decision based on fact.  They may decide that despite your advice, the technology or innovation is compelling enough to outweigh the potential risk.  Other times they might not.

Either way, you’ve done your job. 

Remember when WiFi first appeared?  Most enterprises and their IT and Security teams vehemently attempted to prevent its use by policy citing the lack of business need and security concerns.  There were certainly security issues that needed to be solved, but today WiFi has emerged as a disruptive technology that is indispensable as a tool.  If you have remote employees, you are first-row-center observers as to how WiFi as a disruptive innovation has changed the landscape.

Many companies have these enormous virtualized and distributed workforces.   To facilitate such a decentralized model, these companies are beginning to embrace a program that my company calls the "Digital Allowance." 

Digital Allowance provides an annual stipend to employees to allow them to go out and purchase technology that they will use to do their jobs.  They can use their home computers, their iPhones, etc. to do their jobs if it meets pertinent and reasonable requirements.

It is the job of the IT and Security teams to provide a safe and reasonably secure computing environment to allow employees to do their jobs without putting the company in harm’s way.

This sort of program is taking off as companies realize that consumer, pro-sumer and enterprise technologies are colliding at a velocity of change that makes it difficult to distinguish between them, and the business benefits outweigh the downside.  In fact, my company has a business consulting practice that teaches other companies how to put these programs in place.

Most security professionals curl up in a fetal position (as I first did, admittedly) when considering this sort of program.  How are you dealing with the consumerization of IT within your company?

Up Next: Part III – The Examples Continue…

Categories: Disruptive Innovation Tags:

Security and Disruptive Innovation Part I: The Setup

November 8th, 2007 14 comments

As a follow-on to my post on security and innovation here, I’m going to do a series based upon my keynote from ISD titled "Why Security Should Embrace Disruptive Technology" with a brief narrative of each slide’s talking points.

The setup for the talk was summarized nicely:

IT departments have spent the last 10+ years enabling users by delivering revolutionary technology and delegating ownership and control of intellectual property and information in order to promote agility, innovation and competitive advantage on behalf of the business. Meanwhile, IT Security has traditionally focused on reining in the limits of this technology in a belated, compliance-driven game of tug-of-war to apply control over the same sets of infrastructure, intellectual property and data that is utilized freely by the business.

Christofer Hoff, chief architect for Security Innovation at Unisys and former Security 7 winner, will highlight several areas of emerging and disruptive technologies and practices that should be embraced, addressed, and integrated into the security portfolios and strategic dashboards of all forward-looking, business-aligned risk managers. Many of these topics are contentious when discussing their impact on security:

  • Outsourcing of Security
  • Consumerization of IT
  • Software as a Service (SaaS)
  • Virtualization
  • De-perimeterization
  • Information Centricity
  • Next Generation Distributed Data Centers

Hoff will discuss what you ought to already have thought about and how to map these examples to predict what is coming next and explore this classical illustration of the cyclical patterns of how history, evolving business requirements, technology and culture repeatedly intersect on a never-ending continuum and how this convergence ought to be analyzed as part of the strategic security program of any company.

I will be highlighting each of the seven examples above as a series on how we should embrace disruptive innovation and integrate it into our strategic planning process so we can manage it as opposed to the other way around.  First the setup of the presentation:

1. What is Innovation?

Isd2007006
Innovation can simply be defined as people implementing new ideas to creatively solve problems and add value.

How you choose to define "value" really depends upon your goal and how you choose to measure the impact on the business you serve.

Within the context of this discussion, there is certainly technical innovation in the security field itself (how to make security "better," "faster," or "cheaper"), but rather than focus on the latest piece of kit, I’m interested in exploring how disruptive technologies and innovative drivers at the intersection of business, culture, and economics can profoundly impact how, what, why and when you do what you do.

We are going to discuss how Security can and should embrace disruptive technology and innovation in a formulaic and process-oriented way with the lovely side effect of becoming more innovative in the process.

2. What is Disruptive Technology/Innovation?

Isd2007008 Clayton Christensen coined this term and is known for his body of work in this realm.  He is perhaps best known for his books The Innovator’s Dilemma and The Innovator’s Solution.

Christensen defined disruptive technology/innovation as "a technology, product or service that ultimately overturns the dominant market leader, technology or product."

This sort of disruption can happen quickly or gradually, and it can be radical, evolutionary or revolutionary in execution.  In many cases, the technology itself is not the disruptive catalyst; rather, it is the strategy, business model or marketing/messaging that creates the disruptive impact.

3. Examples of Disruptive Technology

Isd2007009
Here are some examples from a general technology perspective that highlight disruptive technologies/innovation.

Mainframe computing was disrupted by minicomputers and ultimately client-server desktop computing.  Long distance telephony has been broadly impacted by Internet telephony such as Skype and Vonage.  Apple’s iTunes has dramatically impacted the way music is purchased and enjoyed.  The list goes on.

The key takeaway here is that the dominant technologies and industries on the left oftentimes didn’t see the forces on the right coming, and when they did, it was already too late.  What’s really important is that we find a framework and a process by which we can understand how disruptive technology/innovation emerges.  This will allow us to try to tame the impact and harness disruption positively by managing it and our response to it.

4. Technology Evolution: The Theory of Punctuated Equilibrium

Isd2007011
I’m a really visual person, so I like to model things with analogies that spark non-linear connections for me and reinforce a point.  When I was searching for an analogy that described the evolution of technology and innovation, it became clear to me that this process was not linear at all.

Bob Warfield over at the SmoothSpan blog gave me the idea for an evolutionary analogy called the Theory of Punctuated Equilibrium, which describes how the development and evolution of reproducing species actually happens in big bursts followed by periods of little change, rather than through constant, gradual transformation.

This matters because innovation happens in spurts and is then absorbed and assimilated; forecasting the timing of those bursts is what’s really important.

5.  Mobius Strips and the Cyclic Security Continuum (aka the Hamster Wheel of Pain)

Isd2007012 If we look at innovation within the Information Security space as an example, we see evidence of this punctuated equilibrium distributed across what appears to be an endless continuum.  Some might suggest that it’s like a never-ending Mobius strip.

Security innovation (mostly in technology) has manifested itself as a diverse set of solutions for a particular problem that ultimately settles down over time into solution conformity and functional democratization.  NAC and DLP are classic examples: lots of vendors spool up in a frenzy, and the field ultimately thins out once the problem becomes well defined.

Warfield described this as a classic damped oscillation where big swings in thinking ultimately settle down until everything looks and sounds the same…until the next "big thing" occurs.
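For those who like to see the analogy written down, a minimal sketch of the damped oscillation Warfield describes (purely illustrative; the symbols are my own mapping, not a quantitative market model) looks like:

$$d(t) = D_0\, e^{-\lambda t} \cos(\omega t + \phi)$$

where $d(t)$ stands in for solution diversity at time $t$, $D_0$ is the size of the initial burst of competing approaches, $\lambda$ is how quickly the market settles toward conformity, and $\omega$ captures the back-and-forth swings in thinking along the way.  Each "next big thing" effectively resets $D_0$ and starts the cycle again.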

What is problematic, however, is when we have overlays of timing curves of technology, economics, business requirements and culture.  Take, for example, the (cyclic) evolution of compute models: we started with the mainframe, which was displaced by minis, desktops and mobile endpoints.  This changed the models of computing and how data was produced, consumed, stored and managed.

Interestingly, as data has become more and more distributed, we’re now trending back toward centralizing the computing experience with big honking centralized, virtualized servers, storage and desktops.  The applications and protocols remain somewhere in between…

So while one set of oscillations is damping, another is peaking.  It’s no wonder we find it difficult to arrive at a static model in such a dynamic environment.

6. Using Projections/Studies/Surveys to Gain Clarified Guidance

Isd2007013
Trying to visualize this intersection of curves can be very taxing, so I like to use industry projections/surveys/studies to help clear the fog. Some folks love these things, others hate them.  We all use them for budget, however 😉

I like the thematic consistency of Gartner’s presentations, so I’m going to use several of their example snippets to highlight a more business-focused, logical presentation of how impending business requirements will drive innovation and disruptive technology right to your doorstep.

As security practitioners, we can use this information to stay ahead of the curve and avoid getting caught flat-footed when disruptive innovation shows up, because we’ll be prepared for it.

7. What CIO’s see as the Top 10 Strategic Technologies for 2008-2011

Isd2007014_2
Gartner defines  a strategic technology as  "…one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt."

Check out this list of the technology categories that your CIO has said will provide significant impact to their enterprise.  How many of them can you identify as being addressed, in alignment with the business, as part of your security strategy for the next three years?

Of the roughly 50 security professionals I’ve queried thus far, most can only honestly answer that they are doing their best to get in front of at most one or two of them…rot roh.

8. What those same CIO’s see as their Top 10 Priorities for 2007

Isd2007015 If we drill down a level and investigate what business-focused priorities CIO’s have for 2007, the lump in most security managers’ throats gets bigger.

Measured against these top ten business priorities, almost all of those same 50 CISO’s I polled had real difficulty demonstrating how their efforts were in alignment with them, except as a menial "insurance purchase" acting as a grudge-based cost of doing business.

It becomes readily apparent to most that being a cost of doing business does not cast one in a strategic light.  In fact, the bottom-line impact of security’s never-ending profit drain often puts it in direct competition with some of these initiatives.  Security contributing to revenue growth, customer retention and controlled operating costs?

Whoops…

9. And here’s how those CIO’s are investing their Technology Dollars in 2007…

Isd2007016
So now the story gets even more interesting.  If we take the Top 10 Strategic Technologies and hold that up against the Top 10 CIO Priorities, what we should see is a business-focused alignment of how one supports the other.

This is exactly what we get when we take a look at the investments in technology that CIO’s are making in 2007.

By the way, last year, "Security" was number one.  Now it’s number six.  I bet that next year, it may not even make the top ten.

This means that security is being classified as less and less strategically important and is increasingly seen as a feature included in these other purchase/cost centers.  That means that unless you start thinking differently about how and what you do, you run the risk of becoming obsolete from a stand-alone budget perspective.

That lump in your throat’s getting pretty big now, huh?

10.  How Do I Start to Think About What/How My Security Investment Maps to the Business?  Cajun Food, Of Course!

Isd2007017 This is my patented demonstration of how I classify my security investments into a taxonomy that is based upon Cajun food recipes.

It’s referred to as "Hoff’s Jumbalaya Model" by those who have been victimized by its demonstration.  Mock it if you must, but it recently helped secure $21MM in late-stage VC funding…

Almost all savory Cajun dishes are made up of three classes of ingredients, which I call Foundational, Commodities and Distinguished.

Foundational ingredients are mature, high-quality and time-tested items that are used as the base for a dish.  You can’t make a recipe without using them and your choice of ingredients, preparation and cooking precision matter very much. 

Commodity ingredients are needed because without them, a dish would be bland.  However, the source of these ingredients is less of a concern given the diversity of choice and availability.  Furthermore, salt is salt — sure, you could use Fleur de Sel or Morton’s Kosher, but there’s not a lot of difference here.  One supplier could vanish and you’d have an alternative without much thought.

Distinguished ingredients are really what set a dish off.  If you’ve got a fantastic foundation combined with the requisite seasoning of commodity spices, adding a specific distinguished ingredient to the mix will complete the effort.  Andouille sausage, Crawfish, Alligator, Tasso or (if you’re from the South) Squirrel are excellent examples.  Some of these ingredients are hard to find and for certain dishes, very specific ingredients are needed for that big bang.

Bear with me now…

11. So What the Hell Does Jambalaya Have to Do with Security Technology?

Isd2007018 Our recipes for deploying security technology are just like making a pot of Jambalaya, of course! 

Today when we think about how we organize our spending and our deployment methodologies for security solutions, we’re actually following a recipe…even if it’s not conscious.

I’m going to use two large markets in intersection to demonstrate this.  Let’s overlay the service provider/mobile operator/telco market and its security needs with those of the common commercial enterprise.

As with the Cajun recipe example, the go-to foundational ingredients that we based our efforts around are the mature, end-to-end, time-tested firewall and intrusion detection/prevention suites.  These ingredients have benefited from decades of evolution and are stable, mature and well-understood.  Quality is important as is the source.

Short of scaling requirements, the SP/MSSP/MO/Telco and Enterprise markets both utilize common approaches and product choices to satisfy their requirements.

Both markets also have overlapping sets of requirements and solution choices for the commoditizing ingredients.  Here, scale and performance aside, there’s little difference in the AV, anti-spam or URL filtering functionality offered by the many vendors in the pool who supply these functions.  Vendor A could go out of business tomorrow and, for the most part, Vendor B’s product could be substituted with the same functionality without much fuss.

Now, when we look at distinguished "ingredients," we witness a bit of a divergence.  The SP/MSSP/MO/Telco space has very specific requirements for solutions that are unique beyond just scale and performance; Session Border Controllers and DDoS mitigation tools are examples.  In the enterprise, XML gateways and web application firewalls are key.  The point here is that these solutions are quite unique and are often the source of innovation and disruption.

Properly classifying your solutions into these categories allows one to demonstrate an investment strategy in line with the value each class brings.  Some of these solutions start off as distinguished and can either become commoditized quickly or ultimately make their way, as features, into the more stable and mature foundational ingredient class.
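As a rough sketch of how this classification might be captured for a portfolio review (the entries and assignments below are illustrative assumptions drawn from the examples above, not a prescribed portfolio), something like the following works:

```python
# Illustrative sketch only: classifying a security portfolio into the
# Foundational / Commodities / Distinguished "ingredient" classes.
# The solution names and their assignments are hypothetical examples.

PORTFOLIO = {
    "Foundational": {"firewall", "intrusion detection/prevention"},
    "Commodities": {"anti-virus", "anti-spam", "URL filtering"},
    "Distinguished": {
        # Distinguished ingredients tend to diverge by market
        "SP/MSSP/MO/Telco": {"session border controller", "DDoS mitigation"},
        "Enterprise": {"XML gateway", "web application firewall"},
    },
}

def classify(solution: str) -> str:
    """Return the investment class for a solution, or flag it for review."""
    for category, entries in PORTFOLIO.items():
        if isinstance(entries, dict):  # Distinguished: check each market segment
            if any(solution in market for market in entries.values()):
                return category
        elif solution in entries:
            return category
    return "Unclassified (candidate for next review)"

print(classify("web application firewall"))  # -> Distinguished
print(classify("URL filtering"))             # -> Commodities
```

The point isn’t the code; it’s that making the classification explicit forces a conversation about which class each line item in the budget actually belongs to.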

Keep this model handy…

12.  Mapping the Solution Classes (Ingredients) to a Technology/Innovation Curve: The Hype Cycle!

Isd2007019
So, remember the Theory of Punctuated Equilibrium and its damped oscillation visual?  Check out Gartner’s Hype Cycle…it’s basically the same waveform.

I use the Hype Cycle slightly differently than Gartner does.  The G-Men use it to demonstrate how technology can appear and transform in terms of visibility and maturity over time.  Technologies can appear almost anywhere along this curve; some are born commoditized and/or never make it, while some take a long time to be recognized as mature enough for adoption.

Ideally, you’d like to see a new set of innovative or disruptive solutions/technologies appear on the left, get uptake, mellow out over time and ultimately transform from diversity to conformity.  You can use the cute little names for the blips and bunkers if you like, but keep this motion across the curve top of mind.

Now, I map the classifications of Foundational, Commodities and Distinguished onto this curve and, lo and behold, most of the examples I gave (and those you can come up with) can be classified and qualified along it.  This allows a security manager/CISO to take technology Hype Cycle overlays and map them to an easily demonstrated/visualized class of solutions and investment strategies that can also speak to their lifecycle.

The things you really need to keep an eye on from an emerging innovation/disruption perspective are those distinguished solutions over on the left, climbing the "Technology Trigger" and aiming for the "Peak of Inflated Expectations" prior to sliding down to the "Trough of Disillusionment."  I think Gartner missed a perfect opportunity by not including the "Chasm of Eternal Despair" 😉

We’re going to talk more about this later, but you can essentially take your portfolio of technology solutions, map it against the business drivers/technologies prioritized by your CIO, and see how you measure up.  When you need to talk budget, you can easily demonstrate how you’re keeping pace with the dynamics of the industry and managing innovation, and how that translates to your spend and depreciation cycles.

You shore up your investment in Foundational components, manage the Commodities over time (they should get cheaper) and, as the business sees fit, put money into incubating emerging technologies and innovation.
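To make that posture concrete, here is a minimal, hedged sketch pairing each class with a default investment stance and a rough Hype Cycle position (the phase placements and postures are my assumptions for illustration, not Gartner guidance):

```python
# Illustrative sketch only: pairing each solution class with a rough
# Hype Cycle position and a default investment posture.  The placements
# and postures below are assumptions, not Gartner guidance.

INVESTMENT_POSTURE = {
    "Foundational":  ("Plateau of Productivity",
                      "shore up and maintain; depreciate predictably"),
    "Commodities":   ("Slope of Enlightenment",
                      "manage cost down over time; treat vendors as interchangeable"),
    "Distinguished": ("Technology Trigger / Peak of Inflated Expectations",
                      "incubate selectively where the business case supports it"),
}

for solution_class, (phase, posture) in INVESTMENT_POSTURE.items():
    print(f"{solution_class:13s} | {phase:45s} | {posture}")
```

Even a table this simple gives you something to hold up next to the CIO’s priority list when the budget conversation starts.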

Up Next…Some Really Interesting Examples of Disruptive Technology/Innovation and how they impact Security…
