
And Now Some Useful 2008 Information Survivability Predictions…

So, after the obligatory dispatch of gloom and doom as described in my
2008 (in)Security Predictions, I’m actually going to highlight some of
the more useful things in the realm of Information Security that I
think are emerging as we round the corner toward next year.

They’re not really so much predictions as rather some things to watch.

Unlike folks who can only seem to talk about desperation, futility
and manifest destiny or (worse yet) "anti-pundit pundits" who try to
suggest that predictions and forecasting are useless (usually because
they suck at it), I gladly offer a practical roundup of impending
development, innovation and some incremental evolution for your
consideration.
You know, good news.

As Mogull mentioned,
I don’t require a Cray X-MP/48, chicken bones & voodoo or a
prehensile tail to make my picks.  Rather, I grab a nice cold glass of
Vitamin G (Guinness), sit down, and think for a minute or two,
dwelling on my super l33t powers of common sense and pragmatism with just a
pinch of futurist wit.

Many of these items have been underway for some time, but 2008 will
be a banner year for these topics as well as the previously-described
"opportunities for improvement…"

That said, let’s roll with some of the goodness we can look forward to in the coming year.  This is not an exhaustive list by any means, but some examples I thought were important and interesting:

  1. More robust virtualization security toolsets with more native hypervisor/vmm accessibility
    While it didn’t start with the notion of security baked in,
    virtualization, for all of its rush-to-production bravado, will actually
    yield some interesting security solutions that help tackle some very
    serious challenges.  As the hypervisors become thinner, we’re going to
    see the management and security toolsets gain increased access to the
    guts of the sausage machine in order to effect security appropriately,
    and this will be the year we see the virtual switch open up to third
    parties and more robust APIs for security visibility and disposition
    appear.
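
    To make that concrete, here’s a minimal Python sketch of the kind of
    third-party hook I mean.  Every name in it is invented; no shipping
    hypervisor exposes this exact API.  The point is the shape of the thing:
    a security module registering with the virtual switch for visibility
    into inter-VM traffic and a say in its disposition.

        # Hypothetical sketch only: a vswitch opening up to third-party
        # security modules for visibility and disposition of inter-VM frames.
        from dataclasses import dataclass
        from enum import Enum
        from typing import Callable, List

        class Verdict(Enum):
            ALLOW = "allow"
            DROP = "drop"
            QUARANTINE = "quarantine"

        @dataclass
        class VmFrame:
            src_vm: str     # guest that emitted the frame
            dst_vm: str     # intended recipient guest
            payload: bytes  # raw frame contents

        class VirtualSwitch:
            """Toy vswitch that lets registered modules inspect inter-VM frames."""
            def __init__(self) -> None:
                self._inspectors: List[Callable[[VmFrame], Verdict]] = []

            def register_inspector(self, fn: Callable[[VmFrame], Verdict]) -> None:
                self._inspectors.append(fn)

            def forward(self, frame: VmFrame) -> Verdict:
                # First inspector to object wins; otherwise the frame passes.
                for inspect in self._inspectors:
                    verdict = inspect(frame)
                    if verdict is not Verdict.ALLOW:
                        return verdict
                return Verdict.ALLOW

        def drop_telnet(frame: VmFrame) -> Verdict:
            # Trivial example policy: refuse anything that smells like telnet.
            return Verdict.DROP if frame.payload.startswith(b"\xff\xfb") else Verdict.ALLOW

        vswitch = VirtualSwitch()
        vswitch.register_inspector(drop_telnet)
        print(vswitch.forward(VmFrame("web01", "db01", b"GET / HTTP/1.1")))  # Verdict.ALLOW
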
  2. The focus on information-centric security survivability graduates from v1.0 to v1.1
    Trying to secure the network and the endpoint is like herding cats, and
    folks are tired of dumping precious effort on deploying kitty litter
    around the Enterprise to soak up the stinky spots.  Rather, we’re going
    to see folks really start to pay attention to information
    classification, extensible and portable policy definition, and
    cradle-to-grave lifecycle management, and invest in technology to help
    get them there.

    The current maturity of features/functions such as NAC and DLP has
    actually helped us get closer to managing our information and
    information-related risks.  The next generation of these offerings, in
    combination with many of the other elements I describe herein and their
    consolidation into the larger landscape of management suites, will
    actually start to deliver on the promise of focusing on what matters:
    the information.
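
    If you want a feel for what "the policy travels with the information"
    might look like, here’s a deliberately naive Python sketch (all names
    and fields invented): classification, handling policy and a
    cradle-to-grave audit trail ride along with the document instead of
    living in some network middlebox.

        # Naive illustration: classification and handling policy attached to
        # the data itself, with a cradle-to-grave audit trail.
        from dataclasses import dataclass, field
        from typing import List, Set

        @dataclass
        class Policy:
            may_leave_enterprise: bool
            allowed_channels: Set[str]  # e.g. {"encrypted-email", "vpn"}

        @dataclass
        class Document:
            name: str
            classification: str         # "public" | "internal" | "secret"
            policy: Policy
            history: List[str] = field(default_factory=list)

        def release(doc: Document, channel: str) -> bool:
            """Decide based on the data's own policy, not on where it sits."""
            ok = (doc.policy.may_leave_enterprise
                  and channel in doc.policy.allowed_channels)
            doc.history.append(f"release via {channel}: {'allowed' if ok else 'denied'}")
            return ok

        memo = Document("q3-roadmap.doc", "internal",
                        Policy(may_leave_enterprise=False, allowed_channels=set()))
        print(release(memo, "webmail"))  # False, and the denial lands in the audit trail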

  3. Robust role-based policy, identity and access management coupled with entitlement, geo-location and federation… oh, and infrastructure, too!
    We’re getting closer to being able to apply policy based not just upon
    source/destination IP address, switch and router topology and the odd
    entry in Active Directory on a per-application basis, but rather
    holistically, based upon robust, lifecycle-focused, role-based policy
    engines that allow us to tie in all of the major enterprise components
    that sit along the information supply chain.

    Who, what, where, when, how and ultimately why will be the decision
    points considered with the next generation of solutions in this space.
    Combine the advancements here with item #2 above, and someone might
    actually start smiling.

    If you need any evidence of the convergence/collision of the application-oriented with the network-oriented approach and a healthy overlay of user entitlement provisioning, just look at the about-face Cisco just made regarding TrustSec.  Of course, we all know that it’s not a *real* security concern/market until Cisco announces they’ve created the solution for it 😉
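
    For flavor, here’s a toy Python decision function built around those
    who/what/where/when/how inputs.  The rules and names are made up, and
    real entitlement engines are vastly richer, but the shape of the
    decision is the point.

        # Toy "who, what, where, when, how" policy decision; illustrative only.
        from datetime import time

        # (role, resource, allowed locations, allowed hours, allowed channels)
        RULES = [
            ("clinician", "patient-record", {"hospital-lan"},
             (time(6, 0), time(22, 0)), {"managed-workstation"}),
            ("auditor", "patient-record", {"hospital-lan", "vpn"},
             (time(9, 0), time(17, 0)), {"managed-workstation"}),
        ]

        def decide(role, what, where, when, how):
            for r_role, r_what, r_where, (start, end), r_how in RULES:
                if (role == r_role and what == r_what and where in r_where
                        and start <= when <= end and how in r_how):
                    return True
            return False

        # On the ward at 10:00 from a managed machine: allowed.
        print(decide("clinician", "patient-record", "hospital-lan",
                     time(10, 0), "managed-workstation"))  # True
        # Same role from home wifi at 02:00: denied.
        print(decide("clinician", "patient-record", "home-wifi",
                     time(2, 0), "managed-workstation"))   # False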

  4. Next Generation Networks gain visibility as they redefine the compute model of today
    Just as there exists a Moore’s curve for computing, there exists an
    overlapping version for networking; it just moves more slowly given the
    footprint.  We’re seeing the slope of this curve starting to trend up
    this coming year, and it’s much more than bigger pipes, although that
    doesn’t hurt either…

    These next-generation networks will really start to emerge visibly in
    the next year as existing networking models stretch the capabilities and
    capacities of current architectures, and as new paradigms drive
    requirements that dictate a much more modular, scalable, resilient,
    high-performance, secure and open transport upon which to build
    distributed service layers.

    The way networks and service layers are designed, composed, provisioned,
    deployed and managed, and how that intersects with virtualization and
    grid/utility computing, will really start to drive home the message that
    "in the cloud" computing has arrived.  Expect service providers and very
    large enterprises to adopt these new computing climates first, with a
    trickle-down to smaller business via SaaS and hosted service operators
    to follow.

    BT’s 21CN
    (21st Century Network) is a fantastic example of what we can expect
    from NGN as the demand for higher speed, more secure, more resilient and more extensible interconnectivity really
    takes off.

  5. Grid and distributed utility computing models will start to creep into security
    A really interesting by-product of the "cloud compute" model is that as
    data, storage, networking, processing, etc. get distributed, so shall
    security.  In the grid model, one doesn’t care where the actions take
    place so long as service levels are met and the experiential and
    business requirements are delivered.  Security should be thought of in
    exactly the same way.

    The notion that you can point to a physical box and say it performs
    function ‘X’ is so last Tuesday.  Virtualization already tells us this.
    So, imagine if your security processing isn’t performed by a monolithic
    appliance but is instead handled in a self-organizing fashion wherein
    the entire ecosystem (network, hosts, platforms, etc.) contributes to
    the identification of threats and vulnerabilities as well as works to
    contain, quarantine and remediate policy exceptions.

    Sort of sounds like that "self-defending network" spiel, but not focused
    on the network, and with common telemetry and distributed processing of
    the problem.

    Check out Red Lambda’s cGrid technology for an interesting view of this model.
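
    For a back-of-the-napkin feel for the self-organizing idea, here’s a
    Python sketch (sensor names, events and the three-signal threshold are
    all invented): no single sensor is authoritative, but corroboration
    across the ecosystem surfaces what each misses alone.

        # Illustrative only: correlate telemetry contributed by many parts of
        # the ecosystem instead of trusting one monolithic appliance.
        from collections import defaultdict

        observations = [
            ("host-17",   "outbound-beacon",  "10.1.4.22"),
            ("switch-3",  "port-scan",        "10.1.4.22"),
            ("dns-cache", "fast-flux-lookup", "10.1.4.22"),
            ("host-09",   "outbound-beacon",  "10.1.7.13"),
        ]

        events_by_subject = defaultdict(set)
        for sensor, event, subject in observations:
            events_by_subject[subject].add(event)

        # Quarantine anything that three independent signal types agree on.
        for subject, events in events_by_subject.items():
            if len(events) >= 3:
                print(f"quarantine {subject}: corroborated by {sorted(events)}")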

  6. Precision versus accuracy will start to legitimize prevention as the technology gives us the confidence to turn the corner beyond detection

    In a sad commentary on the last few years of the security technology
    grind, we’ve seen the prognostication that intrusion detection is dead
    and the deadpan urging of the security vendor cesspool that we must
    deploy intrusion prevention in its stead.

    Since there really aren’t many pure-play intrusion detection systems
    left anyway, the reality is that most folks who have purchased IPSs
    seldom put them in in-line mode, and when they do, they seldom turn on
    the "prevention" policies; instead they just have them detect attacks,
    blink a bit and get on with it.

    Why?  Mostly because while the threats have evolved, the technology
    implemented to mitigate them hasn’t: we’re either stuck with giant
    port/protocol colanders or signature-driven IPSs that are nothing more
    than IDSs with the ability to send RST packets.

    So the "new" generation of technology has arrived and may offer some
    hope of bridging that gap, thanks not only to really good COTS hardware
    but also to really good network processors and better software written
    (or re-written) to take advantage of both.  Performance, efficacy and
    efficiency have begun to give us greater visibility as we get away from
    making decisions based on ports/protocols (feel free to debate proxies
    vs. ACLs vs. stateful inspection…) and move toward identifying
    application usage, getting us close to being able to make "real time"
    decisions on content in context by examining the payload and data.  See
    #2 above.

    The precision versus accuracy discussion is focused on being able to
    really start trusting the ability of prevention technology to detect,
    defend and deter against "bad things" with a fidelity and resolution
    that yields very low false positive rates (the quick arithmetic sketch
    at the end of this item shows why that distinction matters).

    We’re getting closer with the arrival of technology such as Palo Alto
    Networks’ solutions; you can call them whatever you like, but enforcing
    both detection and prevention using easy-to-define policies based on
    application (and telling the difference between any number of apps all
    using port 80/443) is a step in the right direction.
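
    Here’s that quick arithmetic sketch, in Python with made-up
    confusion-matrix counts: with rare attacks, a detector can score
    wonderfully on accuracy and still be unusable in-line, because its false
    positives swamp the true alerts.

        # Made-up counts: 100 real attacks hidden in 100,000 flows.
        tp, fp, fn, tn = 90, 910, 10, 98_990

        accuracy = (tp + tn) / (tp + fp + fn + tn)
        precision = tp / (tp + fp)
        fpr = fp / (fp + tn)

        print(f"accuracy  = {accuracy:.4f}")   # ~0.99: looks great on paper
        print(f"precision = {precision:.2f}")  # 0.09: ~10 of 11 blocks hit legit traffic
        print(f"fpr       = {fpr:.4f}")        # ~0.009: why prevention mode stays off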

  7. The consumerization of IT will cause security and IT as we know it to die, er, radically change
    I know it’s heretical, but 2008 is going to really push the existing IT
    and security architectures to their breaking points, which is going to
    mean that instead of saying "no," we’re going to have to focus on how to
    say "yes, but with this incremental risk" and find solutions for an ever
    more mobile and consumerist enterprise.

    We’ve talked about this before, and most security folks curl up into a
    fetal position when you start mentioning the adoption by the enterprise
    of social networking, powerful smartphones, collaboration tools, etc.
    The fact is that the favorable economics, agility, flexibility and
    efficiencies gained with the consumerization of IT outweigh the
    downsides in the long run.  Let’s not forget the new generation of
    workers entering the workforce.

    So, since information is going to be leaking from our Enterprises like a
    sieve on all manner of devices and by all manner of methods, it’s going
    to force our hand and cause us to focus on being information-centric:
    stop worrying about the "perimeter problem," stop focusing on the
    network and the host, and start managing the truly important assets
    while allowing our employees to do their jobs in the most effective,
    collaborative and efficient ways possible.

    This disruption will be a good thing, I promise.  If you don’t believe
    me, ask BP, one of the largest enterprises on the planet.  Since 2006
    they’ve put some amazing initiatives into play, like this little gem:

    Oil giant BP is pioneering a "digital consumer" initiative
    that will give some employees an allowance to buy their own IT
    equipment and take care of their own support needs.

    The project, which is still at the pilot stage, gives select BP staff an
    annual allowance, believed to be around $1,000, to buy their own
    computing equipment and use their own expertise and the manufacturer’s
    warranty and support instead of using BP’s IT support team.

    Access to the scheme is tightly controlled and those employees taking
    part must demonstrate a certain level of IT proficiency through a
    computer driving licence-style certification, as well as signing a
    diligent use policy.

    …combined with this:

    Rather than rely on a strong network perimeter to secure its systems, BP
    has decided that these laptops have to be capable of coping with the
    worst that malicious hackers can throw at them, without relying on a
    network perimeter for protection.

    Ken Douglas, technology director of BP, told the UK
    Technology Innovation & Growth Forum in London on Monday that
    18,000 of BP’s 85,000 laptops now connect straight to the internet even
    when they’re in the office.

  8. Desktop Operating Systems become even more resilient
    The first steps taken by Microsoft and Apple in Vista and OS X (Leopard),
    as examples, have begun to chip away at some of the security holes that
    have plagued them due to the architectural "feature" that an open
    execution runtime model delivers.  Honestly, nothing short of a do-over
    will ultimately mitigate this problem, so instead of suggesting that
    incremental improvement is worthless, we should recognize that our dark
    overlords are trying to make things better.

    Elements in Vista such as ASLR, NX, and UAC, combined with integrated
    firewalling, anti-spyware/anti-phishing, disk encryption, integrated
    rights management, protected-mode IE, etc., are all good steps in a
    "more right" direction than previous offerings.  They’re in response to
    lessons learned.

    On the Mac, we also see ASLR, sandboxing, input management, better
    firewalling and better disk encryption, all notable improvements.  Yes,
    we’ve got a long way to go, but this means that OS vendors are paying
    more attention, which will lead to more stable and secure platforms upon
    which developers can write more secure code.

    It will be interesting to see how these "more secure" OSes intersect
    with the virtualization security discussed in #1 above.

    Vista SP1 is due to ship in 2008 and will include APIs through which
    third-party security products can work with kernel patch protection on
    Vista x64, more secure BitLocker drive encryption and a better Elliptic
    Curve Cryptography PRNG (pseudo-random number generator).  Follow-on
    releases to Leopard will likely feature security enhancements beyond
    those delivered this year.
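
    If you’ve never actually watched ASLR do its thing, here’s a quick
    Linux-only Python illustration (the concept in general, not Vista’s or
    Leopard’s specific implementation): the base address of the process’s
    first memory mapping shifts on every run when randomization is enabled.

        # Linux-only ASLR illustration: run the same probe a few times and
        # watch the first mapping's base address change between runs
        # (assumes a PIE-built interpreter with randomization enabled).
        import subprocess
        import sys

        probe = "print(open('/proc/self/maps').readline().split('-')[0])"
        for _ in range(3):
            subprocess.run([sys.executable, "-c", probe])
        # Three different addresses => address space layout randomization is
        # active; on Linux, kernel.randomize_va_space controls this.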

  9. Compliance stops being a dirty word & risk management moves beyond a buzzword
    Today we typically see the role of information security described as
    blocking and tackling: focused on managing threats and vulnerabilities,
    balanced against the need to be "compliant" with some arbitrary set of
    internal and external policies.  In many people’s assessment, then,
    compliance equals security.  This is an inaccurate and unfortunate
    misunderstanding.

    In 2008, we’ll see many of the functions of security (administrative,
    policy and operational) become much more visible and transparent to the
    business, and we’ll see a renewed effort placed on compliance within the
    scope of managing risk, because compliance is actually a by-product of a
    well-executed risk management strategy.

    We have compliance as an industry today because we manage technology threats and vulnerabilities and don’t manage risk.  Compliance is actually nothing more than a way of forcing transparency and plugging a gap between the two.  For most, it’s the best they’ve got.

    What has traditionally prevented the transition from threat/vulnerability
    management to risk management is the principal focus on technology,
    combined with the lack of a good risk assessment framework and thus a
    lack of understanding of business impact.

    The availability of mature risk assessment frameworks (OCTAVE, FAIR,
    etc.), combined with the maturity of IT and governance frameworks (COBIT,
    ITIL) and the readiness of the business and IT/security cultures to
    accept risk management as a language and action set with which they need
    to be conversant, will yield huge benefits this year.

    Couple that with solutions like Skybox and you’ve got the makings of a
    risk management strategy that can bring security into closer alignment
    with the business.
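
    To make "risk management as a language" a little more tangible, here’s a
    toy Monte Carlo in Python in the spirit of FAIR: loss event frequency
    times loss magnitude, carried as distributions rather than point
    guesses.  Every number is invented; this is not the FAIR methodology
    itself, just its central idea.

        # Toy, FAIR-flavored annualized loss estimate.  All numbers invented.
        import random

        random.seed(7)

        def simulate_year():
            incidents = random.randint(0, 4)  # loss event frequency: 0-4/year
            # Loss magnitude per incident: lognormal, median around $49k.
            return sum(random.lognormvariate(10.8, 0.6) for _ in range(incidents))

        losses = sorted(simulate_year() for _ in range(10_000))
        mean = sum(losses) / len(losses)
        p95 = losses[int(0.95 * len(losses))]
        print(f"expected annual loss ~ ${mean:,.0f}; 95th percentile ~ ${p95:,.0f}")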

  10. Rich Mogull will, indeed, move in with his mom and start speaking Klingon
    ’nuff said.

So, there we have it.  A little bit of sunshine in your otherwise gloomy day.

