
Archive for the ‘Information Survivability’ Category

Incomplete Thought: Storage In the Cloud: Winds From the ATMOS(fear)

May 18th, 2009 1 comment

I never metadata I didn’t like…

I first heard about EMC’s ATMOS Cloud-optimized storage “product” months ago:

EMC Atmos is a multi-petabyte offering for information storage and distribution. If you are looking to build cloud storage, Atmos is the ideal offering, combining massive scalability with automated data placement to help you efficiently deliver content and information services anywhere in the world.

I had lunch with Dave Graham (@davegraham) from EMC a ways back and while he was tight-lipped, we discussed ATMOS in lofty, architectural terms.  I came away from our discussion with the notion that ATMOS was more of a platform and less of a product with a focus on managing not only stores of data, but also the context, metadata and policies surrounding it.  ATMOS tasted like a service provider play with a nod to very large enterprises who were looking to seriously tread down the path of consolidated and intelligent storage services.

I was really intrigued with the concept of ATMOS, especially when I learned that at least one of the people who works on the team developing it also contributed to the UC Berkeley project called OceanStore from 2005:

OceanStore is a global persistent data store designed to scale to billions of users. It provides a consistent, highly-available, and durable storage utility atop an infrastructure comprised of untrusted servers.

Any computer can join the infrastructure, contributing storage or providing local user access in exchange for economic compensation. Users need only subscribe to a single OceanStore service provider, although they may consume storage and bandwidth from many different providers. The providers automatically buy and sell capacity and coverage among themselves, transparently to the users. The utility model thus combines the resources from federated systems to provide a quality of service higher than that achievable by any single company.

OceanStore caches data promiscuously; any server may create a local replica of any data object. These local replicas provide faster access and robustness to network partitions. They also reduce network congestion by localizing access traffic.

Pretty cool stuff, right?  This just goes to show that plenty of smart people have been working on “Cloud Computing” for quite some time.

Ah, the ‘Storage Cloud.’

Now, while we’ve heard of and seen storage-as-a-service in many forms, including the Cloud, today I saw a really interesting article titled “EMC, AT&T open up Atmos-based cloud storage service:”

EMC Corp.’s Atmos object-based storage system is the basis for two cloud computing services launched today at EMC World 2009 — EMC Atmos onLine and AT&T’s Synaptic Storage as a Service.
EMC’s service coincides with a new feature within the Atmos Web services API that lets organizations with Atmos systems already on-premise “federate” data – move it across data storage clouds. In this case, they’ll be able to move data from their on-premise Atmos to an external Atmos computing cloud.

Boston’s Beth Israel Deaconess Medical Center is evaluating Atmos for its next-generation storage infrastructure, and storage architect Michael Passe said he plans to test the new federation capability.

Organizations without an internal Atmos system can also send data to Atmos onLine by writing applications to its APIs. This is different than commercial graphical user interface services such as EMC’s Mozy cloud computing backup service. “There is an API requirement, but we’re already seeing people doing integration” of new Web offerings for end users such as cloud computing backup and iSCSI connectivity, according to Mike Feinberg, senior vice president of the EMC Cloud Infrastructure Group. Data-loss prevention products from RSA, the security division of EMC, can also be used with Atmos to proactively identify confidential data such as social security numbers and keep them from being sent outside the user’s firewall.

AT&T is adding Synaptic Storage as a Service to its hosted networking and security offerings, claiming to overcome the data security worries many conservative storage customers have about storing data at a third-party data center.

The federation of data across storage clouds using APIs? Information cross-pollination and collaboration? Heavy, man.

Take plays like Cisco’s UCS with VMware’s virtualization and stir in VN-Tag with DLP/ERM solutions and sit it on top of ATMOS…from an architecture perspective, you’ve got an amazing platform for service delivery that allows for some slick application of policy that is information centric.  Sure, getting this all to stick will take time, but these are issues we’re grappling with in our discussions related to portability of applications and information.
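
To make the "information-centric policy" idea a bit more concrete, here's a rough sketch of what writing an object to a policy-aware store might look like.  To be clear, the endpoint, header names and policy name below are invented for illustration (this is not the actual Atmos onLine API); the point is simply that metadata and policy travel with the object at write time, so placement, replication and DLP decisions can key off the information itself rather than the LUN it happens to land on:

import requests

# Hypothetical object-store endpoint and token -- illustrative only,
# not the real Atmos onLine API or its header names.
ENDPOINT = "https://objectstore.example.com/rest/objects"
TOKEN = "example-api-token"

def put_object_with_policy(data, metadata, policy):
    """Write an object along with user metadata and a named policy so that
    placement, replication and DLP decisions can key off the information
    itself rather than the storage it happens to sit on."""
    headers = {
        "Authorization": "Bearer " + TOKEN,
        "Content-Type": "application/octet-stream",
        # Flatten user metadata into a single header: key1=val1,key2=val2
        "x-meta": ",".join("%s=%s" % (k, v) for k, v in metadata.items()),
        # Named policy the store would map to placement/retention rules
        "x-policy": policy,
    }
    resp = requests.post(ENDPOINT, data=data, headers=headers, timeout=30)
    resp.raise_for_status()
    # Assume the store hands back an object identifier in the Location header
    return resp.headers.get("Location")

object_id = put_object_with_policy(
    b"...patient imaging study...",
    {"classification": "restricted", "owner": "radiology"},
    policy="phi-retain-7-years",
)
print("stored as", object_id)

Swap in whatever object API you like; the interesting part is that the policy rides along with the data rather than with the disk.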

Settling Back Down to Earth

This brings up a really important set of discussions that I keep harping on as the cold winds of reality start to blow.

From a security perspective, storage is the moose on the table that nobody talks about.  In virtualized environments we’re interconnecting all our hosts to islands of centralized SANs and NAS.  We’re converging our data and storage networks via CNAs and unified fabrics.

In multi-tenant Cloud environments all our data ends up being stored similarly with the trust that segregation and security are appropriately applied.  Ever wonder how storage architectures never designed to do these sorts of things at scale can actually do so securely? Whose responsibility is it to manage the security of these critical centerpieces of our evolving "centers of data"?

So besides my advice that security folks need to run out and get their CCIE certs, perhaps you ought to sign up for a storage security class, too.  You can also start by reading this excellent book by Himanshu Dwivedi titled “Securing Storage.”

What are YOU doing about securing storage in your enterprise or Cloud engagements?  If your answer is LUN masking, here's four Excedrin, call me after the breach.

/Hoff

The Quandary Of the Cloud: Centralized Compute But Distributed Data

January 7th, 2009 3 comments

Here's a theme I've been banging around for quite some time as it relates to virtualization, cloud computing and security.  I've never really sat down and written about it, however.

As we trend towards consolidating and (re)centralizing our computing platforms — both endpoints and servers — using virtualization and cloud computing as enablers to do so, we're also simultaneously dealing with the decentralized and distributed data sets that come with technologies such as Web 2.0, mobility and the exposure of APIs from cloud platforms.*

So here we are, all frothed up, as virtualization and cloud computing have, in a sense, led us back to the resource-based consolidation of the mainframe model with all its centralized splendor, while client virtualization/thin clients/compartmentalized remote access is doing the same thing for endpoints.

But the interesting thing is that, with Moore's Law, the endpoints are also getting more and more powerful even as we dumb them down and try to limit their exposure, despite the fact that they can still efficiently process and store data locally.

These models, one could argue, are diametrically opposed when describing how to secure the platforms versus the information that resides on or is utilized by them.  As the cyclic waffling between centralized versus distributed continues, the timing of how and where we adapt to securing them always lags behind.  Which do we focus on securing, and where?  The host, the centralized server, or the network?

The unfortunate answer is always "yes."

Remember this (simplified) model of how/where we secure things?
[Diagram: "You Are Here" security model]

If you juxtapose the image above mentally with how I represent the centralized <–> distributed trends in IT below, it's no wonder we're always behind the curve.  The computing model technology changes much more quickly than the security technology and processes do, thus the disconnect:

[Diagram: Compute-Data-Access, centralized vs. distributed trends]
I need to update the diagram above to split out the "computing" layer into client and server as well as extend the data layer to reference storage modalities also, but it gets the job done.

At any rate, it's probably obvious and common sense, but when explaining to people why I spend my time pointing out gaps with security in virtualization and cloud models, I found this useful.

/Hoff

* It's important to note that while I refer to/group cloud computing models as centralized, I understand they have a distributed element to them, also.  I would ask you to think about the multiple cloud overlays as centralized resources, regardless of how intrinsically "distributed" in processing/load balancing they may be.

P.S. I just saw an awesome post titled "The Rise of the Stupid Endpoint" on the vinternals blog that shares many of the same points, although much more eloquently.  Check it out here.  Awesome!

Jaquith: Data-Centric Security Requires Devolution, Not a Revolution

January 6th, 2009 1 comment

If I may be so bold as to call Andy Jaquith a friend, I'll do so as I welcomed both his first research report and blog as an analyst for Forrester.

Andy's first topic — Data-Centric Security Requires Devolution, Not a Revolution — is a doozy, and an important one given the recent re-focus on information protection.  The notion of data-centric security has caused quite the stir over the last year with the maturation, consolidation and (some might say) commoditization of certain marketspaces (DLP) into larger mainstream security product suites.

I will admit that I did not spend the $350 to read Andy's research.  As much as I like to support the ever-turning wheels of the analyst sausage machine, I'm going to upgrade to Apple's newly-announced iLife/iWork '09 bundle instead.  Sorry, Andy.  I'll buy you that beer instead.

However, Andy wrote a great blog entry summarizing the research here:

All of the enterprise's data must be secured… that is obvious. Enterprises have been trying to do this for years with e-mail filtering, hard disk encryption, data leak prevention (DLP) and other technologies. Every few years, another hot technology emerges. But what's less obvious is that the accepted way of tackling the problem — making IT Security the primary responsible party — isn't necessarily the most effective way to do it.

In the report, I take the position that devolution of responsibilities from IT Security to business units is the most important success factor. I'd urge you to read the report for yourself. But in short: as long as data security is just "an IT thing," it's virtually certain that the most accountable parties (BUs) will be able to wash their hands of any responsibility. Depending on the organization, the centralized approach tends to lead to two scenarios:

(1) IT throws up its hands, saying "it's too hard!" — guaranteeing that data security problems breed like rabbits
(2) IT dials up the data controls so tight that end-users and business units rebel against or subvert the controls — leading to even worse problems


What's worse? No controls, or too many? The truth lies somewhere in between, and results vary widely depending on who's accountable: the boss you already know and have a relationship with, or an amorphous cost center whose workers don't know what you do all day. Your boss knows what work products are appropriate to protect, and what aren't. IT Security's role should be to supply the tools to enforce the businesses' wishes, not operate them themselves.

Want to secure enterprise data? Stop trying so hard, and devolve!

My only comments are that much like the X-Files, the truth is "out there."  It is most certainly somewhere in between as users and the business will always take the convenient path of least resistance and security will impose the iron fist. 

Securing information must be a cooperative effort that involves the broader adoption of pervasive discovery and classification capabilities across the entire information lifecycle.  The technology has to become as transparent as possible such that workflow isn't interrupted.  That's no easy task.

Rich Mogull and I have been writing and presenting about this for quite some time, and we're making evolutionary progress, but not revolutionary progress.

To that point, I might have chosen a different by-line.  Instead of "devolution, not a revolution," I would suggest that perhaps "governed delegation, not regulation" might be appropriate, too.

Can't wait for that iLife/iWork bundle!

/Hoff

GigaOm’s Alistair Croll on Cloud Security: The Sky Is Falling!…and So Is My Tolerance For Absurdity

December 14th, 2008 3 comments
I just read the latest blog of Alistair Croll from GigaOm titled "Cloud Security: The Sky Is Falling!" in which he suggests that we pillow-hugging security wonks ought to loosen our death grips on our data because not only are we flapping our worry feathers for nothing, but security in "the Cloud" will result in better security than we have today. 

It's an interesting assertion, really, that despite no innovative changes in the underpinnings of security technology, no advances in security architecture or models and no fundamental security operational enhancements besides the notion of enhanced "monitoring," simply outsourcing infrastructure to a third party "in the cloud" will in some way make security "better," whatever version of "the Cloud" you may be describing:

I don’t believe that clouds themselves will cause the security breaches and data theft they anticipate; in many ways, clouds will result in better security. Here’s why:

    • Fewer humans – Most computer breaches are the result of human error; only 20-40 percent stem from technical malfunctions. Cloud operators that want to be profitable take humans out of the loop whenever possible.
    • Better tools – Clouds can afford high-end data protection and security monitoring tools, as well as the experts to run them. I trust Amazon’s operational skills far more than my own.
    • Enforced processes – You could probably get a co-worker to change your company’s IT infrastructure. But try doing it with a cloud provider without the proper authorization: You simply won’t be able to.
    • Not your employees — Most security breaches are committed by internal employees. Cloud operators don’t work for you. When it comes to corporate espionage, employees are a much more likely target.

    Of course it takes people to muck things up; it always has and always will.  Rushing to embrace a "new" computing model without the introduction of appropriately compensating controls and adapted risk assessment/management methodologies and practices will absolutely introduce new threats, vulnerabilities and risk, at a pace driven by supposed economic incentives that have people initially foaming at their good fortune and then fuming when it all goes bad.

    This comes down to the old maxim: "guns don't kill people, people kill people."  Certainly "the Cloud" alone won't increase breaches and data theft, but using it without appropriate safeguards will.

    This is an issue of squeezing the balloon.  The problem doesn't change in volume, it just changes shape.

    Those of us concerned about security and privacy in cloud computing models have good reason to be concerned; we live with and have lived with these sorts of disruptive innovations and technology before, and it really, really screws things up because the security models and technology we can lean on to manage risk are not adapted to this at all and the velocity of change eclipses our ability to do our jobs competently.

    Further bonking things up is the very definition of "the Cloud(s)" in the first place.

    Despite the obvious differences in business models, use cases and technical architecture, as well as the non-existence of a singularity called "The Cloud," this article generalizes and marginalizes the security challenges of cloud computing regardless.  In fact, it leans so heavily on one leg of the IT stool (people) that it downplays, via the suspension of disbelief, the other two (process and technology) as problems less deserving of attention because they are magically addressed.

    To be fair, I can certainly see Alistair's argument holding water within the context of an SME/SMB with no dedicated expertise in security and little or no existing cost burden in IT infrastructure.  The premise: let your outsourced vendor provide you with the expertise in security you don't have as they have a vested interest to do so and can do it better than you.  

    The argument hinges on two things: that insiders intent on malicious activity by tampering with "infrastructure" are your biggest risk and are eliminated by "the cloud," and that infrastructure and business automation, heretofore highly sought after elements of enterprise modernization efforts, are readily available now and floating about in the cloud despite their general lack of availability in the enterprise.

    So here's what's amusing to me:
    1. It takes humans to operate the cloud infrastructure.  These human operators, despite automation, still suffer from the same scale and knowledge limitations as those in the real world.  Further the service governance layers that translate business process, context and risk into enforceable policy across a heterogeneous infrastructure aren't exactly mature. 
        
    2. The notion that better tools exist in the cloud that haven't as yet been deployed in the larger enterprise seems a little unbelievable.  Again, I agree that this may be the case in the SME/SMB, but it's simply not the case in larger enterprises.  Given issues such as virtualization (which not all cloud providers depend upon, but bear with me) which can actually limit visibility and reach, I'd like to understand what these tools are and why we haven't heard of them before.
    3. The notion that you can get a co-worker to "…change your company's IT infrastructure" but that you can't cause the same impact in the cloud is ludicrous.  Besides, the bulk of breaches result from abuse or escalation of privilege in operating systems and applications, not general "infrastructure," and "the Cloud," having abstracted this general infrastructure from view, leaves bare the ability to abuse the application layer just as ripely.
    4. Finally, Alistair's premise that the bulk of attacks originate internally is misleading.  Alistair's article was written a few days ago, but the Intranet Journal article he cites to bolster that point was written in 2006 and is based upon a study done by CompTIA in 2005.  2005!  That's a lifetime by today's standards.  Has he read the Verizon breach study that empirically refutes many of his points? (*See below in the extended post)
    As someone who has been on the receiving end of, as well as designed and operated, managed (nee Cloud) security as a service for customers globally, there are a number of exceptions to Alistair's assertions regarding the operational security prowess in "the Cloud," with this being the most important:

    As "the Cloud" provider adds customers, the capability to secure the infrastructure and the data transiting it, ultimately becomes an issue of scale, too. The more automation that is added, the more false positives show up, especially in light of the fact that the service provider has little or no context of the information, business processes or business impact that their monitoring tools observe.  You can get rid of the low-hanging fruit, but when it comes down to impacting the business, some human gets involved.

    The automation that Alistair asserts is one of the most important reasons why Cloud security will be better than non-Cloud security ultimately suffers from the same lack-of-eyeballs problem that the enterprise supposedly has in the first place.

    For all the supposed security experts huddled around glowing monitors in CloudSOCs that are vigilantly watching over "your" applications and data in the Cloud, the dirty little secret is that they rely on basically the same operational and technical capabilities as enterprises deploy today, but without context for what it is they are supposedly protecting.  Some rely on less.  In fact, in some cases, unless they're protecting their own infrastructure, they don't do it at all — it's still *your* job to secure the stacks, they just deal with the "pipes."

    We're not all Chicken Littles, Alistair.  Some of us recognize the train when it's heading toward us at full speed and prefer not to be flattened by it, is all.

    /Hoff

    Read more…

    Gunnar Peterson Channels Tina Turner (Sort Of): What’s Happiness Got To Do With It?

    October 29th, 2008 1 comment

    Gunnar just hit a home run responding to John Pescatore's one-line, twelve-word summarization of how to measure a security program's effectiveness.  Read Gunnar's post in its entirety but here's the short version:

    Pescatore says:

    The best security program is at the business with the happiest customers.


    To which Gunnar suggests:

    There's a fine line between happy customers and playing piano in a bordello.

    …and revises Pescatore's assertion to read:

    The best security program is at the business with sustainable competitive advantage.

    To which, given today's economic climate, I argue the following simplification:

    The best security program is at the business that is, itself, sustainable.

    I maintain that if, as John suggests, you want to introduce the emotive index of "happiness" and relate it to a customer's overall experience when interacting with your business, then the best security program is one that isn't seen or felt at all.  Achieving that Zen-like balance is, well, difficult.

    It's hard enough to derive metrics that adequately define a security program's effectiveness, value, and impact on risk.  Balanced scorecard or not, the last thing we need is the introduction of a satisfaction quotient that tries to quantify (on a scale from 1-10?) the "warm and fuzzies" a customer enjoys whilst having their endpoint scanned by a NAC device before attaching to your portal… 😉

    I understand what John was shooting for, but it's like suggesting that there's some sort of happiness I can achieve when I go shopping for car insurance.

    /Hoff

    Security Will Not End Up In the Network…

    June 3rd, 2008 9 comments

    It’s not the destination, it’s the journey, stupid.

    You can’t go a day without reading from the peanut gallery that it is "…inevitable that network security will eventually be subsumed into the network fabric."  I’m not picking on Rothman specifically, but he’s been banging this drum loudly of late.

    For such a far-reaching, profound and prophetic statement, claims like these are strangely myopic and inaccurate…and then they’re exactly right.

    Confused?

    Firstly, it’s sort of silly and obvious to trumpet that "network security" will end up in the "network."  Duh.  What’s really meant is that "information security" will end up in the network, but that’s sort of goofy, too. You’ll even hear that "host-based security" will end up in the network…so let’s just say that what’s being angled at here is that security will end up in the network.

    These statements are often framed within a temporal bracket that simply ignores the bigger picture and reads like a eulogy.  The reality is that historically we have come to accept that security and technology are cyclic, and yet we continue to witness these terminal predictions defining an end state for security that has never arrived and never will.


    Let me make plain my point: there is no final resting place for where and how security will "end up."

    I’m visual, so let’s reference a very basic representation of my point.  This graph represents the cyclic transition over time of where and how we invest in security.

    We ultimately transition between host-based security, information-centric security and network security over time.

    We do this little shuffle based upon the effectiveness and maturity of technology, economics, cultural, societal and regulatory issues and the effects of disruptive innovation.  In reality, this isn’t a smooth sine wave at all; it’s actually more a classic dampened oscillation a la the punctuated equilibrium theory I’ve spoken about before, but it’s easier to visualize this way.
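
    For the math-minded, the shape I have in mind is just the classic damped oscillation (a sketch of the shape, not a fitted model):

    f(t) = A\,e^{-\lambda t}\,\cos(\omega t + \phi)

    where the amplitude of the swing between host-, information- and network-centric investment decays over time until the next disruptive innovation re-excites it.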

    [Diagram: "You Are Here" on the security investment cycle]

    Our investment strategy and where security is seen as being "positioned" reverses direction over time and continues ad infinitum.  This has proven itself time and time again yet we continue to be wowed by the prophetic utterances of people who on the one hand talk about these never-ending cycles and yet on the other pretend they don’t exist by claiming the "death" of one approach over another. 
     

    Why?

    To answer that, let’s take a look at how the cyclic pendulum effect of our focus on security trends from the host to the information to the network and back again by analyzing the graph above.

    1. If we take a look at the arbitrary "starting" point indicated by the "You Are Here" dot on the sine wave above, I suggest that over the last 2-3 years or so we’ve actually headed away from the network as the source of all things security.   

      There are lots of reasons for this: economic, ideological, technological, regulatory and cultural.  If you want to learn more about this, check out my posts on how disruptive innovation fuels strategic transience.

      In short, the network has not been able to (and never will) deliver the efficacy, capabilities or cost-effectiveness desired to secure us from evil, so instead we look at actually securing the information itself.  The security industry messaging of late is certainly bearing testimony to that fact.  Check out this year’s RSA conference…
       

    2. As we then focus on information centricity, we see the resurgence of ERM, governance and compliance.  As policies proliferate, we realize that this is really hard and we don’t have effective and ubiquitous data classification, policy affinity and heterogeneous enforcement capabilities.  We shake our heads at the ineffectiveness of the technology we have and hear the cries of pundits everywhere that we need to focus on the things that really matter…

      In order to ensure that we effectively classify data at the point of creation, we recognize that we can’t do this automagically and we don’t have standardized schemas or metadata across structured and unstructured data, so we’ll look at each other, scratch our heads and conclude that the applications and operating systems need modification to force fit policy, classification and enforcement.

      Rot roh.
       

    3. Now that we have the concept of policies and classification, we need the teeth to ensure it, so we start to overlay emerging technology solutions on the host in applications and via the OS’s that are unfortunately non-transparent and affect the users and their ability to get their work done.  This becomes labeled as a speed bump and we grapple with how to make this less impacting on the business since security has now slowed things down and we still have breaches because users have found creative ways of bypassing technology constraints in the name of agility and efficiency…
       
    4. At this point, the network catches up in its ability to process closer to "line speed," and some of the data classification functionality from the host commoditizes into the "network" — which by then is as much in the form of appliances as it is routers and switches — and always will be.  So as we round this upturn focusing again on being "information centric," with the help of technology, we seek to use our network investment to offset impact on our users.
       
    5. Ultimately, we get the latest round of "next generation" network solutions which promise to deliver us from our woes, but as we "pass go and collect $200" we realize we’re really at the same point we were at point #1.

    ‘Round and ’round we go.

    So, there’s no end state.  It’s a continuum.  The budget and operational elements of who "owns" security and where it’s implemented simply follow the same curve.  Throw in disruptive innovation such as virtualization, and the entire concept of the "host" and the "network" morphs and we simply realize that it’s a shift in period on the same graph.

    So all this pontification that it is "…inevitable that network security will eventually be subsumed into the network fabric" is only as accurate as what phase of the graph you reckon you’re on.  Depending upon how many periods you’ve experienced, it’s easy to see how some who have not seen these changes come and go could be fooled into not being able to see the forest for the trees.

    Here’s the reality we actually already know and should not come to you as a surprise if you’ve been reading my blog: we will always need a blended investment in technology, people and process in order to manage our risk effectively.  From a technology perspective, some of this will take the form of controls embedded in the information itself, some will come from the OS and applications and some will come from the network.

    Anyone who tells you differently has something to sell you or simply needs a towel for the back of his or her ears…

    /Hoff

    GooglePOPs – Cloud Computing and Clean Pipes: Told Ya So…

    May 8th, 2008 9 comments

    In July of last year, I prognosticated that Google, with its various acquisitions, was entering the security space with the intent to not just include security as a browser feature for search and the odd GoogleApp, but to make it a revenue-generating service delivery differentiator using SaaS via applications and clean-pipes delivery transit in the cloud for Enterprises.

    My position even got picked up by thestreet.com.  By now it probably sounds like old news, but…

    Specifically, in my post titled "Tell Me Again How Google Isn’t Entering the Security Market? GooglePOPs will Bring Clean Pipes…" I argued (and was ultimately argued with) that Google’s $625M purchase of Postini was just the beginning:

    This morning’s news that Google is acquiring Postini for $625 Million dollars doesn’t surprise me at all and I believe it proves the point.

    In fact, I reckon that in the long term we’ll see the evolution of the Google Toolbar morph into a much more intelligent and rich client-side security application proxy service whereby Google actually utilizes client-side security of the Toolbar paired with the GreenBorder browsing environment and tunnel/proxy all outgoing requests to GooglePOPs.

    What’s a GooglePOP?

    These GooglePOPs (Google Point of Presence) will house large search and caching repositories that will — in conjunction with services such as those from Postini — provide a "clean pipes" service to the consumer.  Don’t forget utility services that recent acquisitions such as GrandCentral and FeedBurner provide…it’s too bad that eBay snatched up Skype…

    Google will, in fact, become a monster ASP.  Note that I said ASP and not ISP.  ISP is a commoditized function.  Serving applications and content as close to the user as possible is fantastic.  So pair all the client side goodness with security functions AND add GoogleApps and you’ve got what amounts to a thin client version of the Internet.

    Here’s where we are almost a year later.  From the Ars Technica post titled "Google turns Postini into Google Web Security for Enterprise:"

    The company’s latest endeavor, Google Web Security for Enterprise, is now available, and promises to provide a consistent level of system security whether an end-user is surfing from the office or working at home halfway across town.

    The new service is branded under Google’s "Powered by Postini" product line and, according to the company, "provides real-time malware protection and URL filtering with policy enforcement and reporting. An additional feature extends the same protections to users working remotely on laptops in hotels, cafes, and even guest networks." The service is presumably activated by signing in directly to a Google service, as Google explicitly states that workers do not need access to a corporate network.

    The race for cloud and secure utility computing continues with a focus on encapsulated browsing and application delivery environments, regardless of transport/ISP, starting to take shape.   

    Just think about the traditional model of our enterprise and how we access our resources today turned inside out as a natural progression of re-perimeterization.  It starts to play out on the other end of the information centricity spectrum.
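
    The plumbing for that sort of clean-pipes model is pretty mundane from the client’s point of view: point everything outbound at the in-cloud proxy and it no longer matters which network you happen to be sitting on.  A trivial sketch of the idea (the proxy hostname and port are made up for illustration, not an actual Google service):

    import requests

    # Hypothetical clean-pipes proxy ("GooglePOP") -- illustrative hostname only
    CLEAN_PIPE_PROXY = "http://pop1.clean-pipes.example.net:3128"
    PROXIES = {"http": CLEAN_PIPE_PROXY, "https": CLEAN_PIPE_PROXY}

    # All outbound requests transit the cloud proxy, where malware protection,
    # URL filtering and policy enforcement would happen before anything reaches us.
    resp = requests.get("http://example.com/", proxies=PROXIES, timeout=10)
    print(resp.status_code)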

    What with the many new companies entering this space and the likes of Google, Microsoft and IBM banging the drum, it’s going to be one interesting ride.

    /Hoff

    Of Course Defense-In-Depth, er, Defense-In-Breadth Works!

    May 7th, 2008 6 comments

    I don’t know what the hell Ptacek and crew are on about.  Of course defense-in-depth, er, defense-in-breadth is effective.  It’s heresy to suggest otherwise.  Myopic, short-sighted, and heretical, I say!

    In support, I submit into evidence People’s Exhibit #1, from here, your honor:

    [Image: TSA’s 20 layers of security]

    …and I quoteth:

    We use layers of security to ensure the security of the traveling public and the Nation’s transportation system.

    Each one of these layers alone is capable of stopping a terrorist attack. In combination their security value is multiplied, creating a much stronger, formidable system.  A terrorist who has to overcome multiple security layers in order to carry out an attack is more likely to be pre-empted, deterred, or to fail during the attempt.

    Yeah!  Get some! It’s just like firewalls, IPS, and AV, bitches!  Mo’ is betta!

    It’s patently clear that Ptacek simply doesn’t layer enough, is all.  See, Rothman, you don’t need to give up!

    "Twenty is the number and the number shall be twenty!"

    How’s that for a metric?
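
    For the record, the naive arithmetic behind "in combination their security value is multiplied" goes like this: if each of the twenty layers independently stops an attack with probability p, the chance of something slipping through all of them is (1 - p)^{20}, which at p = 0.2 works out to roughly 1.2 percent.  The whole exhibit, of course, hangs on that word "independently," and my numbers are invented purely for illustration, which seems fitting.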

    That is all.

    /Hoff

    Asset Focused, Not Auditor Focused

    May 3rd, 2008 5 comments

    Gunnar Peterson wrote a great piece the other day on the latest productization craze in InfoSec – GRC (Governance, Risk Management and Compliance) wherein he asks "GRC – To Be or To Do?"

    I don’t really recall when or from whence GRC sprung up as an allegedly legitimate offering, but to me it seems like a fashionably over-sized rug under which the existing failures of companies to effectively execute on the individual G, R, and C initiatives are conveniently being swept.

    I suppose the logic goes something like this: "If you can’t effectively govern, manage risk or measure compliance, it must be because what you’re doing is fragmented and siloed.  What you need is a product/framework/methodology that takes potentially digestible deliverables and perspectives and "evolves" them into a behemoth suite instead?"

    I do not dispute that throughout most enterprises, the definitions, approaches and processes in managing each function are largely siloed and fragmented and I see the attractiveness of integrating and standardizing them, but  I am unconvinced that re-badging a control and policy framework collection constitutes a radical new approach. 

    GRC appears to be a way to sell more products and services under a fancy new name to address problems rather than evaluate and potentially change the way in which we solve them.  Look at who’s pushing this: large software companies and consultants as well as analysts looking to pin their research to something more meaningful.

    From a first blush, GRC isn’t really about governance or managing risk.  It’s audit-driven compliance all tarted up.

    It’s a more fashionable way of getting all your various framework and control definitions in one place and appealing to an auditor’s desire for centralized "stuff" in order to document the effectiveness of controls and track findings against some benchmark.  I’m not really sure where the business-driven focus comes into play?

    It’s also sold as a more efficient way of reducing the scope and costs of manual process controls.  Fine.  Can’t argue with that.  I might even say it’s helpful, but at what cost?

    Gunnar said:

    GRC (or Governance, Risk Management, and Compliance for the uninitiated) is all the rage, but I have to say I think that again Infosec has the wrong focus.

    Instead of Risk Management helping to deliver transparent Governance and as a natural by-product demonstrate compliance as a function of the former, the model’s up-ended with compliance driving the inputs and being mislabeled.

    As I think about it, I’m not sure GRC would be something a typical InfoSec function would purchase or use unless forced, which is part of the problem.  I see internal audit driving the adoption, which, given today’s pressures (especially in public companies), would first start with establishing gaps against regulatory compliance.

    If the InfoSec function is considering an approach that focuses on protecting the things that matter most and managing risk to an acceptable level, one that is not compliance-driven but rather built upon a business- and asset-driven approach, then rather than make a left turn toward GRC, Gunnar suggested:

    Personally, I am happy sticking to classic infosec knitting – delivering confidentiality, integrity, and availability through authentication, authorization, and auditing. But if you are looking for a next generation conceptual horse to bet on, I don’t think GRC is it, I would look at information survivability. Hoff’s information survivability primer is a great starting point for learning about survivability.

    Why survivability is more valuable over the long haul than GRC is that survivability is focused on assets not focused on giving an auditor what they need, but giving the business what it needs.

    Seminal paper on survivability by Lipson, et al. "survivability solutions are best understood as risk management strategies that first depend on an intimate knowledge of the mission being protected." Make a difference – asset focus, not auditor focus.

    For obvious reasons, I am compelled to say "me, too."

    I would really like to talk to someone in a large enterprise who is using one of these GRC suites — I don’t really care which department you’re from.  I just want to examine my assertions and compare them against my efforts and understanding.

    /Hoff

    Welcome To the Information Survivability/Sustainability/Centricity Circus…

    May 3rd, 2008 No comments

    Forget "Security Theater."  The "Security Circus" is in town…

    I wrote this some time ago and decided that I didn’t like the tone as it just came out as another whiny complaint against the "man."  I’m in a funny mood as I hit a threshold yesterday with all the so-called experts coming out of the woodwork lately, so I figured I’d post it because it made me chortle. 

    They Shoot Horses, Don’t They?

    To answer what seems to be a question increasing in frequency due to the surge in my blog’s readership lately, as well as being cycled through the gossip mill, I did not change the name of my blog from "Rational Security" to "Rational Survivability" due to IBM’s Val Rahmani’s charming advertisement keynote at RSA.  😉

    One might suggest that Val’s use of the mythological reference to Sisyphus wasn’t as entertaining as Noonan’s "security as the width of two horses’ asses" keynote from a couple of years ago, but her punchline served to illustrate the sad state of Information Security, even if it also wanted to make me shoot myself.

    Val’s shocking admission that IBM was "…exiting the security business," that "…information security was dead," and that we should all celebrate by chanting "…long live [information] sustainability!"

    This caused those of us here at Rational Survivability HQ to bow our heads in a moment of silence for the passing of yet another topical meme and catchphrase that has now been "legitimized" by industry and thus must be put out of its misery and never used again.

    You say "tomato," I say "tomato…"

    Yeah, you might argue that "sustainability" is more business-focused and less military-sounding than "survivability," but it’s really about the same concepts.

    I’m not going to dissect her speech because that’s been done.  I have said most of what I have to say on this concept in my posts on Information Survivability and honestly, I think they are as relevant as ever. 

    You can read the first one here and follow on with the some more, here. 

    For those of you who weren’t around when it happened, I changed the name of my blog over six months ago to illustrate what is akin to the security industry’s equivalent of an introduction at an AA meeting and was so perfectly illustrated by Val’s fireside chat. 

    You know the scene.  It’s where an alcoholic stands up and admits his or her weaknesses for a vice amongst an audience of current and "former" addicts.  Hoping for a collective understanding of one’s failure and declaring the observed days of committed sobriety to date,  the goal is to convince oneself and those around you that the counter’s been reset and you’ve really changed.  Despite the possibility of relapse at any moment, the declaration of intent — the will to live sober — is all one needs.

    That and a damned good sponsor.

    And now for something completely different!

    That was a bloody depressing analogy, wasn’t it?  Since this was supposed to be a happy occasion, I found myself challenged to divine an even worse analogy for your viewing pleasure.   Here goes.

    That’s right.  I’m going to violate the Prime Directive and go right with the patented Analog Of Barnum & Bailey’s Circus:

    What Information Security has become is the equivalent of a carnie’s dancing poodle in the circus tent of industry. 

    Secretly we want to see the tigers eat the dude with the whip, but we cheer when he makes them do the Macarena anyway.

    We all know that one day, that little Romanian kid on the trapeze is going to miss the triple-lindy and crash to the floor sans net, but we’re not willing to do anything about it and it’s the tension that makes the act work, despite the exploitative child labor practices and horrible costumes.

    We pump $180 in tokens into the ring toss to win an $11 stuffed animal, because it’s the effort that counts, not the price.

    We’re all buying tickets, suffering through the stupid antics of the clowns piling out of the tiny little car in the spotlight hoping that the elephant act at the end of the show is going to be worth the price of admission. 

    At the end of the night, we leave exhausted, disappointed, broke and smelling like sweaty caramel apples and stale pretzels…wondering when they’ll be back next year so we can take the kids.

    See, I told you it was awful.  But you know what’s much worse than my shitty little clown analogy? 

    Reality.

    Come one, come all.  Let Me Guess Your Weight!

    So in today’s time of crappy economics when money is hard to come by, it’s now as consumers that we start to pay attention to these practices — this circus.  It’s now that we start to demand that these alleged predatory vendors actually solve our business problems and attend to our issues rather than simply recycle the packaging.

    So when life hands vendors a lemon, they make marketingade, charge us $4.50 a pop and we still drink it.

    Along those lines, many mainstream players have now begun to work their marketing sideshows by pitching the supposedly novel themes of sustainability, survivability, or information centricity.  It’s a surreptitiously repentant admission that all the peanuts and popcorn they’ve been selling us, while all along we ooh and ahh at the product equivalents of the bearded lady, the werewolf children and the world’s tallest man, still climax at the realization that it’s all just an act.

    At the end of the night, they count their money, tear down the tents and move on.  When the bearded lady gets a better gig, she bails and they bring in the dude with the longest mustache.  Hey, hair is hair; it’s just packaged differently, and we go to ogle at the newest attraction.

    There’s no real punchline here folks, just the jaded, bitter and annoyed comments of someone who’s becoming more and more like the grumpy folks he always made fun of at bingo night and a stark realization of just how much I hate the circus.

    /Hoff