
Profiling Data at the Network Layer and Controlling Its Movement Is a Bad Thing?

I’ve been watching what looks like a car crash in slow motion, and for some strange reason I feel some affinity with, and/or responsibility for, what is unfolding in the debate between Rory and Rob.

What motivated me to comment on this ongoing exploration of data-centric security was Rory’s latest post, in which he appears to refer to some points I raised in my original post but still seems bent on the idea that the crux of my concept is tied to DRM:

So… am I anti-security? Nope, I’m extremely pro-security. My feeling, however, is that the best way to implement security is in ways that are invisible to users. Every time you make ordinary business people think about security (e.g., usernames/passwords) they try their darndest to bypass those requirements.

That’s fine and I agree.  The concept of ADAPT is completely transparent to "users."  This doesn’t obviate the fact that someone will have to be responsible for characterizing what is important and relevant to the business in terms of "assets/data," attaching weight/value to them, and setting some policies regarding how to mitigate impact and ultimately risk.
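
To make that a little more concrete, here is a minimal, purely illustrative sketch of what such a characterization might look like once it is reduced to policy: a handful of hypothetical data classes, the weight/value the business attaches to each, where each class is allowed to live, and what should happen when that class of content turns up somewhere it shouldn’t. None of these names or fields come from ADAPT itself; they are assumptions invented for the example.

```python
# Hypothetical policy definition: data classes the business has characterized,
# the weight/value attached to each, the zones where each class may live,
# and the action to take on a violation. Names and fields are illustrative only.
ADAPT_POLICY = {
    "customer-pii": {
        "weight": 0.9,                       # impact to the business if it leaks
        "allowed_zones": {"internal", "dmz-crm"},
        "action_on_violation": "block",
    },
    "source-code": {
        "weight": 0.7,
        "allowed_zones": {"internal"},
        "action_on_violation": "quarantine",
    },
    "public-marketing": {
        "weight": 0.1,
        "allowed_zones": {"internal", "dmz-crm", "internet"},
        "action_on_violation": "log",
    },
}
```

The users still never see any of this; it is owned by whoever is responsible for the data, not by the people moving it.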

Personally I’m a great fan of network segregation and defence in depth at the network layer. I think that devices like the ones Crossbeam produce are very useful in coming up with risk profiles on a network-by-network basis rather than a data basis, and in managing traffic in that way. The reason for this is that the segregation and protections can then be applied without the intervention of end-users and without them (hopefully) having to know about what security is in place.

So I think you’re still missing my point.  The example I gave of the X-Series using ADAPT takes a combination of best-of-breed security software components such as FW, IDP, WAF, XML, AV, etc. and provides the segregation you describe.  HOWEVER, the (r)evolutionary delta here is that ADAPT’s profiling of content, driven by policies that are invisible to the user and applied at the network layer, allows one to make security decisions on content in context and to control how data moves.

So, to use the phrase that I’ve seen in other blogs on this subject, I think that the "zones of trust" are a great idea, but the zones shouldn’t be based on the data that flows over them but on the user/machine that is used. It’s the idea of tagging all that data with the right tags and controlling its flow that bugs me.

…and thus it’s obvious that I completely and utterly disagree with this statement.  Without tying some sort of identity (pseudonymity) to the user/machine AND combining it with identification of the channels (applications) and the content (payload), you simply cannot make an informed decision as to the legitimacy of the movement/delivery of this data.
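
As a rough, hypothetical sketch of why all three signals matter, consider the following; nothing here is the actual ADAPT implementation, and every name, rule, and field is an assumption invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Flow:
    """Context for one observed transfer; every field is illustrative."""
    user_id: str           # pseudonymous identity tied to the user/machine
    application: str       # the channel, e.g. "smtp" or "webmail"
    content_class: str     # classification of the payload, e.g. "customer-pii"
    destination_zone: str  # where the data is headed

# Hypothetical rule: customer PII may only move over approved applications,
# to approved zones, and only when moved by an approved (pseudonymous) identity.
RULES = {
    "customer-pii": {
        "allowed_users": {"batch-service", "crm-admin"},
        "allowed_applications": {"sftp-batch", "crm-sync"},
        "allowed_zones": {"internal", "dmz-crm"},
        "action_on_violation": "block",
    },
}

def decide(flow: Flow, rules: dict = RULES) -> str:
    """Allow or block based on identity, channel, and content taken together."""
    rule = rules.get(flow.content_class)
    if rule is None:
        return "allow"  # no data-centric rule applies to this content class
    legitimate = (flow.user_id in rule["allowed_users"]
                  and flow.application in rule["allowed_applications"]
                  and flow.destination_zone in rule["allowed_zones"])
    return "allow" if legitimate else rule["action_on_violation"]

# The same payload is legitimate or not depending on who moves it and over what:
print(decide(Flow("crm-admin", "crm-sync", "customer-pii", "dmz-crm")))  # allow
print(decide(Flow("jsmith", "webmail", "customer-pii", "internet")))     # block
```

Strip out any one of the three inputs and the two cases above become indistinguishable, which is exactly the point.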

I used the case of being able to utilize client-side tagging as an extension to ADAPT, NOT as a dependency.  Go back and re-read the post; it’s a network-based transparent tagging process that attaches the tag to the traffic as it moves around the network.  I don’t understand why that would bug you.
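
To be explicit about what "network-based transparent tagging" could mean in practice, here is one way it might work, sketched with invented names and with the tag kept in the inline device’s own flow table rather than written into the packets. Whether the tag travels with the traffic or stays inside the inspection layer is a design choice; either way, clients, switches, and routers never participate and never need to understand it.

```python
# Minimal, hypothetical sketch of transparent tagging inside an inline device.
# The tag is internal state of the inspection engine; endpoints and the rest of
# the network are never modified and never see it.
FlowKey = tuple  # (src_ip, src_port, dst_ip, dst_port, proto)

class FlowTagger:
    def __init__(self, classifier):
        self.classifier = classifier          # any content-inspection callable
        self.flow_tags: dict[FlowKey, str] = {}

    def observe(self, key: FlowKey, payload: bytes) -> str:
        """Classify a flow the first time it is seen and let every subsequent
        packet of that flow inherit the same tag."""
        if key not in self.flow_tags:
            self.flow_tags[key] = self.classifier(payload)
        return self.flow_tags[key]

# Example with a trivial stand-in classifier:
tagger = FlowTagger(lambda p: "customer-pii" if b"SSN:" in p else "unclassified")
key = ("10.0.0.5", 51515, "203.0.113.7", 25, "tcp")
print(tagger.observe(key, b"SSN: 123-45-6789"))   # "customer-pii"
```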

So that’s where my points in the previous post came from, and I still reckon they’re correct. Data tagging and parsing relies on the existence of standards and their uptake in the first instance, and then on users *actually using them*, and personally I think that’s not going to happen in general companies and therefore is not the best place to be focusing security effort…

Please explain this to me: what standards need to exist in order to tag data, unless of course you’re talking about the heterogeneous exchange and integration of tagging data at the client side across platforms?  Not so if you do it at the network layer WITHIN the context of the architecture I outlined; the clients, switches, routers, etc. don’t need to know a thing about the process, as it’s done transparently.

I wasn’t arguing that this is the be-all and end-all of data-centric security, but it’s worth exploring without weighing it down with the negative baggage of DRM or of the DLP/extrusion-prevention technologies and methodologies that exist today.

ADAPT is doable and real; stay tuned.

/Hoff

  1. June 3rd, 2007 at 00:28

    Data Tagging requisites..

    Rational Security: Profiling Data At the Network-Layer and Controlling It's Movement Is a Bad Thing? Well I'm going to try and answer Hoff's question on standards I think need to exist before ADAPT or any other data classification and security…

  2. June 3rd, 2007 at 02:14

    I spent several hours and risked divorce last night explaining these concepts generically in my blog.
    I did so very humbly (hence the series titles) and with deference to Rory. I'm afraid he is still misunderstanding us and confusing data-centric security with DRM. The issue is that of being inside or outside of a network. He is thinking outside of a network, with security applied only to the data; we are saying that data-centric security requires a context in which to be applied.
    I'm prepared to keep on repeating the same things over and over again…
