Archive for November, 2008

When The Carrot Doesn’t Work, Try a Stick: VMware Joins PCI SSC…

November 12th, 2008

I've made no secret of my displeasure with the PCI Security Standards Council's lack of initiative when it comes to addressing the challenges and issues associated with virtualization and PCI compliance*. 

My last post on the topic brought to light an even more extreme example of the evolution of virtualization's mainstream adoption and focused on the implications that cloud computing brings to bear on the PCI DSS.

I was disheartened to find that, upon inquiring as to the status of the formation of and participation in a virtualization-specific special interest group (SIG), the SSC's email response to me was as follows:

On Oct 29, 2008, at 1:24 PM, PCI Participation wrote:

Hello Christofer,

Thank you for contacting the PCI Security Standards Council. At this
time, there is currently no Virtualization SIG.
The current SIGs are
Pre-Authorization and Wireless.

Please let us know if you are interested in either of those groups.

The PCI Security Standards Council

-----Original Message-----
From: Christofer Hoff []
Sent: Wednesday, October 29, 2008 12:58 PM
To: PCI Participation
Subject: Participation in the PCI DSS Virtualization SIG?

How does one get involved in the PCI DSS Virtualization SIG?


Christofer Hoff

The follow-on email said there were no firm plans to form a virtualization SIG. <SIGh>

So, assuming that was the carrot approach, I'm happy to see that VMware has taken the route that only money, influence and business necessity can bring: the virtualization vendor 'stick.' To wit (and a head-nod to David Marshall):

VMware is Joining PCI Security Standards Council as Participating Organization

VMware, the global leader in virtualization solutions from the
desktop to the datacenter, announced today that it is joining the PCI
Security Standards Council. As a participating organization, VMware
will work with the council to evolve the PCI Data Security Standard
(DSS) and other payment card data protection standards. This will help
those VMware customers in the retail industry who are required to meet
these standards to remain compliant while leveraging VMware
virtualization. VMware has also launched the VMware Compliance Center Web site,
an initiative to help educate merchants and auditors about how to
achieve, maintain and demonstrate compliance in virtual environments to
meet a number of industry standards, including the PCI DSS.

As a participating organization, VMware will now have access to the
latest payment card security standards from the council, be able to
provide feedback on the standards and become part of a growing
community that now includes more than 500 organizations. In an era of
increasingly sophisticated attacks on systems, adhering to the PCI DSS
represents a significant aspect of an entity's protection against data criminals. By joining as a participating organization, VMware is adding its voice to the process.

"The PCI Security Standards Council is committed to helping everyone involved in the payment chain protect consumer payment data," said Bob Russo, general manager of the PCI Security Standards Council. "By participating in the standards-setting process, VMware demonstrates it is playing an active part in this important end goal."

Let's see if this leads to the formation of a virtualization SIG or at least a timetable for when the DSS will be updated with virtualization-specific guidelines. I'd like to see other virtualization vendors also become participating organizations in the PCI SSC.


* Here are a couple of my other posts on PCI compliance and virtualization:

Categories: Virtualization, VMware

I Can Haz TCG IF-MAP Support In Your Security Product, Please…

November 10th, 2008

In my previous post titled "Cloud Computing: Invented By Criminals, Secured By ???" I described the need for a new security model, methodology and set of technologies in the virtualized and cloud computing realms built to deal with the dynamic and distributed nature of evolving computing:

This basically means that we should distribute the sampling, detection and
prevention functions across the entire networked ecosystem, not just to
dedicated security appliances; each of the end nodes should communicate
using a standard signaling and telemetry protocol so that common
threat, vulnerability and effective disposition can be communicated up
and downstream to one another and one or more management facilities.
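The signaling idea in the quote above can be reduced to a toy sketch: end nodes publish what they observe onto a shared bus, and any interested component reacts. This is my own minimal illustration, not any vendor's implementation; the class, topic, and event names are all invented for the example.

```python
from collections import defaultdict

class TelemetryBus:
    """Toy publish/subscribe bus: any node on the network can report an
    observation, and any interested component (IPS, firewall, management
    console) can subscribe to a topic and act on what it hears."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every subscriber of this topic.
        for handler in self._subs[topic]:
            handler(event)

bus = TelemetryBus()

# An enforcement point subscribes to threat telemetry:
quarantined = []
bus.subscribe("threat", lambda event: quarantined.append(event["host"]))

# Any end node -- not just a dedicated security appliance -- publishes
# what it sees, and the disposition propagates to all subscribers:
bus.publish("threat", {"host": "10.0.0.5", "signature": "worm-xyz"})
```

In a real deployment the bus would be a network protocol with authentication and a common event schema rather than in-process callbacks, but the up-and-downstream flow is the same.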

Greg Ness from Infoblox reminded me in the comments of that post of something I was very excited about when it became news at Interop this past April: the Trusted Computing Group's (TCG) extension to the Trusted Network Connect (TNC) architecture, called IF-MAP.

IF-MAP is a standardized real-time publish/subscribe/search mechanism that uses a client/server, XML-based SOAP protocol to provide information about network security objects and events, including their state and activity:

IF-MAP extends the TNC architecture to support standardized, dynamic data interchange among a wide variety of networking and security components, enabling customers to implement multi-vendor systems that provide coordinated defense-in-depth.
Today’s security systems – such as firewalls, intrusion detection and prevention systems, endpoint security systems, data leak protection systems, etc. – operate as “silos” with little or no ability to “see” what other systems are seeing or to share their understanding of network and device behavior. 

This limits their ability to support coordinated defense-in-depth. 
In addition, current NAC solutions are focused mainly on controlling
network access, and lack the ability to respond in real-time to
post-admission changes in security posture or to provide visibility and
access control enforcement for unmanaged endpoints.  By extending TNC
with IF-MAP, the TCG is providing a standard-based means to address
these issues and thereby enable more powerful, flexible, open network
security systems.

While the TNC architecture was initially designed to support NAC solutions, extending it so that any security product can subscribe to a common telemetry and information-exchange/integration protocol is a fantastic idea.
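To make the publish/subscribe/search model concrete, here is a sketch of what constructing an IF-MAP-style "publish" request might look like: a client binds two identifiers (an IP address and a MAC address) with a piece of metadata inside a SOAP envelope. The SOAP namespace is standard; the IF-MAP namespace URIs and element names follow my reading of the TNC IF-MAP 1.0 spec and may not match it exactly, so treat this strictly as illustrative.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://www.w3.org/2003/05/soap-envelope"
# Assumed IF-MAP 1.0 namespaces -- check the TCG spec for the real URIs:
IFMAP_NS = "http://www.trustedcomputinggroup.org/2006/IFMAP/1"
META_NS = "http://www.trustedcomputinggroup.org/2006/IFMAP-METADATA/1"

def build_publish(ip: str, mac: str) -> bytes:
    """Build a publish request binding an IP address to a MAC address."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    publish = ET.SubElement(body, f"{{{IFMAP_NS}}}publish")
    update = ET.SubElement(publish, "update")
    # The two identifiers this metadata links together:
    id1 = ET.SubElement(update, "identifier")
    ET.SubElement(id1, "ip-address", {"value": ip, "type": "IPv4"})
    id2 = ET.SubElement(update, "identifier")
    ET.SubElement(id2, "mac-address", {"value": mac})
    # The metadata itself: an observed ip-mac association.
    meta = ET.SubElement(update, "metadata")
    ET.SubElement(meta, f"{{{META_NS}}}ip-mac")
    return ET.tostring(env)

request = build_publish("192.0.2.7", "00:11:22:33:44:55")
```

A real IF-MAP client would POST this over an authenticated HTTPS session to a MAP server, and other clients would learn of the binding via search or subscribe/poll.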


I'm really interested in how many vendors outside of the NAC space are including IF-MAP in their roadmaps. While IF-MAP has potential in conventional, non-virtualized infrastructure, I see a tremendous need for it in our move to Infrastructure 2.0 with virtualization and cloud computing.

Integrating IF-MAP with VM-introspection capabilities (in VMsafe, XenAccess, etc.), for example, would be fantastic: you could tie the control planes of the hypervisors, management infrastructure, and provisioning/governance engines to those of security and compliance in near real-time.

You can read more about the TCG's TNC IF-MAP specification here.



Cloud Computing: Invented By Criminals, Secured By ???

November 3rd, 2008

I was reading Reuven Cohen's "Elastic Vapor: Life In the Cloud Blog" yesterday and he wrote an interesting piece on what is being coined "Fraud as a Service."  Basically, Reuven describes the rise of botnets as the origin of "cloud"-based service utilities, as chronicled in Uri Rivner's talk at RSA Europe:

I hate to tell you this, it wasn't Amazon, IBM or even Sun who invented
cloud computing. It was criminal technologists, mostly from eastern
Europe, who did. Looking back to the late 90's and the use of
decentralized "warez" darknets, these original private "clouds" are the
first true cloud computing infrastructures seen in the wild. Even way
back then the criminal syndicates had developed "service oriented
architectures" and federated ID systems, including advanced encryption.
It has taken more than 10 years before we actually started to see this
type of sophisticated decentralization start being adopted by
traditional enterprises.

The one sentence that really clicked for me was the following:

In this new world order, cloud computing will not just be a requirement for scaling your data center but also protecting it.


One of the obvious benefits of cloud computing is the distribution of applications, services and information.  The natural by-product of this is additional resiliency against operational downtime caused by error or malicious activity.

This benefit is also a forcing function: it will require new security methodologies and technologies that allow security policies to travel with the applications and data, and that enforce those policies.
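One way to picture "policy traveling with the application" is a workload descriptor that carries its own security policy, which any node must check before admitting the workload. This is a hypothetical sketch; every field and function name here is invented for illustration, not taken from any product.

```python
# Hypothetical sketch: the policy rides with the workload descriptor,
# and whichever node the workload lands on checks that it can honor
# the policy before admitting it.
workload = {
    "app": "billing-svc",
    "policy": {"encrypt_at_rest": True, "allowed_ports": [443]},
}

def admit(node_capabilities: dict, workload: dict) -> bool:
    """Admit the workload only if this node can enforce its attached policy."""
    policy = workload["policy"]
    if policy["encrypt_at_rest"] and not node_capabilities.get("encrypted_storage"):
        return False
    return True

capable_node_ok = admit({"encrypted_storage": True}, workload)
bare_node_ok = admit({}, workload)
```

The point is that enforcement follows the data wherever it is placed, instead of depending on which physical box it happens to sit behind.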

I wrote about this concept back in 2007 as part of my predictions for 2008 and highlighted it again in a post titled: "Thinning the Herd and Chlorinating the Malware Gene Pool" based on some posts by Andy Jaquith:

Grid and distributed utility computing models will start to creep into security
A really interesting by-product of the "cloud compute" model is that as
data, storage, networking, processing, etc. get distributed, so shall
security.  In the grid model, one doesn't care where the actions take
place so long as service levels are met and the experiential and
business requirements are delivered.  Security should be thought of in
exactly the same way. 

The notion that you can point to a
physical box and say it performs function 'X' is so last Tuesday.
Virtualization already tells us this.  So, imagine if your security
processing isn't performed by a monolithic appliance but instead is
contributed to in a self-organizing fashion wherein the entire
ecosystem (network, hosts, platforms, etc.) all contribute in the
identification of threats and vulnerabilities as well as function to
contain, quarantine and remediate policy exceptions.

Kind of sounds like that "self-defending network" spiel, but not focused
on the network and with common telemetry and distributed processing of
the problem.
Check out Red Lambda's cGrid technology for an interesting view of this model.

This basically means that we should distribute the sampling, detection and
prevention functions across the entire networked ecosystem, not just to
dedicated security appliances; each of the end nodes should communicate
using a standard signaling and telemetry protocol so that common
threat, vulnerability and effective disposition can be communicated up
and downstream to one another and one or more management facilities.

It will be interesting to watch companies, established and emerging, grapple with this new world.