Search Results

Keyword: ‘private cloud’

On the Draft NIST Working Definition Of Cloud Computing…

May 8th, 2009 6 comments

How many of you have seen the Draft NIST Working Definition Of Cloud Computing? It appears to have been presented to government CIOs at the recent Federal CIO Cloud Computing Summit in Washington, DC last week.

I saw the draft NIST Working Definition of Cloud Computing shown below (copied from Reuven Cohen’s blog) about a month and a half ago, but until now I had not seen it presented in its entirety outside of the copy I was sent, and I didn’t know how or when it would be made “public,” so I didn’t blog directly about its content.

The reason I was happy to see it when I did was that I had just finished writing the draft of the Cloud Security Alliance Security Guidance for Critical Areas of Focus In Cloud Computing, specifically the section on Cloud architecture, and found that there was very good alignment between our two independent works (much like with the Jericho Cloud Cube model).

In fact, you’ll see that I liked the definitions for the SPI model components so much that I used them and directly credited Peter Mell from NIST, one of the authors of the work.

I sent a very early draft of my work along with some feedback to Peter on some of the definitions, specifically because I noted some things I did not fully agree with in the deployment models section. The “community” cloud seems to me to be an abstraction or application of private clouds; I have a “managed cloud” instead.  Ah, more fuel for good discussion.

I hoped we could have discussed them prior to publishing either of the documents, but we passed in the ether as it seems.

At any rate, here’s the draft from our wily Canadian friend:

4-24-09

Peter Mell and Tim Grance – National Institute of Standards and Technology, Information Technology Laboratory

Note 1: Cloud computing is still an evolving paradigm. Its definitions, use cases, underlying technologies, issues, risks, and benefits will be refined in a spirited debate by the public and private sectors. These definitions, attributes, and characteristics will evolve and change over time.

Note 2: The cloud computing industry represents a large ecosystem of many models, vendors, and market niches. This definition attempts to encompass all of the various cloud approaches.

Definition of Cloud Computing:

Cloud computing is a pay-per-use model for enabling available, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is comprised of five key characteristics, three delivery models, and four deployment models.

Key Characteristics:

  • On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed without requiring human interaction with each service’s provider.
  • Ubiquitous network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).
  • Location independent resource pooling. The provider’s computing resources are pooled to serve all consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. The customer generally has no control or knowledge over the exact location of the provided resources. Examples of resources include storage, processing, memory, network bandwidth, and virtual machines.
  • Rapid elasticity. Capabilities can be rapidly and elastically provisioned to quickly scale up and rapidly released to quickly scale down. To the consumer, the capabilities available for rent often appear to be infinite and can be purchased in any quantity at any time.
  • Pay per use. Capabilities are charged using a metered, fee-for-service, or advertising based billing model to promote optimization of resource use. Examples are measuring the storage, bandwidth, and computing resources consumed and charging for the number of active user accounts per month. Clouds within an organization accrue cost between business units and may or may not use actual currency.

Note: Cloud software takes full advantage of the cloud paradigm by being service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability.

Delivery Models:

  • Cloud Software as a Service (SaaS). The capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure and accessible from various client devices through a thin client interface such as a Web browser (e.g., web-based email). The consumer does not manage or control the underlying cloud infrastructure, network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
  • Cloud Platform as a Service (PaaS). The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created applications using programming languages and tools supported by the provider (e.g., java, python, .Net). The consumer does not manage or control the underlying cloud infrastructure, network, servers, operating systems, or storage, but the consumer has control over the deployed applications and possibly application hosting environment configurations.
  • Cloud Infrastructure as a Service (IaaS). The capability provided to the consumer is to rent processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly select networking components (e.g., firewalls, load balancers).

Deployment Models:

  • Private cloud. The cloud infrastructure is owned or leased by a single organization and is operated solely for that organization.
  • Community cloud. The cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations).
  • Public cloud. The cloud infrastructure is owned by an organization selling cloud services to the general public or to a large industry group.
  • Hybrid cloud. The cloud infrastructure is a composition of two or more clouds (internal, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting).

Each deployment model instance has one of two types: internal or external. Internal clouds reside within an organization’s network security perimeter and external clouds reside outside the same perimeter.

Now, Reuven Cohen mentioned on his blog:

In creating this definition, NIST consulted extensively with the private sector including a wide range of vendors, consultants and industry pundants (sic!) including your (sic!) truly. Below is the draft NIST working definition of Cloud Computing. I should note, this definition is a work in progress and therefore is open to public ratification & comment. The initial feedback was very positive from the federal CIO’s who were presented it yesterday in DC. Baring any last minute lobbying I doubt we’ll see many more major revisions.

…which is interesting, because for something that is “…open to public ratification & comment,” I can’t seem to find it anywhere except for references to its creation as a deliverable in FY09 in a presentation from December 2008.  I searched NIST’s site, but perhaps I’m just having a bad search day.

Clearly at least I have a couple of comments.  I could send them to Peter directly, but I’d rather discuss them openly if that’s appropriate and there is a forum to do so.  At this rate, it looks as though it may be too late, however.

/Hoff

Cloud Security Alliance Releases Initial Whitepaper At RSA Conference 2009

April 25th, 2009 No comments

Hopefully by now you’ve heard that the Cloud Security Alliance team released our initial effort aimed at identifying key elements and practices in securing Cloud Computing.  Check the link below to download it.

There was a ton of work done in an extremely short timeframe.  There’s still a ton of work to be done. The 83 pages or so represent a good first pass.  It’s not perfect and we didn’t aim for it to be so.  You’ll find things you may disagree with or think need clarification; please let us know.

As we break down these sections further, we really want people with subject matter expertise in each of the domains to get involved.  We want to take what we have and make it more valuable, more specific and more actionable.

We hope you’ll join us in this effort.

Cloud Security Alliance identifies key practices for secure adoption of Cloud Computing

San Francisco, CA, April 22, 2009 – The information security industry is taking on the task of providing guidance to enable secure Cloud Computing with today’s formal launch of the Cloud Security Alliance. The Cloud Security Alliance’s inaugural whitepaper, “Security Guidance for Critical Areas of Focus in Cloud Computing”, is now available on the Cloud Security Alliance website, and a presentation of the findings will be made at the RSA conference today at 2:45pm at Orange Room 312 in the Moscone Center.

The Cloud Security Alliance is a not-for-profit organization with a mission to promote the use of best practices for providing security assurance within Cloud Computing, and to provide education on the uses of Cloud Computing to help secure all other forms of computing. The founding thought leaders behind the formation of the Cloud Security Alliance are leading security practitioners from a range of private and public organizations and leading security companies PGP Corporation, Qualys, Inc. and Zscaler, Inc.

“Aggressive adoption of cloud computing is clearly underway. The convergence of inexpensive computing, pervasive mobility and virtualization technologies has created a platform for more agile and cost effective business applications and IT infrastructure,” said Jerry Archer, Chief Information Security Officer at Intuit, Inc. and part of the CISO leadership at the Cloud Security Alliance, “The cloud is forcing thoughtful adaptation of certain security controls, while creating an even greater demand for best practices in security program governance.”

The whitepaper being presented at RSA, “Security Guidance for Critical Areas of Focus in Cloud Computing”, outlines key issues and provides advice for both Cloud Computing customers and providers within 15 strategic domains. According to Alliance co-founders Nils Puhlmann and Jim Reavis, the several months of collaboration was worth the effort, “We would like to thank the many contributors to this initial effort. The great diversity of services offered via cloud computing requires careful analysis to understand the risks and mitigation appropriate in each case. At the same time, we see enormous potential for the cloud model to eventually simplify many difficult security problems. This initial deliverable is just the beginning of our efforts, and we would like to extend an open invitation to industry experts to help us create additional best practices for practitioners and the industry.”

The Cloud Security Alliance is building its guidance by engaging with experts from a variety of backgrounds to reflect the many organizational participants that will be involved in cloud computing decisions. Joshua Davis, Director of Information Security & Compliance at Qualcomm and a member of the Cloud Security Alliance, sees this collaboration as timely. “The information risk management factors one must consider when leveraging cloud computing, especially legal and regulatory compliance issues, represent uncharted territory for many enterprises. The Cloud Security Alliance is bringing together information security and legal experts, along with many other domains of knowledge, to see these issues from every stakeholder’s point of view.”

The guidance whitepaper is available online at www.cloudsecurityalliance.org/guidance. Open discussion is welcome at our LinkedIn group and on Twitter at #cloudsa.

About Cloud Security Alliance

The Cloud Security Alliance is a not-for-profit organization with a mission to promote the use of best practices for providing security assurance within Cloud Computing, and to provide education on the uses of Cloud Computing to help secure all other forms of computing. The Cloud Security Alliance is led by industry practitioners and supported by founding charter companies PGP Corporation, Qualys, Inc. and Zscaler, Inc. For further information, the Cloud Security Alliance website is www.cloudsecurityalliance.org.

/Hoff

Jericho Forum’s Cloud Cube Model…Rubik, Rubric and Righteous!

April 16th, 2009 No comments

I’m looking forward to the RSA conference this year; I am going to get to discuss Virtualization and Cloud Computing security a lot.

One of the events I’m really looking forward to is a panel discussion at the Jericho Forum’s event (Wednesday the 22nd, starting at 3pm) with some really good friends of mine and members of the Forum.

We’re going to be discussing Jericho’s Cloud Cube Model:

[Figure: Jericho Forum Cloud Cube Model]

I think that the Cloud Cube does a nice job describing the multi-dimensional elements of Cloud Computing and frames not only Cloud use cases, but also how they are deployed and utilized.

Here’s why I am excited: if you look at the Cube above and the table I built below in this blog (The Vagaries Of Cloudcabulary: Why Public, Private, Internal & External Definitions Don’t Work…) to get deeper into Cloud definitions, despite the differences in names you will notice some remarkable similarities, especially the notion of how “internal/external” is called out separately from “perimeterized/de-perimeterized.” This is akin to my table’s “Infrastructure located” and “managed by” column headings.  Further, the “outsourced/insourced” dimension maps to my “managed by” column.  I also like the “proprietary/open” dimension, which I didn’t include in my table but did reference in my Frogs presentation; I think I’ll extend the table to show that, also. (A rough sketch of the Cube’s dimensions as independent axes follows the table below.)

[Table: Hoff’s cloud taxonomy table from “The Vagaries Of Cloudcabulary”]
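To make the “independent dimensions” point concrete, here’s a tiny, purely illustrative sketch that models the Cube’s axes as independent enumerations. The names simply echo the Jericho dimensions and my column headings; nothing here is drawn from the Jericho Forum’s own materials.

```java
// Purely illustrative: the four Cloud Cube dimensions modeled as independent
// axes, so any given cloud offering is simply a point in the cube.
public class CloudCube {
    enum Location { INTERNAL, EXTERNAL }                 // where the infrastructure sits
    enum Perimeter { PERIMETERIZED, DE_PERIMETERIZED }   // inside or outside the traditional boundary
    enum Sourcing { INSOURCED, OUTSOURCED }              // who operates/manages it
    enum Ownership { PROPRIETARY, OPEN }                 // technology/interface ownership

    // One classified deployment is a combination of all four axes.
    record Deployment(Location location, Perimeter perimeter,
                      Sourcing sourcing, Ownership ownership) { }

    public static void main(String[] args) {
        // Example: how a typical public IaaS offering might be classified
        Deployment publicIaas = new Deployment(
                Location.EXTERNAL, Perimeter.DE_PERIMETERIZED,
                Sourcing.OUTSOURCED, Ownership.PROPRIETARY);
        System.out.println(publicIaas);
    }
}
```

Any given offering is then just a point in that four-dimensional space, which is exactly the property that makes the Cube useful for framing use cases rather than forcing everything into a single public/private bucket.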

I am very much looking forward to discussing this on the panel.  I’ve been preaching about the Jericho Forum since my religious conversion many years ago.

As I said in my Frogs preso, Cloud Computing is the evolution of the “re-perimeterization” model on steroids.

/Hoff

Incomplete Thought: Looking At An “Open & Interoperable Cloud” Through Azure-Colored Glasses

March 29th, 2009 4 comments

As with the others in my series of “incomplete thoughts,” this one is focused on an issue that has been banging around in my skull for a few days.  I’m not sure how to properly articulate my thought completely, so I throw this up for consideration, looking for your discussion to clarify my thinking.

You may have heard of the little dust-up involving Microsoft and the folk(s) behind the Open Cloud Manifesto. The drama here reminds me of the Dallas episode where everyone tried to guess who shot J.R., and it’s really not the focus of this post.  I use it here for color.

What is the focus of this post is the notion of “open(ness),” portability and interoperability as they relate to Cloud Computing — or more specifically, how these terms relate to the infrastructure and enabling platforms of Cloud Computing solution providers.

I put “openness” in quotes because definitionally, there are as many representations of this term as there are for “Cloud,” which is a big part of the problem.  Just to be fair, before you start thinking I’m unduly picking on Microsoft, I’m not. I challenged VMware on the same issues.

So here’s my question as it relates to Microsoft’s strategy for Azure, given this excerpt from Microsoft’s Steven Martin describing his employer’s stance on Cloud (in regard to the Cloudifesto debacle above) in his blog post titled “Moving Toward an Open Process On Cloud Computing Interoperability“:

From the moment we kicked off our cloud computing effort, openness and interop stood at the forefront. As those who are using it will tell you, the Azure Services Platform is an open and flexible platform that is defined by web addressability, SOAP, XML, and REST.  Our vision in taking this approach was to ensure that the programming model was extensible and that the individual services could be used in conjunction with applications and infrastructure that ran on both Microsoft and non-Microsoft stacks.

What got me going was this ZDNet interview by Mary Jo Foley, in which she interviewed Julius Sinkevicius, Microsoft’s Director of Product Management for Windows Server, loosely comparing Cisco’s Cloud strategy to Microsoft’s and pointing out an apparent lack of interoperability between Microsoft’s own virtualization and Cloud Computing platforms:

MJF: Did Cisco ask Microsoft about licensing Azure? Will Microsoft license all of the components of Azure to any other company?

Sinkevicius: No, Microsoft is not offering Windows Azure for on premise deployment. Windows Azure runs only in Microsoft datacenters. Enterprise customers who wish to deploy a highly scalable and flexible OS in their datacenter should leverage Hyper-V and license Windows Server Datacenter Edition, which has unlimited virtualization rights, and System Center for management.

MJF: What does Microsoft see as the difference between Red Dog (Windows Azure) and the OS stack that Cisco announced?

Sinkevicius: Windows Azure is Microsoft’s runtime designed specifically for the Microsoft datacenter. Windows Azure is designed for new applications and allows ISVs and Enterprises to get geo-scale without geo-cost.  The OS stack that Cisco announced is for customers who wish to deploy on-premise servers, and thus leverages Windows Server Datacenter and System Center.

The source of the on-premise Azure hosting confusion appears to be this: All apps developed for Azure will be able to run on Windows Server, according to the Softies. However — at present — the inverse is not true: Existing Windows Server apps ultimately may be able to run on Azure. For now only some can do so, and only with a fairly substantial amount of tweaking.

Microsoft’s cloud pitch to enterprises who are skittish about putting their data in the Microsoft basket isn’t “We’ll let you host your own data using our cloud platform.” Instead, it’s more like: “You can take some/all of your data out of our datacenters and run it on-premise if/when you want — and you can do the reverse and put some/all of your data in our cloud if you so desire.”

What confuses me is how Azure, as a platform, will be limited to deployment only in Microsoft’s operating environment (i.e., their datacenters) and not offered for use outside of that environment, and how that squares with Martin’s statements above regarding interoperability.

Doesn’t the proprietary nature of the Azure runtime platform, “open” or not via API, by definition limit its openness and interoperability? If I can’t take my applications and information and operate them anywhere without major retooling, how does that imply openness, portability and interoperability?

If one cannot do that fully between Windows Server and Azure — both from the same company — what chance do we have between instances running across different platforms not from Microsoft?

The issue at hand to consider is this:

If you do not have one-to-one parity between the infrastructure that provides your cloud “externally” versus “internally” (and similarly public versus private clouds), can you truly claim openness, portability and interoperability?

What do you think?

More On Clouds & Botnets: MeatClouds, CloudFlux, LeapFrog, EDoS and More!

March 13th, 2009 6 comments

After my "Frogs" talk at Source Boston yesterday, Adam O'Donnell and I chatted about one of my chuckle slides I threw up in the presentation in which I give some new names to some (perhaps not new) attack/threat scenarios which involve Cloud Computing:


  • MeatCloud – Essentially abusing Amazon's Mechanical Turk and using it to produce the Cloud version of a sweatshop; exploiting the ignorant for fun and profit to perform menial illegal muling tasks on your behalf…think SETI meets underage garment workers…
  • CloudFlux – Take a mess of stolen credit cards, open up a slew of Amazon AWS accounts using them, build/scale to thousands of instances overnight, launch a carpet bomb attack (you choose), tear it down/have it torn down, and move your botnet elsewhere…rinse, lather, repeat…
  • LeapFrog – As we move to hybrid private/public clouds and load balancing/cloudbursting across multiple cloud providers, we'll interconnect Clouds via VPNs to the "trusted internals" of your Cloudbase… Attackers will thank us by abusing these tunnels to penetrate your assets through the, uh, back door.
  • vMotion Poison Potion – When VMware's vCloud makes its appearance and we start to allow vMotion across datacenters and across Clouds (in the clear?), imagine the fun we'll have as we see attacks against vMotion protocols and VM state…
  • EDoS – Economic Denial of Sustainability – Covered previously here

Adam mentioned that I might have considered that botnets are a great example of a Cloud-based service, and he wrote a very cool piece about it on ZDNet here.

I remembered after the fact that I wrote a related blog post on the topic several months ago titled "Cloud Computing: Invented by Criminals, Secured by ???" as a riff on something Reuven Cohen wrote.

/Hoff

Ron Popeil and Cloud Computing In Poetic Review…

February 27th, 2009 No comments


The uptake of computing
using the cloud,
would make the king of all marketeers
— Ron Popeil — proud

He's the guy who came out
with the canned spray on hair,
the oven you set and forget
without care

He had the bass fishing rod
you could fit in your pocket,
the Veg-O-Matic appliance
with which you could chop it

Mr. Microphone, it seems, 
was ahead of its time
Karaoke meets Facebook
Oh, how divine!

The smokeless ashtray,
the Cap Snaffler, drain buster
selling you all of the crap
Infomercials could muster

His inventions solved problems
some common, some new
If you ordered them quickly
he might send you two!

Back to the Cloud
and how it's related
to the many wonders
that Sir Ron has created

The cloud fulfills promises
that IT has made:
agility, better service
at a lower pay grade

You can scale up, scale down
pay for just what you use
Elastic infrastructure
what you get's what you choose

We've got public and private,
outside and in,
on-premise, off-premise
thick platforms or thin

The offerings are flooding
the wires en masse
Everything, it now seems,
is some sort of *aaS

You've got infrastructure,
platforms, software and storage.
Integration, SOA 
with full vendor whoreage

Some folks equate
virtualization with cloud
The platform providers
shout this vision out loud

'Course the OS contingent
has something to say
that cloud and virt
is part of their play

However you see it,
and whatever its form
the Cloud's getting bigger
it's starting to storm

Raining down on us all
is computational glory
but I wonder, dear friends,
'bout the end of this story

Will the Cloud truly bring value?
Solve problems that matter?
Or is it about 
vendors' wallets a-fatter?

*I* think the Cloud
has wonderful promise
If the low-hanging IT fruit
can be lifted 'way from us

The Cloud is a function
that's forging new thought
Pushing the boundaries
and theories we've bought

It's profoundly game changing
and as long as we focus
and don't buy into the
hyped hocus pocus

So before we end up
with a Cloud that "slices and dices"
that never gets dull,
mashes, grates, grinds and rices

It's important to state
what problem we're solving
so the Cloud doesn't end up
with its value de-evolving

—-

BTW, if you want to see more of my Cloud and Security poems, just check here.

Interesting Read: The World Privacy Forum’s Cloud Privacy Report

February 25th, 2009 No comments

The World Privacy Forum released their "Cloud Privacy Report" written by Robert Gellman two days ago. It's an interesting read that describes the many facets of data privacy concerns in Cloud environments: 

This report discusses the issue of cloud computing and outlines its implications for the privacy of personal information as well as its implications for the confidentiality of business and governmental information. The report finds that for some information and for some business users, sharing may be illegal, may be limited in some ways, or may affect the status or protections of the information shared. The report discusses how even when no laws or obligations block the ability of a user to disclose information to a cloud provider, disclosure may still not be free of consequences. The report finds that information stored by a business or an individual with a third party may have fewer or weaker privacy or other protections than information in the possession of the creator of the information. The report, in its analysis and discussion of relevant laws, finds that both government agencies and private litigants may be able to obtain information from a third party more easily than from the creator of the information. A cloud provider’s terms of service, privacy policy, and location may significantly affect a user’s privacy and confidentiality interests.


I plan to spend some time reading through the report in more depth, but I enjoyed my cursory review thus far, especially some of the coverage related to issues such as FCRA, bankruptcy, Cloud provider ownership, disclosure, etc.  Many of these issues are near and dear to my heart.

You can download the report here.

/Hoff

Berkeley RAD Lab Cloud Computing Paper: Above the Clouds or In the Sand?

February 19th, 2009 2 comments

I've waffled on how, or even if, I would write my critique of the Berkeley RAD Lab's paper titled "Above the Clouds: A Berkeley View of Cloud Computing."

I think I've had a hard time deciding where the authors have their heads, hence the title.

Those of you who know me are probably chuckling at the fact that I was a good boy and left off the potential third cranial location option…

Many people have written their respective reviews of the work, including James Urquhart, David Linthicum and Chuck Hollis, who all did a nice job summarizing various perspectives.

I decided to add my $0.02 because it occurred to me that despite several issues I have with the paper, two things really haven't been appropriately discussed:
  1. The audience for the paper
  2. Expectations of the reader 

The goals of the paper were fairly well spelled out and, within the context of what was written, the authors achieved many of them.

Given that it was described as a "view" of Cloud Computing and not the definitive work on the subject, I think perhaps the baby has been unfairly thrown out with the bath water even when balanced with the "danger" that the general public or press may treat it as gospel.

I think the reason there has been so much frothy reaction to this paper from the "Cloud community" is that, because the paper comes from the Electrical Engineering/Computer Science department of UC Berkeley, many readers expect a certain level of technical depth and a more holistic (dare I say empirical) model for analysis, and their expectations are therefore set accordingly.

Most of the reviews that might be perceived as negative are coming from folks who are reasonably technical, of which I am one.

To that point and that of item #1 above, I don't feel that "we" are the intended audience for this paper and thus, to point #2 above, our expectations — despite the goals of the paper — were not met.

That being said, I do have issues with the paper: the authors' definition of cloud computing is unnecessarily obtuse, their refusal to discuss the differences between the de facto SPI model and its variants is annoying and short-sighted, and their dismissal of private clouds as irrelevant is quite disturbing.  The notion that Cloud Computing must be "external" to an enterprise and use the Internet as a transport is simply delusional.

Eschewing de facto reference models because the authors could not agree amongst themselves on the differences between them — despite consensus in industry outside of academia and even models like the one I've been working on — comes across as myopic and insulated.

Ultimately I think the biggest miss of the paper was the fact that they did not successfully answer "What is Cloud Computing and how is it different from previous paradigm shifts such as Software as a Service (SaaS)?"  In fact, I came away from the paper with the feeling that Cloud Computing is SaaS…

However, I found the coverage of the business drivers, economic issues and the top 10 obstacles to be very good and that people unfamiliar with Cloud Computing would come away with a better understanding — not necessarily complete — of the topic.

It was an interesting read that is complementary to much of the other work going on right now in the field.  I think we should treat it as such and move on.

/Hoff

CloudSQL – Accessing Datastores in the Sky using SQL…

December 2nd, 2008 5 comments
I think this is definitely a precursor of things to come and introduces some really interesting security discussions to be had regarding the portability, privacy and security of datastores in the cloud.

Have you heard of Zoho?  No?  Zoho is a SaaS vendor that describes itself thusly:

Zoho is a suite of online applications (services) that you sign up for and access from our Website. The applications are free for individuals and some have a subscription fee for organizations. Our vision is to provide our customers (individuals, students, educators, non-profits, small and medium sized businesses) with the most comprehensive set of applications available anywhere (breadth); and for those applications to have enough features (depth) to make your user experience worthwhile.

Today, Zoho announced the availability of CloudSQL, which is middleware that allows customers who use Zoho's SaaS apps to "…access their data on Zoho SaaS applications using SQL queries."

From their announcement:

Zoho CloudSQL is a technology that allows developers to interact with business data stored across Zoho Services using the familiar SQL language. In addition, JDBC and ODBC database drivers make writing code a snap – just use the language construct and syntax you would use with a local database instance. Using the latest Web technology no longer requires throwing away years of coding and learning.

Zoho CloudSQL allows businesses to connect and integrate the data and applications they have in Zoho with the data and applications they have in house, or even with other SaaS services. Unlike other methods for accessing data in the cloud, CloudSQL capitalizes on enterprise developers’ years of knowledge and experience with the widely‐used SQL language. This leads to faster deployments and easier (read: less expensive) integration projects.

Basically, CloudSQL is interposed between the suite of Zoho applications and the backend datastores, functioning as an intermediary that receives SQL queries against the pooled data sets using standard SQL commands and dialects. See the diagram below for a better idea of what this looks like.

[Diagram: Zoho CloudSQL architecture]
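To give a feel for what that announcement describes, here's a minimal sketch of querying a cloud-hosted datastore over JDBC. The driver URL, credentials and the "Invoices" table below are hypothetical placeholders rather than anything taken from Zoho's documentation; the point is simply that the calling pattern is the same one you'd use against a local database.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CloudSqlQueryExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical JDBC URL and credentials -- placeholders, not a real
        // connection string for any provider. The calling pattern is what matters.
        String url = "jdbc:cloudsql://example-cloud-endpoint/workspace";

        try (Connection conn = DriverManager.getConnection(url, "api-user", "api-key");
             Statement stmt = conn.createStatement();
             // Query a hypothetical "Invoices" table exposed by the service
             ResultSet rs = stmt.executeQuery(
                     "SELECT customer, amount FROM Invoices WHERE amount > 1000")) {
            while (rs.next()) {
                System.out.println(rs.getString("customer") + " : " + rs.getDouble("amount"));
            }
        }
    }
}
```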
What's really interesting about allowing native SQL access is the ability to then allow much easier information interchange between apps/databases on an enterprise's "private cloud(s)" and the Zoho "public" cloud.

Further, it means that your data is more "portable" as it can be backed up, accessed, and processed by applications other than Zoho's.  Imagine if they were to extend the SQL exposure to other cloud/SaaS providers…this is where it will get really juicy. 

This sort of thing *will* happen.  Customers will see the absolute utility of exposing their cloud-based datastores and sharing them amongst business partners, much in the spirit of how it's done today, but with the datastores (or chunks of them) located off-premises.

That's all good and exciting, but obviously security questions/concerns immediately surface regarding such things as authentication, encryption, access control, input sanitization, privacy and compliance…

Today our datastores typically live inside the fortress with multiple layers of security and proxied access from applications, shielded from direct access, and yet we still have basic issues with attacks such as SQL injection.  Imagine how much fun we can have with this!
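To ground the SQL injection worry in something concrete: the classic failure mode is assembling the query text by string concatenation, and a parameterized statement is the standard mitigation. Again, the table and column names here are made-up placeholders; the contrast in patterns is what matters, and it applies to a cloud-exposed datastore exactly as it does on-premises.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class InjectionContrast {

    // Vulnerable: attacker-controlled input is concatenated straight into the
    // SQL text, so input like  x' OR '1'='1  rewrites the query itself.
    static void findCustomerUnsafe(Connection conn, String name) throws SQLException {
        String sql = "SELECT id, name FROM Customers WHERE name = '" + name + "'";
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getLong("id") + " " + rs.getString("name"));
            }
        }
    }

    // Safer: a parameterized query keeps the input out of the SQL text; the
    // driver binds it as data, not as query structure.
    static void findCustomerSafe(Connection conn, String name) throws SQLException {
        try (PreparedStatement ps =
                     conn.prepareStatement("SELECT id, name FROM Customers WHERE name = ?")) {
            ps.setString(1, name);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id") + " " + rs.getString("name"));
                }
            }
        }
    }
}
```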

The best I could find regarding security and Zoho came from their FAQ, which doesn't exactly inspire confidence given that they address logical/software security by suggesting that anti-virus software is the best line of defense for protecting your data, that "data encryption" will soon be offered as an "option," and that (implied) SSL will make you secure:

6. Is my data secured?

Many people ask us this question. And rightly so; Zoho has invested a lot of time and money to ensure that your information is secure and private. We offer security on multiple levels including the physical, software and people/process levels; In fact your data is more secure than walking around with it on a laptop or even on your corporate desktops.

Physical: Zoho servers and infrastructure are located in the most secure types of data centers that have multiple levels of restrictions for access including: on-premise security guards, security cameras, biometric limited access systems, and no signage to indicate where the buildings are, bullet proof glass, earthquake ratings, etc.

Hardware: Zoho employs state of the art firewall protection on multiple levels eliminating the possibility of intrusion from outside attacks

Logical/software protection: Zoho deploys anti-virus software and scans all access 24 x7 for suspicious traffic and viruses or even inside attacks; All of this is managed and logged for auditing purposes.

Process: Very few Zoho staff have access to either the physical or logical levels of our infrastructure. Your data is therefore secure from inside access; Zoho performs regular vulnerability testing and is constantly enhancing its security at all levels. All data is backed up on multiple servers in multiple locations on a daily basis. This means that in the worst case, if one data center was compromised, your data could be restored from other locations with minimal disruption. We are also working on adding even more security; for example, you will soon be able to select a "data encryption" option to encrypt your data en route to our servers, so that in the unlikely event of your own computer getting hacked while using Zoho, your documents could be securely encrypted and inaccessible without a "certificate" which generally resides on the server away from your computer.

Fun times ahead, folks.

/Hoff

Cloud Computing: Invented By Criminals, Secured By ???

November 3rd, 2008 10 comments

I was reading Reuven Cohen's "Elastic Vapor: Life In the Cloud Blog" yesterday, and he wrote an interesting piece on what is being coined "Fraud as a Service."  Basically, Reuven describes the rise of botnets as the origin of "cloud" based service utilities, as chronicled in Uri Rivner's talk at RSA Europe:

I hate to tell you this, it wasn't Amazon, IBM or even Sun who invented cloud computing. It was criminal technologists, mostly from eastern Europe who did. Looking back to the late 90's and the use of decentralized "warez" darknets. These original private "clouds" are the first true cloud computing infrastructures seen in the wild. Even way back then the criminal syndicates had developed "service oriented architectures" and federated id systems including advanced encryption. It has taken more then 10 years before we actually started to see this type of sophisticated decentralization to start being adopted by traditional enterprises.

The one sentence that really clicked for me was the following:

In this new world order, cloud computing will not just be a requirement for scaling your data center but also protecting it.

Amen. 

One of the obvious benefits of cloud computing is the distribution of applications, services and information.  The natural by-product of this is additional resiliency from operational downtime caused by error or malicious activity.

This benefit is also a forcing function; it will require new security methodologies and technology to allow the security (policies) to travel with the applications and data, as well as to enforce them.

I wrote about this concept back in 2007 as part of my predictions for 2008 and highlighted it again in a post titled "Thinning the Herd and Chlorinating the Malware Gene Pool," based on some posts by Andy Jaquith:

Grid and distributed utility computing models will start to creep into security

A really interesting by-product of the "cloud compute" model is that as data, storage, networking, processing, etc. get distributed, so shall security.  In the grid model, one doesn't care where the actions take place so long as service levels are met and the experiential and business requirements are delivered.  Security should be thought of in exactly the same way.

The notion that you can point to a physical box and say it performs function 'X' is so last Tuesday. Virtualization already tells us this.  So, imagine if your security processing isn't performed by a monolithic appliance but instead is contributed to in a self-organizing fashion wherein the entire ecosystem (network, hosts, platforms, etc.) all contribute in the identification of threats and vulnerabilities as well as function to contain, quarantine and remediate policy exceptions.

Sort of sounds like that "self-defending network" schpiel, but not focused on the network and with common telemetry and distributed processing of the problem. Check out Red Lambda's cGrid technology for an interesting view of this model.

This basically means that we should distribute the sampling, detection and prevention functions across the entire networked ecosystem, not just to dedicated security appliances; each of the end nodes should communicate using a standard signaling and telemetry protocol so that common threat, vulnerability and effective disposition can be communicated up and downstream to one another and one or more management facilities.
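To make the "common signaling and telemetry protocol" idea from that last paragraph slightly more concrete, here is a purely illustrative sketch of the kind of event record end nodes might exchange with peers and with a management facility. None of this corresponds to an existing protocol or product; every field and type name is invented for the example.

```java
import java.time.Instant;
import java.util.List;

// Purely illustrative sketch of a shared telemetry record that any node
// (host, network device, platform) could emit and any peer or management
// facility could consume; field names are invented for this example.
public class ThreatTelemetry {

    enum Disposition { OBSERVED, CONTAINED, QUARANTINED, REMEDIATED }

    record Event(String nodeId,            // which node saw it
                 Instant observedAt,
                 String threatIndicator,   // e.g., a signature or behavior label
                 double confidence,        // 0.0 - 1.0
                 Disposition disposition) { }

    // A node would publish its observations both "up" to management and
    // "across" to peers, so the ecosystem converges on a common view.
    interface TelemetryBus {
        void publish(Event event);
        List<Event> recent(String threatIndicator);
    }
}
```

A real implementation would obviously need authentication and integrity protection on the signaling channel itself, which is consistent with the security concerns raised throughout this post.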

It will be interesting to watch companies, established and emerging, grapple with this new world.

/Hoff