Thin Clients: Does This Laptop Make My Ass(ets) Look Fat?

Juicy Fat Assets, Ripe For the Picking…

So here’s an interesting spin on de/re-perimeterization…if people think we cannot achieve (and cannot afford to wait for) secure operating systems, secure protocols and self-defending, information-centric environments, but need to "secure" their environments today, I have a simple question, supported by a simple equation for illustration:

For the majority of mobile and internal users in a typical corporation who use the basic set of applications:

  1. Assume a company that:
    …fits within the 90% of companies that still have data centers, isn’t completely outsourced/off-shored for IT, supports a remote workforce that uses a Microsoft OS and the usual suspect applications, and doesn’t plan on utilizing distributed grid computing or widespread third-party SaaS
  2. Take the following:
    Data Breaches.  Lost Laptops.  Non-sanitized corporate hard drives on eBay.  Malware.  Non-compliant asset configurations.  Patching woes.  Hardware failures.  Device Failure.  Remote Backup issues.  Endpoint Security Software Sprawl.  Skyrocketing security/compliance costs.  Lost Customer Confidence.  Fines.  Lost Revenue.  Reduced budget.
  3. Combine With:
    Cheap Bandwidth.  Lots of types of bandwidth/access modalities.  Centralized Applications and Data. Any Web-enabled Computing Platform.  SSL VPN.  Virtualization.  Centralized Encryption at Rest.  IAM.  DLP/CMP.  Lots of choices to provide thin-client/streaming desktop capability.  Offline-capable Web Apps.
  4. Shake Well, Re-allocate Funding, Streamline Operations and "Security"…
  5. You Get:
    Less Risk.  Less Cost.  Better Control Over Data.  More "Secure" Operations.  Better Resilience.  Assurance of Information.  Simplified Operations.  Easier Backup.  One Version of the Truth (data).

I really just don’t get why we continue to deploy, and are forced to support, remote platforms we can’t protect; allow our data to inhabit islands we can’t control; and at the same time admit the inevitability of disaster while continuing to spend our money on solutions that can’t possibly solve the problems.

If we’re going to be information-centric, we should take the first rational and reasonable steps toward doing so. Until the operating systems are more secure and the data can self-describe and cause the compute and network stacks to "self-defend," why do we continue to waste time focusing on the endpoint?

If we can isolate and reduce the number of avenues of access to data and leverage dumb presentation platforms to do it, why aren’t we?

…I mean besides the fact that an entire industry has been leeching off this mess for decades…


I’ll Gladly Pay You Tuesday For A Secure Solution Today…

The technology exists TODAY to centralize the bulk of our most important assets and allow our workforce to accomplish their goals and the business to function just as well (perhaps better) without the need for data to actually "leave" the data centers in whose security we have already invested so much money.

Many people are already doing that with their servers through the adoption of virtualization.  Now they need to do the same with their clients.

The only reason we’re now going absolutely stupid and spending money on securing endpoints in their current state is that we’re CAUSING (not just allowing) data to leave our enclaves.  In fact, with all this blabla2.0 hype, we’ve convinced ourselves we must.

Hogwash.  I’ve posted on the consumerization of IT where companies are allowing their employees to use their own compute platforms.  How do you think many of them do this?

Relax, Dude…Keep Your Firewalls…

In the case of centralized computing and streamed desktops to dumb/thin clients, the "perimeter" still includes our data centers and security castles/moats, but it also encapsulates a streamed, virtualized, encrypted, and authenticated thin-client session bubble.  The endpoint is no longer something to worry about; it’s nothing more than a flickering display with a keyboard/mouse.

Let your kid use Limewire.  Let Uncle Bob surf pr0n.  Let wifey download spyware.  If my data and applications don’t live on the machine and all the clicks/mouseys are just screen updates, what do I care?
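
To make that point concrete, here is a deliberately minimal sketch of the shape of such a session: keyboard/mouse events go up, rendered pixels come down, and the files themselves never leave the data center. This is purely illustrative, not how real protocols (ICA, RDP, RFB) actually frame things; the host name and the framing below are made up.

```python
# Hypothetical, minimal illustration of the thin-client idea: the only
# traffic is input events (up) and rendered screen regions (down).
# Real protocols (ICA, RDP, RFB) are far richer; this only shows the shape.

import json
import socket
import ssl


def open_session(host: str, port: int = 443) -> ssl.SSLSocket:
    """Wrap the session in TLS so the whole 'bubble' is encrypted."""
    context = ssl.create_default_context()
    raw = socket.create_connection((host, port))
    return context.wrap_socket(raw, server_hostname=host)


def send_input_event(conn: ssl.SSLSocket, event: dict) -> None:
    """Keyboard/mouse events are the only thing the client ever sends."""
    conn.sendall(json.dumps(event).encode() + b"\n")


def receive_screen_update(conn: ssl.SSLSocket) -> bytes:
    """What comes back is pixels, not files: a compressed region of the
    remote framebuffer.  There is no document on this end to lose."""
    header = conn.recv(8)                       # hypothetical 8-byte length prefix
    length = int.from_bytes(header, "big")
    update = b""
    while len(update) < length:
        update += conn.recv(length - len(update))
    return update                               # hand off to the local renderer


if __name__ == "__main__":
    conn = open_session("broker.example.internal")    # hypothetical session broker
    send_input_event(conn, {"type": "key", "code": "Enter"})
    frame = receive_screen_update(conn)
    print(f"received {len(frame)} bytes of screen, zero bytes of corporate data")
```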

Yup, you can still use a screen scraper or a camera phone to use data inappropriately, but this is where balancing risk comes into play.  Let’s keep the discussion within the reasonable 80% of cases.  We’ll never eliminate 100% of the risk, and we don’t have to in order to be successful.

Sure, there are exceptions and corner cases where data *does* need to leave our embrace, but we can eliminate an entire class of problem if we take advantage of what we have today and stop this endpoint madness.

This goes for internal corporate users who are chained to their desks and not just mobile users.

What’s preventing you from doing this today?

/Hoff

  1. January 10th, 2008 at 15:44 | #1

    What's preventing me from doing this today?
    A. The fact that the viewing has to be done through a browser that (1) caches information locally and (2) is vulnerable to shitloads of attacks.
    B. The fact that I have hundreds of thousands of external customers who need access to our sensitive data too, and I can't control their endpoints or provide them new ones. (I can't even buy them SecurID tokens.)
    C. The fact that humans still want to possess data. They want to fondle it, take it home on a USB drive and call it George. No matter how much bandwidth you promise them, they still want to take the data home with them. I even caught my frickin' BOSS doing it.
    I'm with you, Hoffster: I want to give everyone an adm3a and send them on their merry way.
    Failing that, I'd love to give them an encrypted, virtualized PC on a USB drive and make it disable everything else on the platform they plug it into while it's running. Can you get me about 1000 of those at gummint prices, please?

  2. January 10th, 2008 at 16:30 | #2

    @Shrdlu:
    To your points:
    A) I'll concede that browsers in general are vulnerable to all sorts of things when combined with users' bad behavior, but TONS of client-side apps leave residue too (including local caches, and you know you can force cache clearing in browsers anyway).
    You can cost-effectively implement multi-factor authentication that requires no s/w or h/w and is resistant to replay, which counters the authentication issues; and if the only data sent over the wire is graphical screen updates, then you're not caching anything of worth. (A minimal, purely illustrative sketch of that kind of second factor appears at the end of this comment.)
    This problem has been addressed in multiple products.
    Also, I'm going to put up a post regarding some of the investigation I've been doing into site-specific browsers, based on Andy Jacquith's work of late, which could also go toward a more secure browser platform.
    B) As I said, this argument was purposefully restricted to corporate users. I totally understand that there are companies that have relationships with folks that need access to the data, and while that's SUPER important, I was talking about improving the things you *can* improve.
    I'm not suggesting we close our eyes to the challenge you describe, but you're probably not within that 90% of companies that don't have that constraint.
    C) I have no argument there. Some corporate cultures can deal with this based upon the info. they traffic in, some can't. I'm just shocked that this solution isn't deployed more instead of this rampant end-node proliferation crap.
    As to your (unnamed) point D — I think we're getting closer to that dream. There are U3 drives that do an awfully good job of providing a more secure, self-contained environment.
    Good feedback.
    Ta.
    /Hoff
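
    A minimal, purely illustrative sketch of the kind of replay-resistant second factor referenced above, one that needs no client software or hardware: a random, single-use, short-lived code delivered out of band and consumed on first use. This is not any particular vendor's product; the names and policy values are hypothetical.

    ```python
    # Illustrative sketch (not any specific product) of a replay-resistant
    # second factor that needs no client software or hardware: the server
    # mints a random, single-use, short-lived code and delivers it out of
    # band (SMS, voice, e-mail).  Replaying a captured code gets you nothing,
    # because it is consumed on first use and expires quickly anyway.

    import hmac
    import secrets
    import time

    CODE_TTL_SECONDS = 120      # hypothetical policy: codes live for two minutes
    _pending = {}               # user -> (code, issued_at); use a real store in practice


    def issue_code(user: str) -> str:
        """Generate and record a one-time code, to be delivered out of band."""
        code = f"{secrets.randbelow(10**6):06d}"      # 6-digit random code
        _pending[user] = (code, time.time())
        # send_sms(user, code)  -- delivery channel is out of scope for this sketch
        return code


    def verify_code(user: str, submitted: str) -> bool:
        """Accept the code at most once, and only while it is still fresh."""
        record = _pending.pop(user, None)             # consume it: no replay
        if record is None:
            return False
        code, issued_at = record
        if time.time() - issued_at > CODE_TTL_SECONDS:
            return False
        return hmac.compare_digest(code, submitted)   # constant-time comparison
    ```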

  3. Andrew Yeomans
    January 11th, 2008 at 04:14 | #3

    What's preventing me from doing this today?
    a) The capital costs of setting up Citrix servers, converting apps to run correctly on them, and then paying for all the CALs mean this appears more expensive than running thick clients. Especially as there is a feeling that thick clients can do more than thin ones.
    Maybe this will change with lower-cost hardware. I was impressed that my Asus Eee (costing £187 + tax) could connect out of the box to Citrix remote access servers, no additional software required. Compare that with a default Windows installation, which needs additional software.
    b) The user experience of running thin apps is usually different from, and clumsier than, running local apps.
    Perhaps that's because many people have not had experience running a Unix/Linux X Windows desktop with fully network-transparent applications, so they don't know what is possible. Incidentally, X Windows can help overcome the objection that people like power in their hands, since you can offer supercomputer power at the click of an icon. And the speed is often terrific, as there's a high-bandwidth link between the data and the application processor.
    c) The need for local facilities such as printing, data transfer, multi-media. Can be overcome, but still a bit clunky.
    d) Off-line working ability. Or the perceived need to do it. But as Blackberries/smartphones become pervasive, coupled with the Hotmail/Gmail generation coming into decision-making positions, maybe people won't feel the need to have local email. [One day soon I'll migrate my POP3 local email to IMAP4, and cut the need for local storage.]
    e) Lack of "it just works" and works fast enough in mobile systems. I'm still looking for that secure U3/PortableApps/cut-down Linux+browser USB stick that gives a secure full-screen thin-client environment. Picking up network connection details and hardware info from whatever was on the hard disk, auto-selection of screen resolution, optional selection of qemu to run as VM or on PowerPC mac hardware, no need to boot a 200MB image, etc. Suspect I'll have to roll my own.
    I'm with you too, Chris. We don't need to worry about the data that never escapes. But how can we overcome that whole industry leeching off the mess?

  4. Arthur
    January 11th, 2008 at 05:53 | #4

    My previous employer used thin clients initially for centralizing development onto reusable virtual servers and then for outsourcing as well. We used MS Virtual Server and the standard terminal services client and all in all it was awesome. It did take a bit of getting used to for folks, but the cost savings were huge, in terms of support. Patching, general operations and asset management were all much easier and costs plummeted. Plus our risk of data leakage (both accidental and intentional) went way down. I wish we could have done more of it.

  5. January 11th, 2008 at 06:09 | #5

    @Arthur:
    What were some of the difficulties/challenges?
    /Hoff

  6. Rich Moffitt
    January 11th, 2008 at 07:33 | #6

    I liked this post — I'm a big fan of remote desktop sessions to get work done, especially when my laptop doesn't have the power to work on large amounts of data anyway.
    At the same time, I'd have to agree with Andrew's points. Remote application sharing works well, but not for everything. I think we're in a way "spoiled" by having the data stored on powerful endpoint machines. Sure, it can make data management and de-duplication a major pain if things aren't planned right, but people seem to deal with that sort of thing more easily than tolerating UI refresh times.
    @shrdlu: With regard to point B, I guess it really matters what kind of access they need to that data. You could possibly assign them SecurID software tokens and expire them on the server, and provide a web / remote application interface to view and edit data. And even then it might be much harder to work with than having the actual files. Plus, setting that up means more effort and up-front cost, and you're still exposing sensitive data in some form or another to systems out of your control.
    By the way, how did we come to the assumption that 90% of companies don't need to share sensitive documents? (I really don't know what the right number would be, but it sounds a bit optimistic to me.)
    Regards,
    an ex-lurker 🙂

  7. January 11th, 2008 at 09:36 | #7

    @Rich aka erstwhile lurker 😉
    In my case, the customers sometimes need to browse all sensitive data (not just their own), and upload sensitive data as well as download it into local applications. And I can't afford to buy SecurID for that many people anyway. 🙁

  8. January 11th, 2008 at 10:08 | #8

    @Rich / @Shrdlu:
    I didn't mean to suggest that 90% of corporations don't need to share data…that's not what the initial post said. If that's how it came across, I should clarify that better.
    I'm also curious as to why/how we only seem to be talking about Citrix here.
    Nobody heard of VDI? http://www.vmware.com/products/vdi/
    VDI (and like technology solutions from other vendors) represents the intersection of the datacenter virtualization of servers and extends those capabilities to the client/presentation layer…
    …more coming.
    /Hoff

  9. January 11th, 2008 at 11:25 | #9

    Great post and good comments and feedback.
    This is the single reason that Citrix is over a billion-dollar company right now. The economics of centralization are very, very compelling. Combine that with a more efficient data center via virtualization and you get a positive double-whammy.
    That being said, there is a lot of inertia for keeping data on laptops. A LOT of inertia. It's hard to get people to think differently about how to architect their applications. So they don't.
    There is also a bandwidth problem in a lot of places (including the US). EVDO is getting more prevalent, but there are still places where connectivity sucks. And that can impact business.
    But this is definitely the trend. Long live terminal-host!

  10. Arthur
    January 11th, 2008 at 12:15 | #10

    @Hoff,
    Our biggest challenge? The developers. They were very used to having all the data on their local machines and were convinced they couldn't possibly work as effectively on remote machines, so we had lots of pushback from the engineering organization. We put together a pilot group and showed them how to use their thin-client software in full-screen mode, and that the machines they were remotely accessing were in fact faster and had more memory/disk available. They also liked the fact that they got serviced faster when there were issues, since there was 24/7 support staff on-site at the data center. Pretty soon groups were coming to IT asking to be moved to thin clients.
    We used the same strategy to move folks to virtual servers. When they found out we could provision a new virtual machine in an hour, versus days to weeks for a new physical machine depending on whether new hardware needed to be acquired, it didn't take much to get people to switch.

  11. January 14th, 2008 at 07:26 | #11

    We're already moving a good percentage of our workforce over to Terminal Services clients. We're totally sick of having data floating everywhere, which really means copies of files on various drives and devices. If a file chock-full of information exists as a discrete file, someone will want it and refer to it as that discrete file, not as an ephemeral grouping of data that resides only on a server. The short version, as mentioned above: people take files home and copy them.
    This follows because customer-to-vendor information almost always has to be in a discrete file or database to be transferred. If it's discrete once, it'll always have a discrete copy.
    We're implementing DLP, laptop encryption, and other technologies to protect against the real problem: data inertia. And rather than rely on people to change (they won't), we're forcing the server/client model by moving people over to it as much as possible so that we can devalue all those mobile and endpoint devices.
