
Breaking News: Successful SCADA Attack Confirmed – Mogull Is pwned!

December 13th, 2007

[Image: SCADA]
A couple of weeks ago, right after I wrote my two sets of 2008 (in)security predictions (here and here), Mogull informed me that he was penning an article for Dark Reading on how security predictions are useless.  He even sent me a rough draft to rub it in.

His Dark Reading article is titled "The Perils of Predictions – and Predicting Peril," which you can read here.  The part I liked best was, of course, the multiple mentions that some idiot was going to predict an attack on SCADA infrastructure:


Oh, and there is one specific prediction I’ll make for next year:
Someone will predict a successful SCADA attack, and it won’t happen.
Until it does.

So, I’m obviously guilty as charged.  Yup, I predicted it.  Yup, I think it will happen.

In fact, it already has…

You see, Mogull is a huge geek who has invested large sums of money in his new home and outfitted it with a complete home automation system.  In reality, this home automation system is basically just a scaled-down version of a SCADA (Supervisory Control and Data Acquisition) system: it controls sensors and integrates telemetry with centralized reporting and control.
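Conceptually, the whole thing boils down to a supervisory loop something like the sketch below (device names, values, and thresholds are invented purely for illustration; this is not Rich's actual setup):

    import random  # stands in for real sensor/actuator I/O in this sketch
    import time

    # Hypothetical "points" a home SCADA-ish controller might supervise.
    SENSORS = {
        "hot_tub_temp_f": lambda: 73 + random.uniform(-1, 1),
        "garage_door": lambda: random.choice(["open", "closed"]),
    }

    def supervisory_pass():
        """One pass: acquire data, report centrally, issue supervisory commands."""
        readings = {name: read() for name, read in SENSORS.items()}
        print("central log:", readings)               # centralized reporting
        if readings["hot_tub_temp_f"] < 95:           # supervisory decision
            print("command: raise hot tub setpoint")  # telemetry-driven control

    for _ in range(3):  # a real controller would loop forever
        supervisory_pass()
        time.sleep(1)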

Rich and I are always IM’ing and emailing one another, so a few days ago, before Rich left town for an international junket, I sent him a little email asking him to review something I was working on.  The email contained a link to my "trusted" website.

The page I sent him to was actually trojaned with the 0day PoC code for the QuickTime (QT) RTSP vulnerability from a couple of weeks ago.  I guess Rich’s Leopard ipfw rules need to be modified, because right after he opened it, the trojan executed and phoned home (to me), and I was able to open a remote shell on TCP/554 right to his Mac, which incidentally controls his home automation system.  I totally pwn his house.
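For what it's worth, the kind of host-side check that would have flagged this is trivial. A minimal, purely illustrative sketch (TCP/554 being the port used above):

    import socket

    # Illustrative check only: is anything on this host accepting connections
    # on TCP/554 (RTSP)? On a Mac with no streaming server running, an open
    # 554 is worth investigating.
    def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            return s.connect_ex((host, port)) == 0

    if __name__ == "__main__":
        status = "OPEN -- investigate" if port_is_open("127.0.0.1", 554) else "closed"
        print("TCP/554 is", status)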

[Image: CCTV]
So a couple of days ago, Rich went out of town and I waited patiently for the DR article to post.  Now that it’s up, I have exacted my revenge.

I must say that Rich’s choice of automation controllers was top-shelf, but I might have gone with a better hot tub controller because I seem to have confused it and now it will only heat to 73 degrees.

I also think he should have gone with better carpet.

I’m pretty sure his wife is going absolutely bonkers given the fact that the lights in the den keep blinking to the beat of a Lionel Richie song and the garage door opener keeps trying to attack the gardener.  I’ll have you know that I’m being a gentleman and not peeking at the CCTV images…much.

Let this be a lesson to you all.  When it comes to predicting SCADA attacks, don’t hassle the Hoff!

/Hoff

Categories: Punditry
  1. Ted J
    December 13th, 2007 at 11:50 | #1

    Hoff, you've outdone yourself – this is classic.

  2. December 13th, 2007 at 12:30 | #2

    Chris, you're a jackass, and I say that with the utmost respect. I'm interested to hear Rich's response on the next podcast, if it's something we can publish in a PG podcast.

  3. December 13th, 2007 at 13:06 | #3

    Sweeeeeeeet. Just goes to show you, security professionals cannot trust their friends. 🙂 Not completely anyway.
    Go forth and do good things,
    Don C. Weber

  4. Serge Maskalik
    December 13th, 2007 at 16:12 | #4

    Chris – agreed on the SCADA risk; top-of-mind issue for lots of oil/gas folks…

  5. December 13th, 2007 at 16:35 | #5

    Rich, what part of Arizona DO you live in?? That doesn't resemble AZ at all… Unless…..

  6. December 13th, 2007 at 16:47 | #6

    @marcin:
    I could send you the lovely image of his mailbox/address since he has a webcam aimed at it (not shown in the picture) but that would be cruel.
    He's on his way back from the UK now and obviously didn't see this post when he posted this one on his blog announcing his "smart house" was malfunctioning….buahahahaha… http://securosis.com/2007/12/13/off-topic-argh-sm
    Something tells me I'm going to pay for this in about 18 hours…
    Ah well, all's fair in love and blogging.
    /Hoff

  7. December 13th, 2007 at 17:56 | #7

    that's just awesome, can't wait till you can do that with a fully networked IPv6 home, should lead to some good fun

  8. December 14th, 2007 at 06:33 | #8

    Nice work, Hoff… always good to pwn your detractors.
    /me goes to check my blog for any comments critical of Hoff…

  9. December 14th, 2007 at 06:42 | #9

    Don't hassle the Hoff…

    Okay, so I've been seriously remiss in posting for, oh, say the last six months, but I couldn't resist mentioning this one.
    Recently, Mogull posted an article on Dark Reading decrying the nature of security predictions. Shortly after writing the art…

  10. December 14th, 2007 at 07:29 | #10

    I agree with Andy, I think this is an important story illustrating an easy and common means to pop someone, even a security professional. Let alone all the stuff you can pwn in his house as things become more connected.
    While I think this is funny and a good lesson for us, technically, what you did is still illegal, so I wouldn't suggest people just start doing this to other people. Friends, sure, you can get away with it, but don't do this at home, kids.
    (Of note, it's nice to see a security pro with some chops to DO things…too often I see a lot of talkers who couldn't change a firewall rule manually if they were put on the spot…)
    I was just last night thinking about security and home users and how, in the past, it was not necessarily that easy to target someone at home. Let's say a friend of mine from work pissed me off. Could I track him down to his home IP and get behind his perimeter? In the past, this took some work and luck with his open services, but these days circumventing all of that stuff through the browser has made such targeted attacks extremely potent and viable. Sure, such attacks have existed in the past, but they're no longer the realm of only the uber criminals. Random people on MySpace can do this stuff now…

  11. December 14th, 2007 at 08:48 | #11

    Illegal!? Gosh. Are you sure?
    I invited him to participate willingly in a service I was offering. I disclaimed it was a security research application and would he mind helping me review it by going to my website.
    He said yes.
    It's not like I told him I was the deposed Minister of Culture for the Congo or sent him ads for viagra or something!
    😉

  12. December 14th, 2007 at 09:21 | #12

    This is why command and control or sensitive data needs to be on a separate network that has limited (if any) access to the Internet. If Internet access is needed for sending alerts, put a one-way firewall in front of it where only SMTP traffic is allowed outbound (see the sketch below).
    I can't wait to see the fallout over this, I like the benign nature of the hack^H^H^H^H security research.
    — Tim Krabec
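    A minimal sketch of that outbound-only alerting path (the relay host and addresses are placeholders, just to show the shape of it, not a real deployment):

        import smtplib
        from email.message import EmailMessage

        # Sketch only: on a segmented control network where the firewall permits
        # nothing outbound except SMTP to a single relay, alerts can still get out.
        # Hostnames and addresses below are hypothetical.
        msg = EmailMessage()
        msg["From"] = "scada-alerts@plant.example"
        msg["To"] = "ops-oncall@example.com"
        msg["Subject"] = "ALERT: unexpected setpoint change"
        msg.set_content("Hot tub setpoint changed to 73F by an unrecognized session.")

        with smtplib.SMTP("mail-relay.plant.example", 25, timeout=10) as relay:
            relay.send_message(msg)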

  13. Jake Brodsky
    December 14th, 2007 at 10:03 | #13

    Congratulations, you've proven the obvious. You've shown the world that an exposed SCADA system is a dead SCADA system.
    Now a word from those of us who really perpetrate and run these things. Connection to local office networks is almost a certainty in most SCADA systems. The question is how well it is secured.
    Can a red team get through? Sure. Would we notice? Eventually. One thing that operators of our distribution system know is this: They know what a typical day looks like. They know what commands they would usually issue at a given time of day given the conditions they're in.
    Sooner or later, someone would see something they knew damned well they didn't do. You'd have to have exquisite knowledge of the distribution system's behavior before you could successfully spoof an operator into thinking everything is normal. I'm not saying such knowledge collection is impossible. But I am saying that this is a lot of effort: so much effort that, among the likely attackers, most would choose a different attack vector.
    The only outsiders who might be interested in attacks and who might have the means and motive to carry out such an attack would be Red Teams. We could make a hyper-secure SCADA system. But the availability would drop, and the Operators would be very busy circumventing most of the security measures we'd install. It would make more economic sense to staff key distribution stations 24/7 than to try to develop a super-secure SCADA system.
    That said, I am one of many who actively pursue the obvious and straightforward things we can do to improve local security. It's a long, difficult thing to do. What you may not realize is that EVERY new piece of hardware or software needs to be carefully tested and validated. It's very expensive. Even then, we still run a risk that something unexpected may happen from a blown patch. I got that T-shirt, and wore it out. It's a rag now.
    This brings me to your earlier blog post concerning complexity: you can't rely on mental clarity in the wee hours of the morning. If you expect to run a 24-hour operation, you'd better keep control systems as simple as possible. I'm not saying that additional complexity is never warranted, but I would like to point out that foisting excess complexity on people who work the midnight shift could easily backfire.
    Please understand that this is very much a multidisciplinary field. Not only do you have all the considerations for SCADA security, you also have engineering concerns, economic concerns, human factors, and so on. It's easy to make cute demonstrations. It's much harder to build a workable, reasonably secure system.

  14. December 14th, 2007 at 10:08 | #14

    I usually spar with Jake in various SCADA security forums, but I agree. Silly PoC attacks against non-production systems have been in vogue for the last five years.
    Hoff, stay away from SCADA if you want to maintain your credibility on other topics.

  15. December 14th, 2007 at 10:59 | #15

    That is hilarious.

  16. December 14th, 2007 at 19:28 | #16

    @Matt:
    Take a deep breath. It's apparent something's screwed on just a little too tightly over there.
    The simple fact that you and Jake are implying that you and the rest of the world have SCADA security all sewn up and it doesn't need any attention is exactly why it needs attention.
    I mean with levels of proactive precision like this precious quote from Jake:
    "Can a red team get through? Sure. Would we notice? Eventually."
    …I can see why we have nothing to worry about. (rolleyes)
    You're COMPLETELY missing the point of all of this. But since I now have to fear that my entire credibility is at stake, I'll have to re-evaluate my entire awareness campaign.
    Done.
    Thanks.
    /Hoff

  17. Jake Brodsky
    December 15th, 2007 at 07:47 | #17

    Oh gosh, where do I begin Chris?
    What do the first letters of SCADA stand for? Supervisory Control.
    A real SCADA system doesn't issue direct controls. It issues Supervisory Controls. There should be no time-critical control loops in SCADA. In other words, we have vulnerabilities. But they won't destroy anything right away. We engineers know better than to trust complex software.
    Most good design practice is based upon graceful degradation. In other words, we don't send a command to open a valve. We send commands to change the pressure differential setpoint. A local controller takes care of the rest. There are sanity checks in the local controller (see the sketch below).
    You could send commands to the field that would screw things up. But most people would notice and we'd take action. Keep in mind that while our operation is very careful and deliberate, the distribution system was built for some wild extremes including pipe breaks, extreme weather, communication outages, and vandalism. A successful attack would require intimate knowledge of where the real vulnerabilities are.
    Are you an expert at water utilities too?
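    A minimal sketch of the kind of local-controller sanity check described above (the limits, units, and names are illustrative assumptions, not any real controller's logic):

        # Illustrative only: a supervisory setpoint request is clamped and
        # rate-limited locally before it ever touches the process.
        SETPOINT_MIN_PSI = 40.0   # hard process limits
        SETPOINT_MAX_PSI = 90.0
        MAX_STEP_PSI = 5.0        # largest change accepted per request

        def accept_setpoint(current_psi: float, requested_psi: float) -> float:
            """Return the setpoint the local controller will actually apply."""
            bounded = max(SETPOINT_MIN_PSI, min(SETPOINT_MAX_PSI, requested_psi))
            step = max(-MAX_STEP_PSI, min(MAX_STEP_PSI, bounded - current_psi))
            return current_psi + step

        # A bogus supervisory command asking for 500 psi moves the setpoint by only 5.
        print(accept_setpoint(current_psi=62.0, requested_psi=500.0))  # -> 67.0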

  18. The easter bunny
    December 15th, 2007 at 08:28 | #18

    Oooh, I'm so relieved to know that there "should be" no time-critical controls. There also "should be" no fucking stack-based buffer overflows in 2007. In other words, we probably have both.
    The idea that operations in energy are careful and deliberate was proven wrong by the blackout of the East Coast in Aug 2003.
    Of course, Jake, I respect your authority as a SCADA expert. Please respect mine in shit that ain't real.

  19. Jake Brodsky
    December 15th, 2007 at 09:29 | #19

    You construct a straw man argument by pwning your buddy's demonstration SCADA system and then claim to know what "shit ain't real"?
    We could really use the help of someone like you. But if you cite trivial demonstrations like this, nobody will take it seriously. Just ask those nice folks at INL what the fallout of the Aurora exercise was like. They put a hell of a lot more realism in their exercise than you just did.
    There aren't many who pursue the field of SCADA security. It demands a lot of experience on both the software and the engineering side of the fence. You need to know a lot about standards that are already in place as well as the gaps that need to be filled.
    Ask Matt what his experience was like. Yes, there are some really egregious flaws in this business. We need better standards, better system software, better network protocols, better engineering guidelines, improved flaw reporting, and so forth.
    You want to poke holes in a cheesy demonstration? Enjoy yourself. But I'm no more impressed with this than I would be from some kid who spray paints a water tank.

  20. The easter bunny
    December 15th, 2007 at 09:46 | #20

    I've talked to Aaron about the Aurora stuff.
    As the easter bunny, I'm trying to make fun of you. I'm glad to hear you want participation and help. I know it's rough. But coming in and claiming that demonstrations are "trivial" or "obvious," … sometimes obvious is a fine counterpoint to arrogance and proclamation.
    Other times, you have to hop around in a fuzzy pink suit dropping a trail of eggs and talking in a pipsqueak voice.. Like a fucking three day blackout.

  21. windexh8er
    December 15th, 2007 at 11:00 | #21

    Chris, you are obviously full of yourself — but for some reason I like your style. However, a part of me hopes that ISC2 revokes your CISSP for blatantly breaking your signed code of ethics, as an example. Full disclosure is great, and I think that Rich is a bit of a hypocrite with regards to security (i.e., I have noticed he doesn't exactly practice what he preaches – and this really proves it). On one hand I like the fact that you have a strong technical ability with regards to security; most "podcasters" don't. Case in point: Rich M, Martin M, and Larry A are all full of compliance and paper security. I mean, really — WRT54G hacking? Give me a break. Professionals don't do courses like that — high school teachers do.
    Back to ISC2, might want to read the code of ethics again buddy: https://www.isc2.org/cgi/content.cgi?category=12
    You could have done this a little more gracefully. But, it's hilarious regardless.
    In the end — I've contributed to the rule sets for IPFW and I'm not exactly sure even the ruleset that is posted would have helped him. It's sad that a computer professional doesn't have better network bearings!

  22. December 15th, 2007 at 12:15 | #22

    Attention all zealots and crusaders.
    While I am *not* the Easter Bunny (I wish I could write so well…) I am obviously the Anti-Christ or at least a good rendition thereof.
    However, since this is getting to that inflection point where people can't read between the lines and start to go off the deep end…
    Just to let you all know, Rich and I concocted this as an awareness campaign to address two things:
    1) Reinforce the point that most of you got, which is to be careful when you go "clicky" on things, even when you think it's friendly, and
    2) Bring to the surface the fact that systems like SCADA, which continue to be described as "secure" or "hard to hack" based on security by obscurity (because *nobody* knows *anything* about them…), are, in reality, just another stack-connected click away from silliness.
    …and I've participated in enough assessments of utility providers, as well as security engineering design meetings, to suggest that despite all the good posturing above, crappy deployments still exist.
    I think it's hysterical that lots of people think of "attack" and immediately equate it with busting down the front door and owning a network and its assets. Sometimes the prize of an attack is recon and intel. Sometimes it's not about hostile takeover. Sometimes it's about leverage and information.
    …but I'm not a water utility expert so I couldn't possibly fathom (pun intended) that, now could I?
    This "cute" little fictional hack merely underscored the facts we all know but seem to forget.
    Finally, thank you all for your concern regarding my credibility and CISSP credential. It's great to know you're looking out for me.
    Carry on.
    /Hoff
    (P.S. Damnit Marcin, you almost blew the gag…no, that's not Rich's house in Arizona…you get 10 points.)

  23. windexh8er
    December 15th, 2007 at 12:49 | #23

    Honestly, if this little game was truly planned, I still think it represents a lack of credibility on everyone who was playing. It does show a lack of ethics, and as Chris is very good at manipulation through strong verbiage, I now see it all as a sad ploy. Lame if you ask me. A sad "trust no one" soap opera.
    Kudos if you "got me". Too bad Rich had to look like the retarded security guru though.

  24. December 15th, 2007 at 13:00 | #24

    @windexh8er:
    It went from hilarious to sad/lame all in the span of one post!?
    Darn.
    Go ahead and read Rich's post on the matter here: http://securosis.com/2007/12/15/end-of-year-humor
    Since you wound it up to the high moral ground of 'ethics' and turned it from fun to 'ass control' we decided to can it before you called the Feds.
    😉

  25. December 15th, 2007 at 13:41 | #25

    If nothing else comes of this, at least my faith in the Easter Bunny has been restored.

  26. windexh8er
    December 15th, 2007 at 14:15 | #26

    Because if it actually was the case it would be an issue of ethics.
    Maybe you misunderstood me. It was hilarious, when it was plausible, with regards to your target. Now that it's known that it was a game you and Rich were playing, to me, it's lame. In a lot of ways it seems beneath you. Then again, you're the one that's easily wound up and goes off the handle, not me. I was stating the obvious — if it were real, it would have been an ethics issue as well as illegal.
    I'm just confused as to why you thought you wouldn't get this feedback at some level from the audience you preach to. I just can't wait until April 1st, that's for sure!

  27. The easter bunny
    December 15th, 2007 at 16:21 | #27

    Dear Jack Daniel,
    I feel the same way about you. At least when you're being a gentleman.

  28. December 15th, 2007 at 23:45 | #28

    Hoff,
    I appreciate the intent of what you're showing here. You would be amazed at how many very high-profile and dangerous industrial sites are controlled by systems that have egregious security flaws – we're talking plaintext instructions with no password protection that can be interfaced by serial port or sometimes over a "control LAN", which is most often necessarily connected to the business network. Worse yet, it's always production (availability) that is KING, so you end up with bad practices like shared passwords and a total disregard for security as the norm. In my experience at many plants, large and small, in a variety of industries, if you walk around like you know what you're doing with a hard hat and a laptop you can do what you want. I've worked with many industrial consultants (Integrators/PLC Programmers/etc) and the consensus is "security by obscurity". How could you possibly do damage if you don't know which register shuts down the plant? What, they keep documentation on hand? Well then, how could you possibly do damage if you're not a waste water engineer? Please… It doesn't take a genius to figure out how nasty an ammonia leak would be – for example.
    IMO a huge issue has grown with the evolution of modern control systems (SCADA, HMI, DCS, etc). It used to be PLCs wired directly to devices. Networks were proprietary with respect to computing and closed. Ever heard of "blue hose"? DH+, DH-485? Probably not – that stuff was expensive and slow. As everything began to shift to IP-based networks you'd see "industrial grade" hubs with isolated networks – fine. Well, at the same time industrial software has sucked big time – think COM-based communication models that don't include anything in the way of security. IT departments would get themselves in trouble by doing benign things like Windows updates or service pack installs on "controls" computers, breaking the fickle systems and leading to expensive downtime. Engineers of the mechanical and electrical type who know very little about computers think they're the experts on networking and computer security. What do IT and computing experts know? All they do is break the *different* industrial systems. Roll this a few years forward to IP-based systems that are necessarily distributed. The boss now expects to see real-time production data and reports from home – and not just his plant, all of them. IT and engineering have to work together. IT sees hubs, unpatched machines, and IP addresses like 1.2.3.4. The control guys think IT slows everything down, doesn't understand their systems, and might break them.
    These blog posts by Walt Boyes, Editor in Chief of Control Global magazine, illustrate my point. One is about "control systems being different" and the other that "control systems aren't a sub-discipline of computer science". I generally agree with Walt, but feel that standard network and computer security too often gets overlooked in the industrial space. Also, the security-by-obscurity argument comes up too often as a defense of bad practices by professionals in other fields. http://www.controlglobal.com/soundoff/?p=1483 http://www.controlglobal.com/soundoff/?p=1482


  30. Jake Brodsky
    December 16th, 2007 at 09:30 | #30

    This is about professional arrogance.
    I've straddled IT, Control Systems, and Telecommunications for my career of more than 20 years. Believe me, I've known some incredible personality disorders. Some of them have even been mine.
    Walt's attitude comes from the problems we've seen when we try to place a knowledgeable person from the IT side of the fence with a knowledgeable control systems engineer. Both fields are very good at building some impressive Prima Donna attitudes that "They Know What's Best."
    You have to humble yourself first. You have to listen to what others are saying. They really do know what they're doing most of the time. What most don't realize is that the philosophies, priorities, and policies of most other applications and of control systems are nearly orthogonal to one another.
    For example, in the scheme of Confidentiality, Integrity, and Availability, most offices want to make sure Confidentiality is key, and Availability is not a major issue. It's exactly the reverse with control systems. Confidentiality is really not much of an issue, and Availability is everything.
    That's just a taste of some of the differences. So you put a dyed-in-the-wool network "expert" in the room with a Control Engineering "expert" and you'll be lucky if there isn't blood on the floor in five minutes. This is not "just another app." Network security measures must be carefully applied so that under no conditions will they ever saturate traffic. If we managed our networks the way most offices did, you'd have industrial disasters all over the place. Give us a bit of credit for knowing what we're doing.
    The reason this is even an issue is that those insipid glossy IT porn rags suggested years ago that a CEO might want to know how many widgets passed the label sensor minute by minute. It's a stupid idea, I know. Yet, it looked really cool, so it happened, despite the objections of many people like me.
    But back to the present: here are the problems with your demonstration:
    1) You used a Trojan launched through an IM (damned few SCADA systems have IM or any reason to use it); this is not a plausible attack. In fact, we get rid of most of the very biggest attack vectors by eliminating all e-mail access in a control system, and by turning off Autoruns everywhere. I know it's not impossible to get past this, but it does keep out most script kiddies.
    2) Once you were in, you presumed that you'd know what to do. Most control systems have thousands if not tens of thousands of points: quick, find the ones that matter.
    3) Many assume you can do deadly things with the control system right away. They fail to realize that there are auxiliary controllers and other embedded safeties. Note to software professionals: wherever possible, we avoid the use of software for safety purposes. I know some PLC manufacturers love to sell their gear as SIL-rated. However, if you think that anybody worth a PE certificate will casually stamp a set of plans using a networked PLC to effect safety, think again. I'm not saying it never happens, but there isn't nearly as much of it as most people would think.
    4) Past experience suggests that the far more common threat is from insiders. We need lightweight yet distributed identity and key management systems. Domain Controllers are commonly used, but they're hardly ideal. Remember, we're all about availability and distributed control. A DoS attack on a domain controller could really screw things up.
    5) QT is not commonly used in control systems. I hate to admit this, but Microsoft Windows is often the platform of choice. Here's where the ignorance and illusion of inexpensive software got the better of many control systems engineers. Microsoft has a complex security model. Most engineers have no idea how to manage it in their specific applications. Once again, the difference between a typical office application and a control system really makes a mess.
    Now if you want to make a real splash, pick a common SCADA subsystem and show how to secure it. Eric Byres and Dale Peterson did this with their three white papers on OPC. Or, if you prefer pen-testing, show how an application such as Industrial Defender's RTAP can be taken over.
    This is a steady, interesting source of work for many of us. There is plenty of work for many more. If you're interested in more than just pot shots at irrelevant targets, you'll be welcome. But you'll have to do better than this.

  31. January 13th, 2008 at 01:14 | #31

    This is a pretty funny story. Wish you had more detailed info and photos. A follow-up might be in order 😉
    Hani http://www.makingclicks.net
