
Loose Lips Sink Ships But They Also Float Boats…

[Image: mouth tape]
I’m going to play devil’s advocate again as I ponder a point.  Roll with me here.  I’m slightly conflicted.

Jeff Hayes blogged about an interesting encounter he had in a sports bar with the head of physical security for an international accounting firm.  It turns out that as part of a casual conversation, this person disclosed some very interesting facts about his company’s security:

It turns out this guy handles physical security for a major international accounting firm. He travels around North America doing premises and access control assessments and deployments. He described to me, without me asking specific questions, the technology they use, the problems they deal with including the push-back they get from each office complaining about burdensome security, their budgets, his working environment, how he moved up the company ladder and his qualifications or lack thereof, and a number of other tidbits that would prove valuable to anyone doing surveillance.

It would appear that this guy had one too many, and the level of detail he disclosed seems excessive.  Jeff’s point about confidence and accelerated reconnaissance for targeted profiling seems quite relevant in this scenario.  This person was being reckless and was potentially endangering his company.

However, let’s look at this a little differently to illustrate a counterpoint.

This encounter sounds like what many of us read and talk about, under the guise of non-attribution, at the many security forums and "professional" security gatherings we attend and participate in with our "peers."  You know, the ones where we all sit around hoping that the badges actually mean the organizers have appropriately vetted and authenticated that the person wearing one is who they say they are…

Moreover, it sounds a lot like the conversations at the bar after said forum roundtables.  We share our collective experiences in order to gain insight and intelligence so we can improve our security posture, accelerate the short-listing of vendors and avoid mistakes by learning from others.

How about those Visio diagrams you show on the whiteboard to VARs when they send their SEs in for work and pitches?

It gets even more interesting when CISOs/CSOs (like me) talk to the press and do case studies describing the technologies and solutions they’ve deployed.  Some CISOs don’t mind doing so after making a tactical, risk-based decision that what they reveal does not adversely expose the company.  Others simply don’t talk at all about what they do.

I understand there exists the potential that, by disclosing that you use vendor ABC or technology XYZ, someone could exploit that knowledge for malicious purposes.  I suppose this is where the fuzzy area (I’m sorry, Mr. Hutton!) of thin-slicing and quickly assessing risk comes into play.  What is the likelihood that this information, when combined with a vulnerability (in policy, architecture or deployment) in the presence of a threat, might become a risk to my company?
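
To make that gut check concrete, here’s a toy sketch of the thin-slicing in code form.  The factors and the 0.0–1.0 estimates are mine and purely illustrative; this is not any formal risk methodology:

```python
# Toy back-of-the-envelope scoring of a disclosure decision.
# The factors and weights are made up for illustration only.

def disclosure_risk(usable_by_attacker, maps_to_weakness, threat_listening):
    """Each argument is a 0.0-1.0 gut-feel estimate.

    usable_by_attacker: chance the detail is actually actionable for an attacker
    maps_to_weakness:   degree to which the detail exposes a real vulnerability
    threat_listening:   chance someone hostile is actually targeting you
    """
    return usable_by_attacker * maps_to_weakness * threat_listening

# "I run vendor ABC's firewall" -- discoverable by fingerprinting anyway.
print(disclosure_risk(0.2, 0.1, 0.5))   # 0.01: probably fine to mention over a beer

# "Here's each office's push-back, the budget and the staffing gaps."
print(disclosure_risk(0.7, 0.8, 0.5))   # 0.28: keep that one to yourself
```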

I use Check Point NGX R65.  I run it on a Crossbeam X-Series.  It filters a bunch of packets.  I use Cisco routers.  Is that information you couldn’t have found out with a network scan, fingerprinting and enumeration?  Have I made your job of attacking me orders of magnitude easier?
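
For what it’s worth, here’s a minimal banner-grabbing sketch of the kind of fingerprinting I mean.  The hostname and port are placeholders, and you’d obviously only point it at systems you’re authorized to probe:

```python
#!/usr/bin/env python3
"""Minimal banner grab: many services (SSH, SMTP, FTP) volunteer a vendor and
version string the moment you connect, no loose lips required."""

import socket

def grab_banner(host: str, port: int, timeout: float = 3.0) -> str:
    # Connect and read whatever greeting the service sends unprompted.
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.settimeout(timeout)
        try:
            return sock.recv(1024).decode(errors="replace").strip()
        except socket.timeout:
            return ""

if __name__ == "__main__":
    # Placeholder target -- substitute a host you own or are authorized to test.
    print(grab_banner("gateway.example.com", 22))  # e.g. "SSH-2.0-..." names the platform
```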

Ah, the slippery slope is claiming me as a victim…

Have you seen the Military Channel?  I watched several fantastic Navy/Marine-sponsored documentaries on carriers, next-gen APCs, new weapons systems…all of which are deployed.  Is Al Qaeda now in a more advantageous position because they know how the desalination plant on a fast frigate functions?

Everyone in a company is both a sales and marketing rep as well as a potential security breach waiting to happen. Most businesses like people to present their company in a good light. We want people to know that we work for a good employer. What we don’t want people to do is to tell others how crappy our employer is. Likewise, we probably don’t want our security personnel describing the details of our security systems, policies and procedures.

So Jeff’s right, but I guess that depends upon the level of "details" he’s referring to?  Is Jeff’s point still valid when we’re talking about a breakfast conversation at an InfraGard meeting?  How about the forums over at SecurityCatalyst.com?  There’s that trust-and-judgment factor again.  How about an ISAC gathering?  Aren’t we all supposed to share knowledge so we can help one another?

Where do we draw the line as to who gets to say what and to whom?  Those policies either have to get really fuzzy or very, very black and white…which goes to Jeff’s point:

Loose lips have been known to sink ships; they can also hurt organizations.

Yes, they have.  They’ve also been known, when loosened with a modicum of restraint, to float the boat of someone whose time, energy and budget you’ve been able to save by sharing relevant experience.  Let’s be careful not to throw the baby out with the bilge water.

So, how do you establish "trust" and assess risk before you talk about your experience with technology you’ve deployed or are thinking about deploying?  What about policies and procedures?  How about lessons learned?

Obviously anybody who answers is not a true "security guy" 😉

/Hoff

Categories: General Rants & Raves
  1. October 13th, 2007 at 17:27 | #1

    First, I never claimed to be a true "security guy"- I'm just a wannabe, so I'm comfortable answering…
    Bottom line, sharing info is more likely to solve problems than create them. Even in a past life as a mechanic, I found the bull sessions after tech training and seminars were often more valuable than the material officially covered. Outside of formal settings we get to tell the truth and react to the lies we've just heard.
    As far as the data leakage goes, most of the info you are likely to hear could be discovered by any reasonably competent attacker (or curious bystander). Is your company publicly traded? Does it have government contracts (or bid on them)? Has it ever been involved in a court case? Had any real estate transactions? Does it pay taxes? All of these lead to some publication of "internal" data. I'll put almost any of these confidential conversations up against Johnny Long, his cell-phone camera and Google; you probably couldn't talk fast enough to leak what Johnny could find. Throw in Facebook, MySpace and LinkedIn, and you get a pretty good picture of the players, too.
    Sure, you can be too candid with information and you can share with the wrong folks, but (still under the delusion that it exists in production) "a little common sense" is probably good enough. I think most people who deal in security develop a kind of "reputation-based firewall" for personal interaction that serves them well. You asked about the impact that venue has on discretion; that is just one factor. I think the mental firewall also includes a web-of-trust function: connections are made between those you know and those you don't, and trust is inferred and transferred (as is mistrust).
    So, the guy in the sports bar is an extreme case, and he needs to apply a service pack to his common sense. Drunks do dumb things; this is new how?

  2. October 13th, 2007 at 19:34 | #2

    Jack:
    There you go with that "common sense" thing again 😉
    BeanSec! wouldn't be worth attending if we didn't have a free exchange of information. Even with a beer or two in folks, I don't think we've had a breach of trust, etiquette or manners yet…
    OK, that last one was a bit of a reach…
    /Hoff
