Category: News

  • RedHat gets hit this time…

    It just goes to show, if you think you’re safe, you’re not. This time RedHat was hit:

    http://blogs.zdnet.com/security/?p=1784&tag=nl.e550

    This is pretty ugly since it involves the keys used to sign and validate the RPM repositories and the RPMs themselves. RedHat claims that the passphrases for the signing keys weren’t compromised, so no harm, no foul. It is still very concerning, however, and sufficient mitigation may require manual intervention by all users, or at least changes on all users’ systems.

    The problem here is that if RedHat is wrong, forged RPMs could be created that appear “valid” and, if installed, could infect customer systems, compromising binaries and more. It would take quite a bit of effort, including getting the RPMs into the repositories without anyone noticing, but it is not out of the realm of possibility, particularly when you consider what this hack in itself says about the state of security.
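    As a practical aside, RPM’s built-in signature checking is exactly the mechanism at stake here. A hedged sketch of the commands involved (the package filename is illustrative, not from the article):

    ```shell
    # List the GPG public keys rpm currently trusts (imported vendor keys):
    rpm -qa gpg-pubkey

    # Verify the digests and GPG signature on a downloaded package
    # (filename is illustrative):
    rpm --checksig some-package-1.0-1.i386.rpm

    # If the package was tampered with, or was signed by a key rpm does
    # not trust, the output flags the failure instead of reporting "OK".
    ```

    The point being: this check is only as good as the secrecy of the signing key, which is why the breach above matters.
    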


  • Brilliant article with ex-Hannaford CIO

    StorefrontBacktalk has a short but brilliant article with Bill Homa, former CIO of the Hannaford grocery chain, which suffered a major breach of credit card data:

    http://storefrontbacktalk.com/story/071108homa

    There are three particular points that stand out:

    1. That Microsoft is still so hole-ridden as to put your company at additional risk.
    2. That PCI is still not sufficiently strong.
    3. That a security posture based only on perimeter defense is ultimately fallacious.

    In my experience, PCI (also called CISP or PCI DSS), while certainly better than nothing, still falls well below what is necessary to protect confidential customer data. Furthermore, certain components of the credit card processing stream require less than ideal levels of encryption (I’m being generous here), providing simplified points of collection and attack for hackers (to note, there are plans to improve this).

    With regard to depending on “perimeter defense”, this quote particularly stands out:

    “Most retailers have the philosophy of keeping people out of their network. It’s impossible to keep people out of your network. There are bad people out there. How do I limit the damage they can do? If you don’t do that, they’ll have free reign to do whatever they want.”

    However, I hardly think this mentality is limited to retailers. In my discussions with numerous peers in the computing industry, many shops, large and small, retail or non-retail, are afflicted with this mentality. In fact, I would consider it pervasive – “keep the intruders out and you’ll be ok.”

    But the honest truth is that you can never keep them out; like a game of chess, every day some new hole is found to subvert your external protections. Nor, for that matter, should you blindly trust your own employees, who are ultimately one of the largest sources of data compromise, and they are already on the inside.

    The answer is “defense in depth”: layers of security, some strong, some weaker, some on the perimeter, some on the host, some in the software tools themselves, with the sum total providing sufficient security for the value of the asset(s) being protected (based on “risk analysis”).

    Until corporations start thinking this way, we can expect to see breaches like Hannaford’s continue for some time.


  • Pretty cool Google tool…

    I’m not sure how useful this would actually be in practice, but this “Goosh” or “Google Shell” is a pretty neat trick:

    http://goosh.org

    which is an unofficial command-line interface to Google.

    For those of us Unix types, it’s fun to see it presented this way, though as noted I’m not sure how useful it really is…


  • BIND DNS “replacement” released

    NLnet Labs, VeriSign, Nominet, and Kirei have announced the release of Unbound, a new recursive DNS server that could potentially replace ISC’s BIND:

    http://www.unbound.net/

    Built from the ground up, it’s supposed to be faster and more secure, supporting DNSSEC validation out of the box.
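    For the curious, a minimal caching-resolver configuration is only a few lines. A hedged sketch of an unbound.conf (the trust-anchor path is illustrative; use whatever your installation provides):

    ```
    # Minimal unbound.conf sketch: a local caching resolver with DNSSEC
    # validation enabled. Paths and addresses are illustrative.
    server:
        interface: 127.0.0.1
        access-control: 127.0.0.0/8 allow
        # DNSSEC validation requires a configured trust anchor:
        trust-anchor-file: "/etc/unbound/root.key"
    ```
    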

    Of course, as with all new software, it remains to be seen whether major flaws or holes will be found in it. This is where older software typically has an advantage, despite any claims otherwise, since “trial by fire” is usually where the major issues are found. At the organizations where I hold sway, we will probably delay any possible implementation, letting others work through the issues first.


  • Another one bites the dust…

    It’s a little sad; yet another Linux holdout has fallen to the Microsoft world:

    http://news.bbc.co.uk/2/hi/technology/7402365.stm

    In some sense it’s probably a good thing – much as I like Linux, the world runs on Microsoft, and getting used to its interface is probably more useful to children than Linux would be. On the other hand, a lot of servers run Linux, and a Linux laptop would offer many more opportunities to learn how to maintain those systems and possibly to program. I could theoretically see “gcc” installed on a “100 Dollar Laptop”, but Visual Studio seems out of the question.

    It’s a bit sad that Microsoft can’t leave at least one market alone, and it’s hard to believe this isn’t “monopolistic” in some sense. Anyway, here’s another related post:

    http://blogs.zdnet.com/computers/?p=170&tag=nl.e589

    I realize it’s the way of things, but it does seem to close a lot of doors.