Dear Security Team: You Suck!

Computer security isn't voodoo; it's part of computer science. The cornerstone of the scientific process is a repeatable experiment with verifiable results, and security should adopt this approach. First, measure the environment and establish goals. Next, test for cases where you can address issues to meet those goals. Develop a process for systematically addressing priorities, and a separate way to measure progress. Establish a periodic review so that you can evaluate your success (or lack thereof). If you can do this, then you're well on your way to establishing a mature, respected security organization that can demonstrably add value.

Goodbye Drupal

I finally moved my site off Drupal as a content presentation technology. This decision was the result of a number of factors, including Drupal's poor content management capabilities, the security implications of its massive code base, the fact that the administrative interface lives in the web root and is accessible globally, and the resource-intensive nature of the system, which was causing my site to crash.

Secure Web Application Penetration Testing

Engaging in a web application penetration test (pen test) is an increasingly common task for today's infosec professionals. Sadly, there's not a whole lot of guidance about how to begin this process with respect to basic logistics. A lot of testers simply set up a web server, install the software to test, and begin pounding away. This approach presents a number of issues, however. Careful consideration and a measured approach are better investments of time and effort.

User Interface is Security

As security exploits traverse up the OSI model to the application layer, exploits that manipulate the user interface are becoming more common. The root of this problem is that layer 8 (the human operator) is the ultimate security vulnerability. Programmers develop systems that can do amazing things, but they often fail to consider the "average user" when putting together their user interface (UI). This leads to a situation where users commonly misunderstand, or fail to understand, a particular piece of program functionality.

APT is Real Enough

Advanced Persistent Threat (APT) is a term coined by several individuals in the information security community and championed by Rich Bejtlich and the Mandiant Corporation. There has been a lot of criticism of the term and its general application to connote state-sponsored attackers. In the past I have been generally skeptical of the APT concept, but I'm beginning to change my mind on the matter.

Goodbye Android, Hello iPhone

I'm getting an iPhone today after years of loyal Android OS use. Why, you may ask? Well, the Android security model finally got to me after a McAfee report noted a massive jump in Android malware that made it the most targeted mobile platform. Given that the iOS market share is far larger than that of Android, this trend made no sense. Examining most malware trends shows that the vast majority targets the platform with the highest availability.

Browser security

In episode 8 of the Eurotrash Security Podcast the subject of browser security was discussed. The guest, Jayson E. Street, asserts early in the interview that the number of vulnerabilities present in a browser is not an accurate metric of security. Instead, he cites the "mean time to patch" metric that is often used when gauging vulnerability management: the time it takes for a vendor to release a patch or fix addressing a security vulnerability.
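As a sketch of how that metric works, here's a minimal computation of mean time to patch from disclosure/patch date pairs; the record shape and the dates are invented for illustration, not drawn from any real advisory feed:

```python
from datetime import date

def mean_time_to_patch(advisories):
    """Average days between vulnerability disclosure and patch release.

    `advisories` is a list of (disclosed, patched) date pairs; this
    shape is illustrative, not a real advisory-feed format.
    """
    if not advisories:
        raise ValueError("no advisories to measure")
    total_days = sum((patched - disclosed).days
                     for disclosed, patched in advisories)
    return total_days / len(advisories)

# Hypothetical records for a single browser vendor.
records = [
    (date(2010, 1, 4), date(2010, 1, 18)),   # 14 days
    (date(2010, 2, 1), date(2010, 2, 11)),   # 10 days
    (date(2010, 3, 15), date(2010, 4, 2)),   # 18 days
]
print(mean_time_to_patch(records))  # → 14.0
```

A lower average suggests a vendor that turns fixes around quickly, which is Street's point: responsiveness, not raw bug counts, is what the metric captures.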

First Confirmed Drupal Brute Force

Recently we instituted the Drupal Login Security module on our sites. This module alerts the site administrator to multiple failed login attempts, among other defensive mechanisms. I installed the module in response to a proof of concept that I developed that explored how attackers could enumerate and then brute force Drupal accounts.

Objects Gone Wild (in a bad way)

Recently, while performing an application security audit, I had the misfortune of discovering probably one of the worst uses of OO I've ever seen. Object oriented (OO) programming is a powerful tool that can be used to abstract data conceptually into modular, easy to utilize (and maintain) components. There are a few general tenets that lend themselves to good object oriented design. One is that objects should be self-contained. Objects should represent something, rather than serving as a container for functions.

Educause Security Professionals Conference 2010

It's day 1 of the Educause Security Professionals Conference 2010 in Atlanta. I'm going to try and tweet a bit of the conference under the hashtag #ESecPC and hopefully blog some as well. I'm presenting a session tomorrow on OSSEC and serving on a multi-person panel for a hot topic discussion around the secure application development lifecycle. So far there seem to be a few trends I'm noticing, one being web application security. I'll try and pull together other trends that I notice as well.


After a long hiatus and some prodding from the community, I decided I needed to start blogging again. Unfortunately, when I went to update my site last night I found that calls to the URL returned 404 errors. Perplexed, I logged into my hosting provider and found that the directory that once contained my Drupal site had been replaced by a directory called "drupal_hackable_contact_admin" and the permissions had been changed to make it inaccessible.


Boy, what a terrible weekend! On Friday I found what I thought was a pretty amazing Drupal vulnerability. I reported it at the very end of the day. It was kind of a rush job, as I was trying to get out the door, but I thought it was important enough that I wanted to send something in right away. I waited with anticipation for confirmation of my e-mail's receipt from Drupal security. Friday night came and went with no word; Saturday waned, and I stopped checking my e-mail in the late afternoon. I woke up Sunday morning to find a response from Drupal security.

My Generation

I was recently notified that danielkennedy74 was following my Twitter feed. It was slightly amusing to see a security professional with "74" appended to their user name. It took me back to a simpler time.

Psychological Acceptability 35 Years Later

Over 35 years have passed since Jerome Saltzer and Michael Schroeder published their seminal paper The Protection of Information in Computer Systems, but it still holds many truths applicable to this day. The paper presents a number of "Design Principles" that should govern secure systems. One of the most insightful of those is 3h, or "psychological acceptability." The principle of psychological acceptability states that the user interface should be designed for ease of use and should match users' expectations.

Disclosure Revisited

Computerworld recently ran an article about a Google security researcher who released details of a 0-day exploit in Microsoft Help Center. The article is interesting because it addresses several different problems in the modern information security landscape.

Security is the Sexy Part of QA

While listening to the Risky Business podcast recently I heard it said that "security is the sexy part of QA" (Quality Assurance). These words really struck me due to their truth and insightfulness. As much as many security researchers would hate to admit it, QA testing is at the core of a lot of what we do. This is especially true with application security. Looking over my shoulder, I can pick out at least three QA testing related books on my shelf, right next to others about rootkits, malware, and other security topics.

Distributed brute force attacks against Drupal

We're using a combination of Drupal 6 with the syslog module and OSSEC to monitor our Drupal web applications at work. I've noted a frightening trend recently of multiple failed login attempts for the same username from different IP addresses. This appears to be the work of a botnet. The following are some of the logs that we've gotten recently:
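As a sketch of the kind of correlation that surfaces this pattern (separate from anything OSSEC itself does), here's a minimal scan that flags usernames attacked from many distinct source IPs; the log line format is a loose approximation invented for illustration, not Drupal's exact syslog output:

```python
import re
from collections import defaultdict

# Hypothetical line format loosely modeled on Drupal's syslog output;
# adjust the pattern to match your actual logs.
LINE = re.compile(r"Login attempt failed for (?P<user>\S+) from (?P<ip>\S+)")

def flag_distributed_attempts(lines, ip_threshold=3):
    """Return usernames whose failed logins came from many distinct IPs."""
    sources = defaultdict(set)
    for line in lines:
        m = LINE.search(line)
        if m:
            sources[m.group("user")].add(m.group("ip"))
    return {user: sorted(ips) for user, ips in sources.items()
            if len(ips) >= ip_threshold}

logs = [
    "Jan 10 03:12:01 web drupal: Login attempt failed for admin from 198.51.100.7",
    "Jan 10 03:12:44 web drupal: Login attempt failed for admin from 203.0.113.22",
    "Jan 10 03:13:09 web drupal: Login attempt failed for admin from 192.0.2.140",
    "Jan 10 09:30:00 web drupal: Login attempt failed for alice from 192.0.2.8",
]
print(flag_distributed_attempts(logs))  # flags only "admin"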

DLL Hijacking Storm a Brewin'

About three weeks ago a newly re-minted vulnerability class was announced on the Full Disclosure mailing list. The class of vulnerability has actually been known to reverse engineers for a long time, but they used it to gain access to program flow to inspect existing code rather than for malicious purposes. In a nutshell, the vulnerability hinges on the fact that some Windows programs will look for dynamic link libraries (DLLs) in their current working directory first, before searching program and system directories.
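A minimal model of that search-order flaw, with the real Windows specifics (SafeDllSearchMode, KnownDLLs, manifests) deliberately omitted; the directory names and DLL name are invented for the simulation:

```python
import os
import tempfile

def resolve_dll(name, search_dirs):
    """Return the path of the first copy of `name` found along `search_dirs`.

    A deliberately simplified model of the legacy Windows DLL search
    order; the real rules are more involved.
    """
    for directory in search_dirs:
        candidate = os.path.join(directory, name)
        if os.path.exists(candidate):
            return candidate
    return None

# Simulate an attacker planting a DLL in the working directory, which a
# vulnerable legacy search order consults before the system directory.
cwd = tempfile.mkdtemp()      # stands in for the current working directory
sysdir = tempfile.mkdtemp()   # stands in for the system directory
open(os.path.join(cwd, "helper.dll"), "w").close()     # planted copy
open(os.path.join(sysdir, "helper.dll"), "w").close()  # legitimate copy

found = resolve_dll("helper.dll", [cwd, sysdir])
print(found == os.path.join(cwd, "helper.dll"))  # → True: the planted DLL wins
```

The fix side of the storm is exactly about reordering or restricting that list so the attacker-controlled directory is consulted last, or not at all.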

Back Online

I've been completely offline for a little while. My wife gave birth to our twin sons extremely prematurely, and completely unexpectedly, in early September. The boys were given a 20% chance to survive at birth. The older of my sons died in early October and I buried him a week later. My younger son remains in the intensive care nursery under constant supervision. It's been a really rough couple of months for me, and computers were the last thing on my mind. I'm trying to start work part time again to resume some semblance of a normal life.

Enumerating Vulnerability

The simplest self-contained definition of a software vulnerability is: any situation in which the software violates the stated security policy. Using this definition, it is easy to classify flaws in a way that avoids relying on published classes of vulnerability (such as XSS, XSRF, authentication bypass, or insufficient anti-automation) and avoids semantic debate about the term “vulnerability.” The challenge facing open source security researchers is that there is rarely any stated security policy for the systems under review.
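A toy illustration of this policy-centric definition, with the policy rules and observation fields invented for the example: the policy is a set of invariants, and any observed behavior that breaks one is a vulnerability, no taxonomy required.

```python
# Each policy statement maps to a predicate over an observed behavior;
# the rules and the observation fields below are hypothetical.
policy = {
    "unauthenticated users cannot read member records":
        lambda obs: not (obs["authenticated"] is False
                         and obs["action"] == "read_member_record"),
    "clients are rate limited to 60 requests per minute":
        lambda obs: obs["requests_per_minute"] <= 60,
}

def violations(observation):
    """Return the policy statements that the observation violates."""
    return [rule for rule, holds in policy.items() if not holds(observation)]

# An anonymous client bulk-scraping member records breaks both invariants.
obs = {"authenticated": False, "action": "read_member_record",
       "requests_per_minute": 600}
print(len(violations(obs)))  # → 2
```

The rub, as noted above, is that for most open source projects nobody has written the `policy` dictionary down, so the researcher ends up inferring it.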

Microsoft Takes My xBox And My $20

Microsoft, how do I hate you? My beautiful xBox 360, which I used to slaughter heathens, chat with my friends, and hear prepubescent boys swear like sailors, died a few weeks ago after a brief and fatal struggle with E74, the red ring of death. I packaged up my console and shipped it off for repair. Three weeks later my xBox was returned to me!

LAMPSecurity Capture the Flag Exercises

I've been getting a lot of questions over e-mail lately about the LAMPSecurity capture the flag exercises. These exercises are packaged as virtual machines that are vulnerable to root compromise through several series of exploits. The idea is to become familiar with "chained exploits" used to compromise a target. Each exercise consists of a virtual machine and a PDF document containing step-by-step instructions. The exercises can be used for training purposes, as self tutorials, or as part of a penetration testing lab setup.

Why use open source?

Over the years I’ve given a lot of thought as to why people should invest in open source solutions. I’ve come up with a lot of reasons, some good, others just smarmy. Some of the reasons for choosing open source, however, may be a little less obvious than others.

Would Jesus Offer Open Wifi?

The EFF recently posted a "Call to Action" for an Open Wireless Movement. The article lays out some compelling reasons for allowing open access to home wireless internet connections. The main reason they cite is civic duty, or community service. They state that providing digital access is a social responsibility that makes the world a better place.

Security Intelligence at Philly OWASP by Ed Bellis

At the latest OWASP Philadelphia meeting on May 23rd, Ed Bellis, CEO of HoneyApps, Inc., spoke to the group about security intelligence. It was a wonderful talk, titled The Search for Intelligent Life, and was very thought provoking. In many ways, Ed's presentation was a response to the many ideas presented in Shostack and Stewart's The New School of Information Security, which I've previously reviewed.

First Impressions of Gnome 3

Fedora 15 recently launched with the addition of Gnome 3 (or Gnome Shell) as the standard desktop environment. So far you can color me less than impressed. Gnome 3 has some wonderful additions in terms of appearance. The fonts are smoother, there is window shading, and there are lots of neat improvements to window management (snap two windows side by side).

LulzSec Going Apeshit

I've been asked a lot over the past few days about my opinion of the Lulz Security crew. At the risk of raising the ire of some of the folks in the security community I'm going to side with Patrick Gray's take on them.

FEC Data Ripe for Mining

The US Federal Election Commission is a government body set up to, among other things, monitor campaign contributions. From a hacker or social engineer's perspective, the fact that the data collected is made public is sheer gold.

Thinking Security

Security vulnerabilities in code may be a permanent reality. Given the average number of bugs in software, and the ever-shifting state of security, it may be impossible to produce secure software. Added to this crisis is the fact that software is deployed on systems that are increasingly permanently connected to the network. This constant connection means a persistent, global threat. Traditional defenses such as firewalls are increasingly proving useless at mitigating this threat. This situation leads us to question whether current security paradigms are working. What can the information security community do to combat the persistent, and growing, problem of vulnerable software being exposed to attack?

Software Security and Testing

Software security is a discipline that is closely aligned with software testing. In fact, the field of security could learn quite a bit from software testing methods and philosophy. Much of what we traditionally place in the security realm, such as vulnerabilities like SQL injection, is actually nothing more than a software bug - a failure to validate user input or the correctness of a query before execution. Much of the time software security researchers poking at code are merely doing what software testers do, only with a slightly different mindset. Software security testers look for bugs that can lead to unsafe conditions, rather than simply trying to inject faults. However, there is much overlap in the tasks of a software security evaluation and software testing. Fault injection is one clear area where these two activities overlap.
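A minimal fault-injection harness along those lines, with a toy target function and input alphabet invented for illustration; the QA tester and the security tester run essentially the same loop and differ only in how they triage the crashes:

```python
import random

def parse_age(text):
    """Toy input handler under test: naive code that trusts its input."""
    return 150 - int(text)  # blows up on non-numeric input

def fuzz(target, trials=100, seed=1):
    """QA-style fault injection: feed malformed inputs, record failures.

    A security tester would run the same loop, then examine each crash
    for exploitability instead of simply filing a bug.
    """
    rng = random.Random(seed)
    alphabet = "0123456789abc!\"'<>%\x00"
    crashes = []
    for _ in range(trials):
        candidate = "".join(rng.choice(alphabet)
                            for _ in range(rng.randint(0, 8)))
        try:
            target(candidate)
        except Exception as exc:
            crashes.append((candidate, type(exc).__name__))
    return crashes

found = fuzz(parse_age)
print(len(found) > 0)  # → True: malformed inputs raise ValueError
```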

Where is the Documentation?

In the simplified software engineering process model there are 10 phases: problem, requirements engineering, requirements specification, design, technical specification, implementation, coding, testing, system integration/deployment, and maintenance. No wonder, then, that there are no good documents for software projects. Help documents, user manuals, deployment guides, and other useful texts often expected from software, but rarely included, aren't even part of the basic model of development. Testing gets short shrift in most projects, and unfortunately documentation gets even shorter shrift. Documentation is often left as the final stage of software development, and rightfully so: it is difficult to write documentation for a project before it is complete. You can't write help for features that aren't implemented or screens that could change. Having incorrect documentation is almost worse than having no documentation at all.

The Extra Nine Times

In his book 'The Mythical Man Month' Fred Brooks asserts that it takes nine times more effort to produce a consumable software system than to produce a program for internal use. This means that a programmer might create a program for personal use in an hour, but packaging that program into an application that others can use would take nine hours. At first blush this seems like a gross overstatement. After all, if a program works for one person, why wouldn't it work for others? I think this is a misguided assumption that many open source developers make, and it is perhaps one reason why open source software has such a bad rap. When one stops to consider the effort it really takes to make a software product, the time/effort discrepancy becomes quite understandable.

Customers Need to Know Process Models

I'm currently taking a software engineering course which covers many aspects of software project management. The class is great because it covers a lot of knowledge that otherwise has to be won with hard-earned experience. It's a curious phenomenon that most technical managers are promoted to management positions because of their technical proficiency. Organizations take someone who is at the top of their technical game and suddenly switch them into a whole different role, one they often have to learn from scratch. Other professions do this as well, and it's always difficult. Just because someone displays technical proficiency doesn't necessarily mean they'll display managerial proficiency. You also take someone with years of experience and move them into a job in which they have no experience.

User Insecurity and Open Source Projects

Recently, while reviewing a Drupal module for code security, I noticed an interesting circumstance that got me thinking. The module itself had several security features built in and adhered to the Drupal secure coding guidelines. However, the module had installation instructions that directed the user to set it up in such a way that it introduced a huge vulnerability into the overall installation. Thinking about this, I wondered if it should be the module developer's job to prevent the sort of situation that a user could induce by following the instructions. The installation guides should obviously be amended, but even doing this wouldn't prevent a user from recreating the same dangerous situation. Who should be responsible for protecting users from themselves? Should the Drupal core code base prevent such situations from even being possible? It's arguable that it should.

Drupal Content Access Module XSS Fun

After my latest Drupal module vulnerability disclosure (a cross site scripting (XSS) vulnerability in the Drupal 6 Content Access module) I was contacted by a reporter from a pretty big British news agency who wanted details about the problem. Given the large number of Drupal sites on the web, it must have seemed like the vulnerability was a pretty big deal. I quickly responded and let him know that the vulnerability was relatively obscure, difficult to exploit, and probably wasn't worth his time, but it got me thinking. Security is a pretty broad field, and I spend most of my time on one extreme. Asking me about computer security and privacy is probably a lot like asking a law enforcement agent about home security - you're going to get an answer colored by experience.

SEI Advanced Incident Handling - Day 5

The Software Engineering Institute, part of Carnegie Mellon University, and the organization that comprises CERT, offers an Advanced Incident Handling (AIH) course that I am currently attending.

SEI Advanced Incident Handling - Day 4

The Software Engineering Institute, part of Carnegie Mellon University, and the organization that comprises CERT, offers an Advanced Incident Handling (AIH) course that I am currently attending.

SEI Advanced Incident Handling - Day 3

The Software Engineering Institute, part of Carnegie Mellon University, and the organization that comprises CERT, offers an Advanced Incident Handling (AIH) course that I am currently attending.

Full Disclosure Policy

It has occurred to me, through my latest spat with the Drupal security team, that I need to clearly define my beliefs about full disclosure so that there can be no misunderstanding as to my motivations. I've made attempts to outline my stance in the past, but I don't think I've given them enough attention. For this reason I wish to outline my philosophy vis-à-vis full disclosure as clearly and unequivocally as possible. Thus, I believe: 1) I am not the smartest person doing vulnerability research. I have a limited amount of time to devote to security research. I believe there are black hats who are smarter than I am, who have more time than I do, and who do exactly what I do. I believe it is not in the economic interest of black hats to disclose their discoveries. 2) Accepting #1 above, if I have discovered a vulnerability, one must assume the black hats have discovered it as well.

SEI Advanced Incident Handling - Day 2

The Software Engineering Institute, part of Carnegie Mellon University, and the organization that comprises CERT, offers an Advanced Incident Handling (AIH) course that I am currently attending.

SEI Advanced Incident Handling - Day 1

The Software Engineering Institute, part of Carnegie Mellon University, and the organization that comprises CERT, offers an Advanced Incident Handling (AIH) course that I am currently attending.

Envisioning Perspective

In order to properly assess the security posture of any organization it is essential to first make sure you can accurately gauge the landscape. Taking stock of your assets is the first step to determining any security plan. First you have to inventory what you've got before you can begin to protect it. Discovering what servers, applications, services, devices, databases, and other assets exist within your organization can be a challenging and daunting task in and of itself. In fact, this task is often so difficult that it is a roadblock to a mature information security plan. Auditing your systems takes time and effort that is difficult to justify when the daily crises of security are competing for attention. This step, however, is the cornerstone of a good information security plan and is absolutely essential to your success.

Educause Security 2009

I'm currently attending Educause Security 2009 in Atlanta, GA. This year's Educause Security has, in my opinion, the strongest program in quite some time. There is a very heavy technical bent to many of the presentations and a lot of the fluff that I found present in last year's conference is absent. The persistent theme of this year's conference seems to be PII, identity theft, notification, and privacy. It's interesting to see a security conference so heavily focused on privacy, but identity theft is the intersection of privacy and security.

Disturbing Decision by US Courts Regarding Encryption

Sebastien Boucher, who was arrested at the Canadian/US border crossing for having child pornography on his laptop, has been ordered to reveal the password to decrypt an encrypted drive on his laptop for inspection by a grand jury. The laptop was equipped with PGP, industry-standard encryption software. However, it seems that when agents first inspected the laptop it was simply asleep (only entering hibernation will cause PGP to re-encrypt drives and require a password) and they were able to inspect the contents of the hard drive. Agents apparently found thousands of pornographic images (surprise) and a video titled "2yo getting raped during diaper change" and arrested him on child pornography charges. However, after Mr. Boucher's arrest, it seems his laptop was powered off, which caused PGP to re-encrypt the drive containing the images and video in question.

Drupal Security Team Ignores Multiple XSS Vulnerabilities

The Drupal security team recently released SA-CORE-2009-002, an advisory warning that many CCK-based modules contain Cross Site Scripting (XSS) vulnerabilities that can be exploited by users with 'administer content types' permissions (examples include the Links, ImageField, and Viewfield modules). The Drupal security team's rather disappointing advice for rectifying this situation was not to fix the vulnerabilities in the module code in question, but rather to limit the scope of users granted 'administer content types' privileges. This response is flawed in several ways.

Web Application Security

In the latest Silver Bullet podcast, Gary McGraw mentions that he feels web application security is attracting too much attention these days. In some ways I feel this observation is right, but in many ways I feel it is dead wrong. In his book 'Hackers and Painters' Paul Graham makes a very compelling argument that most software should be available via the web. The idea is that most users don't really care about their platform, and providing software as a service frees users from all sorts of headaches. For instance, users don't have to upgrade their web based software, they don't have to worry about hardware requirements, and they don't have to hassle with DLLs or anything like that.

Developing Security with Metrics

It is a professional hazard in security to become stuck in a reactive stance, always running to put out the latest fire. Many security personnel find themselves in this mode and cannot seem to escape it. It is important, from time to time (and especially if it has never happened), to stop and take stock of an organization as a whole. No matter how pressing the issues of the moment seem, it is critical to examine your organization from the top down in order to develop, and maintain, an effective information security program. While this sort of planning can seem like a waste of time when very real threats are battering down the proverbial door of your defenses, it is critical to take a measured approach to your security response in order to be effective, especially with limited resources. The first step to achieving this goal is to gather effective intelligence, specifically by having accurate monitoring systems and incident reports.

Pen Tests are Bullshit

Recently I've spotted an argument against pen testing gaining traction in the computer security industry. Articles such as Problems with Penetration Testing and Tenable Network Security's CSO Marcus Ranum's talk in Risky Business #85 are widening the dialogue about the issue. Having just returned from InfoSec Institute's Ethical Hacking training, I feel pretty close to the issue. Much of the InfoSec Institute training is designed to prepare people to enter the pen testing field, so I basically spent a week observing the industry from within.

InfoSec Institute Ethical Hacking Day 4 & 5

I've just finished InfoSec Institute's Ethical Hacking class. The last two days were so hectic that I didn't even get a chance to blog about them as I would have liked. Day four went from 8:30 until 6:30, after which we took the CPT (Certified Penetration Tester) exam, so we weren't done until about 8. The EC-Council Certified Ethical Hacker (CEH) exam was scheduled for 10 AM on day 5, so we all left the class exhausted, but I went back to my room to study some more. The content of day four was intense, covering topics from web application attacks (SQL injection, cross site scripting, etc.) to sniffers, deep target penetration, and wireless security.

InfoSec Institute Ethical Hacking Day 3

Day three of ethical hacking didn't end until about 7 PM, and with the CPT exam scheduled for the end of day four I didn't get a chance to blog. Instead I went back to my room, studied for a bit, and fell asleep. The course is nothing if not exhausting. Day three was another whirlwind. We covered everything from buffer overflows to privilege escalation. The day's slides went from how to break into a server using tools like Metasploit or Canvas, to privilege escalation, to installing backdoors, trojans, and rootkits. The material was far ranging and in-depth. The labs covered deploying a rootkit, using Metasploit, information leaks via SUID root bugs, and password cracking. While previous days had covered reconnaissance and information gathering, day three was definitely focused on active attack.

InfoSec Institute Ethical Hacking Day 2

I've just finished the second day of InfoSec Institute's Ethical Hacking class, and the breakneck pace has not let up. Day two went from 8:30 AM until well after 6 PM. The firehose of information did not slacken one bit, covering new topics, labs, and exercises. While the pace is intense, the information is all good, and I barely noticed the time flying by.

InfoSec Institute Ethical Hacking Day 1

I've just finished the first day of InfoSec Institute's Ethical Hacking class. I'm going to try and write a blog entry each day to keep up with what is going on and provide an overview - if I'm able. This may turn out to be a Sisyphean task, however. Tomorrow is election day and I'm going to stay up and watch the results even if it kills me, because I think this election will be as much a defining moment in my life as 9/11 was - but I digress. I'm also taking CIT 591 at the University of Pennsylvania, which includes weekly assignments from which I've been given no respite despite my training, so I'm literally doing all computers all the time. Add to that my carpal tunnel acting up and, well, you get the idea.

Responsible Disclosure?

I recently had another occasion to make a full disclosure and was chided by some of my colleagues for doing so. Many thought I shouldn't have made a vulnerability announcement to a public list. I assume they felt that working with the vendor to fix the issue was a more responsible course of action. In this particular case the personal information of organizational members was being leaked through a conference registration application. While I understand the desire to work with vendors to fix problems before public disclosure ("responsible disclosure"), I continue to disagree with the practice in most situations.

The Economy and Information Security

The internet security blog Security Aegis has just published an article, distilled from interviews with industry professionals, concerning the state of information security and the economy. As one of the interviewees for the piece I am of course biased, but I find it to be an excellent piece. It's interesting to note the commonalities between responses to questions about the field and its future. While these may not necessarily lend authority to the prognostications of the contributors, they certainly provide a valuable touchstone for the sentiment of those in the profession. The "conventional wisdom" may not provide an accurate roadmap for the future, but it is a great indicator of how people are feeling now.

Copyright Infringement (a.k.a. Google Sucks)

Today I ran across a case of someone blatantly republishing my content, without consent or approval, on one of Google's properties. I figured it would be a simple matter of notifying Google and getting the content removed. Unfortunately, in their infinite wisdom, Google makes reporting copyright violations a royal pain in the ass! You have to actually print out a letter, include very specific language, and mail or fax the letter to them. The whole process reminds me of the hassle you used to have to go through to register a domain name.

Full Disclosure

There has been a lot of debate over the years about full disclosure, the practice whereby security researchers publish their findings to the world. It's a thorny debate and my opinions have changed over time. I've been on the receiving end of a vulnerability disclosure in the past. In that case I only noticed the disclosure because I follow security announcements.

Undeniable Deniable Filesystems

In a new paper published on Bruce Schneier's website, researchers examine deniable file systems (DFS). The paper specifically focuses on DFS as implemented by TrueCrypt 5.1 and finds several severe limitations imposed upon DFS by regular data usage.

Captcha Cracking

CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) is a widely used verification system that forces users to look at images of obscured text and enter the text into a field. This system was designed to defeat the automated tools often used by spammers to set up bogus accounts or send spam. The idea was that the images weren't machine readable, and Optical Character Recognition (OCR) technology wouldn't be able to decipher them, thereby defeating spammers' automated tools. This raised the bar significantly for spammers. Many turned to micro payments, enlisting humans to decipher CAPTCHA codes for a small fee. This isn't nearly as effective as using a computer, though, and academic researchers and spammers alike have been searching for programmatic ways to defeat CAPTCHA, even as the technology evolves.
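As a sketch of one step such programmatic attacks rely on, here is naive character segmentation on a toy binary image; real CAPTCHA schemes deliberately overlap, warp, and clutter the glyphs precisely to break this kind of scan, so treat this as an illustration of the concept only:

```python
def segment_columns(image):
    """Split a binary image (list of rows of 0/1) into glyph column spans.

    A toy version of the segmentation step a CAPTCHA-breaking pipeline
    performs before per-character recognition: find runs of columns
    that contain "ink", separated by blank columns.
    """
    width = len(image[0])
    ink = [any(row[x] for row in image) for x in range(width)]
    spans, start = [], None
    for x, filled in enumerate(ink):
        if filled and start is None:
            start = x                      # a glyph begins
        elif not filled and start is not None:
            spans.append((start, x))       # a glyph ends at a blank column
            start = None
    if start is not None:
        spans.append((start, width))       # glyph touching the right edge
    return spans

# Two glyphs separated by a blank column: segmentation finds both.
img = [
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
]
print(segment_columns(img))  # → [(0, 2), (3, 4)]
```

Once glyphs are isolated, per-character recognition is comparatively easy, which is why modern CAPTCHAs invest most of their distortion budget in defeating segmentation rather than recognition.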

Is Security Certification Worth it?

A host of computer security certifications exist, covering quite a range of topics. At some point in every security professional's career they look at certifications and begin to weigh their value. I've given certification some thought and come up with the following recommendations based on my own experience. Certification hinges on two things: the test and the certifying body. In the end, a certification stands as independent verification that you passed a test. The test criteria and the respectability of the certifying body determine its value to others. Personally, when I interview someone I don't give a second look at the certifications they hold. I look for experience that proves the assertions the certifications make. Proving you can apply the knowledge a certification tests is much more difficult than just earning the certification.

SanDisk Sansa Clip Annoyances

I recently purchased a 2GB SanDisk Sansa Clip MP3 player. It showed up and looked great. The problem was that when I plugged it into my XP laptop it was recognized as a new USB device, but it wouldn't show up in 'My Computer'. I purchased this particular model precisely because it didn't need any extra synchronization software - you just drag MP3 files onto the device drive and they load on the player (supposedly). After searching online for some time I finally found the solution to my problem. I'll repost the fix here in case anyone else has trouble finding it.

DNS Debacle

Most people are probably blissfully unaware, but security researcher Dan Kaminsky discovered a very serious flaw in DNS (Domain Name System) and was waiting until Black Hat to release the details. Well, as with all secrets, if more than one person knows it, soon it's not a secret. The vulnerability was announced two weeks ago by US CERT and major vendors have been working to apply patches. However, most of the public was unaware of how the vulnerability would affect them, or how an exploit would work.
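Even without the embargoed details, the arithmetic of the pre-patch weakness was sobering: for many resolvers the main defense against a forged answer was a 16-bit transaction ID. A simplified model of the spoofing odds - one guess per spoofed packet, no source-port randomization, figures illustrative only:

```python
# Rough model of pre-patch DNS spoofing odds: the attacker races forged
# answers against a legitimate one, guessing the 16-bit transaction ID.
txid_space = 2 ** 16
spoofed_packets = 100              # forged answers sent per race

p_per_race = spoofed_packets / txid_space

# How many races until the attacker has better-than-even odds overall?
races_for_even_odds = 0
p_fail = 1.0
while p_fail > 0.5:
    p_fail *= (1 - p_per_race)
    races_for_even_odds += 1

print(round(p_per_race, 4))    # 0.0015 chance per race
print(races_for_even_odds)     # 454 races for a coin-flip chance overall
```

Kaminsky's insight made those races cheap to repeat, which is why randomizing the source port (multiplying the guess space) became the emergency fix.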

The New School of Information Security

As promised here is my full review of The New School of Information Security by Adam Shostack and Andrew Stewart: The New School of Information Security is one of the most timely and radical books on computer and information security that I've ever read. Adam Shostack and Andrew Stewart help to stimulate a significant paradigm shift that has been brewing in the infosec sphere for some time. With solid evidence and well grounded arguments Shostack and Stewart advocate for a new, and much needed, approach to information security: the New School.

Update Your Drupal Instance

The Drupal team released a critical announcement today advising that all users update their Drupal 5.x and 6.x installations. Several vulnerabilities exist within the Drupal core that could be used by remote attackers to exploit cross site scripting (XSS), session fixation and SQL injection vulnerabilities. Because it doesn't take attackers long to reverse engineer exploit code after a patch is released, it is important to upgrade your Drupal installation as soon as possible. The full text of the announcement follows.

------------SA-2008-044 - DRUPAL CORE - MULTIPLE VULNERABILITIES------------
* Advisory ID: DRUPAL-SA-2008-044
* Project: Drupal core
* Version: 5.x, 6.x
* Date: 2008-July-9
* Security risk: Moderately critical
* Exploitable from: Remote
* Vulnerability: Multiple vulnerabilities
------------DESCRIPTION------------
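The defense against the SQL injection class of bug in advisories like this one is parameterized queries. A sketch in Python rather than Drupal's PHP - table and values are illustrative, not Drupal code:

```python
# Sketch: why bound parameters stop SQL injection. With string
# concatenation attacker input becomes SQL; with a placeholder it is
# always treated as data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (user TEXT)")
conn.execute("INSERT INTO sessions VALUES ('alice')")

hostile = "x' OR '1'='1"

# Vulnerable pattern: the OR clause is interpreted as SQL.
rows = conn.execute(
    "SELECT user FROM sessions WHERE user = '" + hostile + "'").fetchall()
print(len(rows))  # 1 -- the injected clause matched every row

# Fixed pattern: the bound value matches only the literal string.
rows = conn.execute(
    "SELECT user FROM sessions WHERE user = ?", (hostile,)).fetchall()
print(len(rows))  # 0
```

The XSS and session fixation issues in the advisory need their own fixes (output escaping, session regeneration); this only covers the injection half.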

Identity Protection

If you've ever done a Google search for your name you'll be shocked at how much information comes up. There are customer profiles on commerce websites, your profile on social networking sites, heck, perhaps even the deed transfer information from when you bought your house. Of course, we all want our friends to be able to find us online, but often too much information about who we are gets leaked onto the internet. I'm fine with people finding my e-mail address, but finding out where I work, where I live, my phone number and my Amazon wish list is a little too much for me. There are even new sites that do deep searching and pull all these details out for any casual searcher.

Get with the New School

A recent post on the Tao Security Blog got me thinking about what I feel is probably the most important book on computer security in the market today. Whether overt or by influence, this book is making waves in the computer security industry and hopefully changing things for the better. In the case of the Tao Security Blog it seems that Richard Bejtlich borrows directly from the book. In fact his entire post appears to be a synopsis of Chapter 3. Bejtlich swears he hasn't read the book - which for me is just further evidence of how accurate the book is in reflecting emerging trends and new philosophies evolving in computer security.

MediaDefender DDOS of Revision3

There's a very interesting write up of the recent denial of service attack against Revision3 on the company's blog. For those who aren't aware, this high profile attack hit the news with ferocity when it was discovered that the company MediaDefender, which works to stop illegal file sharing and has done work for organizations like the RIAA, was the culprit in the attack. Revision3 was using BitTorrent for perfectly legitimate reasons and MediaDefender crippled Revision3's internet connection over the Memorial Day weekend. Of course, a lot of questions arose immediately following the attack. People wondered if it was a mistake, or perhaps a misconfiguration. Denial of service attacks are illegal, and for one US company to carry one out against another is pretty serious business. It turns out that Revision3 has contacted the FBI, who are investigating.

Let's Go Phishing

While reading the F-Secure blog today I came across an interesting service that I hadn't known about before. PhishTank is a service that allows you to submit suspected phishing sites and tracks their status. With an open API, PhishTank even lets you write tools to query their data. This is a really neat development. It's about time that phishing sites faced the same sort of scrutiny that e-mail has in the past with services like Spamhaus. Unfortunately that sort of scrutiny led spammers to utilize infected end-user systems rather than open e-mail relays or compromised servers. With botnets providing much of the SMTP service these days it isn't feasible any more to block specific sender IP addresses (with hundreds of thousands of bots, the herders just promote one after another to be an SMTP server until it's blocked, drawing from a nearly inexhaustible pool).

The New Threats in Computer Security

I recently attended the Educause Security 08 conference in Washington, DC. There were many wonderful presentations at the conference and I came away with a lot to think about. One of the trends that seemed to come up over and over again was the changing landscape of computer security. There seem to have been two major sea changes in information security over the last couple of years. Information security professionals must let these changes fundamentally alter the way they regard their charge, or face obsolescence.

Here's a Vexing Question

There have been many studies on why phishing attacks are such a problem. One often cited Gartner study reported that 3% of users will give away personal or financial information to phishers. This means that at any given time some 3% of your user base will respond to phishing attacks, no matter how obvious. I have seen this in action several times. Despite numerous warnings to users, and prominent alerts online and in print publications that admins will never request passwords, we detect users sending their login credentials to phishers. The emails come in bearing misspellings and horrible grammar, and supposedly smart people (we're talking Ivy League educated here) will send off their username and password.
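Back-of-the-envelope math makes the Gartner figure concrete. The population sizes below are hypothetical:

```python
# Sketch: expected credential loss per phishing campaign at a 3%
# response rate (the Gartner figure cited above). User counts invented.
response_rate = 0.03
for users in (1_000, 10_000, 100_000):
    victims = round(users * response_rate)
    print(f"{users:>7} users -> ~{victims} sets of credentials lost")
```

At university scale, a single campaign that reaches everyone yields thousands of working passwords, which is why even "obvious" phishing keeps paying.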

On Multiple Single Factor Authentication

Two factor authentication is fast becoming an industry standard for high value applications. Unfortunately a lot of misunderstanding surrounds two factor authentication, and thus, the implementation is often less than ideal. Two factor authentication, strictly speaking, requires a user to provide information from two separate sources in order to authenticate. These sources can be diverse, but the most common source by far is something the user knows (such as a password). The second factor can be provided by a key fob, a smart card, biometrics, or any number of other sources.
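The key fob side of this can be sketched concretely. One common scheme (though not the only one) is the HMAC-based one-time password of RFC 4226: fob and server share a secret and a counter, and the server accepts only codes it can reproduce. A minimal sketch using the RFC's published test secret:

```python
# Sketch: HOTP (RFC 4226), the algorithm behind many hardware tokens.
# The fob and the server each compute this from a shared secret and a
# moving counter; matching codes prove possession of the secret.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the 8-byte big-endian counter.
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"  # RFC 4226 appendix test secret
print(hotp(secret, 0))  # 755224, the RFC's first test vector
```

Note that the code alone is still a single factor; it only becomes two-factor when combined with something the user knows, like a password.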

Asshole Hackers

So I started up logging on this site the other day, mostly out of curiosity. I was completely disheartened as soon as I did, though. Come to find out, some of the most common hits on the site are from people looking to exploit a basedir file inclusion vulnerability. What's worse, this is a vulnerability that exists in some of the software I've written and released as open source. In any case, these assholes are basically trying to break into my server by exploiting it.

Long Time... or Taxonomies in CMS'es

I've been away from the blog for a while (which is bad) because I've been upgrading servers (which is good?). I haven't had much time to devote to personal writing but I figured I'd get back in the saddle now that things on the hardware/operating systems fronts have settled down.

PHP Quebec

So I just got back from PHP Quebec, and although the trip home was horrendous the conference itself was a lot of fun. It is held in the amazing Sofitel Hotel along the 'Golden Mile' in Montreal, just at the base of Parc du Mont-Royal. The conference space was sparse, but attendance was probably under 200 so it worked out well.

Irish Hot Teens

While it's still early for St. Patrick's Day I thought I'd point out an interesting tidbit I noticed from my server logs recently. I currently hold the number one Google result for quite a few random queries, some that I never would have suspected. These top rankings tend to be articles on my website that are concerned with Linux troubleshooting, security, databases, odd programming challenges and so on. Because my website hosts so much eclectic technical data (mainly results of troubleshooting problems that I know I'll never remember - and thus I document), I'm not surprised by many of the results. Among the hit parade are: "math in bash shell", "hack into a website", and "update urpmi database".

Damn DST (*yawn*)

Ok, so it's the day of the new daylight saving time (DST to those in the know). This change in the clocks is supposed to save energy, but as two economics students point out, this may not be the case. One thing the article fails to take into account, which I feel has a rather large impact on the cost savings analysis of DST, is the price that companies have had to pay in IT costs: the cost to develop patches for software and services, and the time staff have had to spend deploying patches, testing systems and ensuring that they all function properly. I'm sitting at my desk at a major university and my Cisco 'iPhone' is displaying the wrong time right now. How many IT workers are spending hours, days, or even weeks dealing with this shift? What's the overall loss in productivity due to this redeployment of resources?

Why the EU Will Always be Cooler than a Mashup

Ok, once again we return to the topic du jour, defiling mashups. This post is in response to Chris' blog. As a side note, I've enabled anonymous comments for now; let me know if there are still problems (my opinion of Drupal is declining the more I use it). I'm tempted to take a lot of different avenues in explaining why I would strongly recommend mashups, and there's a strong pull to use anecdotal evidence, but I think I'll stick to straightforward analysis. By this I don't just mean business analysis, but also engineering analysis.

Why Mashups Aren't Cooler than PB&J

Mashups are the latest cause celebre on the internet (now that corporate blogs have cooled off) and I have to say, as a developer I'm not impressed. Now, I'll admit that I'm notorious for having knee-jerk negative reactions, but I think "mashups" are just another facade in the internet hype cycle. Of course, it's easy for me to be negative about any new, unproven technology, but mashups aren't anything new. Mashups are derived from a long and less than illustrious heritage that includes portals, SOAP and remote XML. At its core a mashup is nothing more than a refactoring of remotely available data.

Google Miscounting Page Hits

So I have a script on my website that alerts me whenever a page request is made. This is a piece of PHP code that keeps a count of how many distinct requests were made throughout the course of the day. Today I noticed something *very* curious. When I queried the database storing the information for my hits for yesterday I came up with 1204. Now this is _very_ interesting, because when I log into Google AdSense and check, here's what I see (screenshot: "Google Ads ripping me off"). Note the page impressions. How could they possibly be off by almost 1000 hits? What gives? [Update 2/9/2007]
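For comparison, here's roughly what such a counter does, sketched in Python rather than the site's actual PHP. The log format and the definition of "distinct" are assumptions - differences in exactly these definitions (unique visitors vs. raw requests, bots filtered or not) are a common source of discrepancies like the one above:

```python
# Sketch: count "distinct requests" per day from simple log lines.
# Here distinct means a unique (client IP, path) pair; ad networks
# typically count something subtly different, hence mismatched totals.
from collections import defaultdict

def distinct_hits(log_lines):
    seen = defaultdict(set)            # day -> {(ip, path), ...}
    for line in log_lines:
        day, ip, path = line.split()
        seen[day].add((ip, path))
    return {day: len(pairs) for day, pairs in seen.items()}

logs = [
    "2007-02-08 10.0.0.1 /blog",
    "2007-02-08 10.0.0.1 /blog",   # repeat visit: not distinct
    "2007-02-08 10.0.0.2 /blog",
]
print(distinct_hits(logs))  # {'2007-02-08': 2}
```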

C and Building a B0x3n

Whoa! What a crazy freakin' weekend. I hate to blog about "regular life" because I find it to be incredibly mundane, but nobody reads this blog anyway so what the heck. Several things occurred to me over the weekend.

My PDA Fails

I've given a lot of thought to my (less than) trusty PDA of late. I've finally decided that PDAs simply aren't worth it. I have several reasons for this decision, but the bottom line is that tomorrow I'm going to go out and buy a small bound book to replace my PDA.

1. Size - My Tungsten E2, along with its titanium case, is about the size of a small softcover novel. It weighs a couple of pounds and won't fit comfortably in any of my pockets. This means that I have to carry a bag wherever I go, often simply for my PDA. I take one look at my wife's iPod Nano and wonder why my device, which has a tenth of the storage space, is so huge and clunky.

2. Reliability - My PDA always crashes. Not very often, but over time it always does. Sometimes I don't lose any of my data, sometimes I lose all of it. Regardless, it is always a hassle to recover the data.

Why Backups Don't Work

It's a common adage that every computer user knows by now: "back up your work." And yet, most of us probably don't do just that. Why is it that computer users don't back up their important data? I have several theories. I think the first, and probably most influential, reason people don't back up their work is that it's inconvenient. You have to point, click, drag and wait. Essentially it's a pain in the butt, a hassle, and it takes time. The thing about this reasoning that baffles me is that this sort of mindless work is exactly the kind of work computers were designed to do. You should be able to give your computer a list of directories and files, flag them as 'important' and have them backed up automatically. There doesn't seem to be any adequate (or cheap, or convenient enough) tool around to do just that.

The Problem with PDAs

There is an enduring problem with PDAs that plagues pretty much all electronic organization devices - half life. I’ve been using a PDA pretty much since they first showed up on the market. I’d rave and rave about how wonderful they made my life; the ability to organize my schedule and keep track of phone numbers was amazing. The problem was, no matter how well I documented material, and no matter how religiously I synchronized my device, the data would be destroyed at some point. I began to think of this as the PDA half life. Over time the volatility of the data seemed to increase until it reached a boiling point at which it would blow up. This half life is up to about a year now, which in my estimation is quite a long time, but losing a year’s worth of data sucks.

Browsers the Killer App?

This post is in response to a blog post elsewhere. While I agree with every aspect of your argument, your premise is wrong. You’re limiting your mindset to your own style of work and/or computer utilization. Web browsers do a great job of displaying information and transmitting it over the network. If your workflow involves displaying, creating or transmitting data then a web browser is all you need. However, the browser falls on its face when you examine alternative tasks.

Relational Filesystems

I just began a project that I think might hold a lot of promise for me. I’m working on an online document storage application. In a nutshell, documents can be uploaded into a database and classified for sharing with other users. It’s a fairly straightforward use of a web based database application, but it got me thinking.

Why Use an Open Source Project

Over the years I’ve given a lot of thought as to why people should invest in open source solutions. I’ve come up with a lot of reasons, some good, others just smarmy. Some of the reasons for choosing open source, however, may be a little less obvious than others.

The biggest reason I like to suggest open source software is that you own the code. This is different from owning a copy of the compiled program. Owning the code means you can change it and modify it in any way you see fit. These changes aren’t just limited to cosmetic customizations, either. When you use open source code you can make changes all the way down to the core of the application. This gives you a level of ownership unparalleled in the commercial software world. It also means that if you want to make upgrades or customizations over time, they won’t involve any additional purchases.

The Army (of Developers) You Have

If Donald Rumsfeld is to be believed, and you go to war with the army you have, then there are specific challenges to technical projects that must be addressed proactively. Before you even begin a project there are questions to consider concerning your resources, training, and expertise. In any given project you will usually begin with a problem statement. From this problem statement you begin to deduce the course of your proposed solution. This evolution should include an evaluation of resources and skills, but far too often it does not.

Oracle vs. Red Hat, Un-Believable?

The heat is on.

Right now it looks like there’s a major war shaping up in the Linux community, and this war is set to be rather different from the other ones that have occurred. You see, this time the fight isn’t about obscure file system structures or project forks, this time it’s about money. And the people fighting this fight aren’t geeks with duct taped glasses, this time the people kicking up a fuss are all in suits.

Skin Deep is Good Sometimes

Interestingly, an old debate is resurfacing in the Linux community with a recent posting on Mark Shuttleworth’s blog. For those that don’t know, Mr. Shuttleworth is the luminary behind Ubuntu Linux (and the many other *buntus). His company, Canonical, foots the bill for the development of a rather revolutionary distribution. It is available completely free, and you can even have a CD shipped to you at no charge. The motto at Ubuntu is “it should just work,” and by and large the distribution doesn’t disappoint. It is probably one of the best constructed distributions I have ever installed. This is no mean feat. Many Linux companies produce two versions of their distribution - the free version and the commercial version.

If a Vulnerability Falls in the Forest

If a vulnerability is discovered in your codebase, but it's not exploitable, does it make a sound? I recently ran into this dilemma while investigating an add-on module to a popular open source package. The module appeared to have a vulnerability, but upon further investigation, I discovered that the code base of the package, rather than the module, contained the vulnerability. It turns out that you could mitigate the vulnerability by adjusting the module code to prevent it, but the underlying vulnerability still remained. The question then became - should this vulnerability be fixed?

Open Document Formats for All

Using the open document formats ensures that no matter what changes happen to your word processing program, your data will never get locked away in a proprietary format.

Drupal Content Access Module 6.x-1.1 XSS

The Content Access Module suffers from a cross site scripting vulnerability because it does not sanitize role names before displaying them on the 'Access Control' screen of managed content types. This vulnerability is exacerbated by the fact that Drupal 6.12 core does not perform input validation on role names as they are being created. This can lead to a situation where users administering role based access controls of content types could be exposed to malicious HTML content.
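The fix is output escaping at render time. A sketch using Python's html module rather than Drupal's PHP - the role name is a hypothetical payload and the markup is illustrative:

```python
# Sketch: why unsanitized role names are dangerous on an admin screen,
# and what escaping does. Payload and markup are invented examples.
import html

role_name = '<script>alert("xss")</script>'  # hostile role name

# Vulnerable rendering: the payload becomes live markup in the admin's
# browser.
unsafe_row = "<td>%s</td>" % role_name

# Fixed rendering: the payload displays as inert text.
safe_row = "<td>%s</td>" % html.escape(role_name)
print(safe_row)
# <td>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</td>
```

Because the victims here are administrators, a successful payload runs with admin session privileges, which is what makes stored XSS on management screens especially nasty.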

Security Researchers in the Open Source Ecosystem

In his book Geekonomics, David Rice does a great job of quantifying and analyzing the costs of insecure software. Insecure software is everywhere and it creates staggering problems, not just for the enterprise, but also for individuals. One of the great causes of insecure software is the fact that not enough testing is done before software is released. Often times developers rely on end users as product testers, reporting bugs as they find them. In the rush towards the latest and greatest features debugging and testing are often overlooked, resulting in security flaws.

Analysis of the RoundCube html2text Vulnerability

The RoundCube Webmail Project is a popular, AJAX-enabled web-based email client. RoundCube utilizes a user friendly interface and allows users to access IMAP email via their web browser. RoundCube is written in PHP with a MySQL database and utilizes XHTML and CSS. RoundCube's intuitive interface and advanced UI functionality have made it a popular open source webmail client. Unfortunately, RoundCube versions 0.2-3 beta and 0.2-1 alpha were found to contain a critical flaw that allowed remote attackers to execute arbitrary code with the privileges of the web server (CVE-2008-5619). This article examines the root causes of that flaw and its implications.
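The root cause was RoundCube's bundled html2text code performing a regex substitution whose replacement was evaluated as PHP code (the /e modifier), so crafted input could reach eval(). A Python analogue of the bug pattern and its fix - the template syntax here is invented purely for illustration:

```python
# Sketch: the bug class behind CVE-2008-5619, transliterated to Python.
# Evaluating matched text as code turns a text filter into a code
# execution primitive; using the match only as a lookup key does not.
import re

def unsafe_sub(text):
    # Bug pattern: whatever matched is spliced into code and evaluated.
    return re.sub(r"\{(.+?)\}", lambda m: str(eval(m.group(1))), text)

def safe_sub(text, table):
    # Fix pattern: the match is a dictionary key, never code.
    return re.sub(r"\{(\w+)\}", lambda m: table.get(m.group(1), ""), text)

print(unsafe_sub("total: {2 ** 10}"))               # total: 1024 -- data ran as code
print(safe_sub("Hello {name}", {"name": "world"}))  # Hello world
```

PHP later deprecated and removed the /e modifier for exactly this reason; callback-based replacement is the safe equivalent in both languages.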

About Identity Theft

Identity theft is a common topic in the media and in reality these days. So common in fact that the FTC has set up a website to help highlight the problem and provide details to the public. Identity theft is often closely tied to information security but many people don't understand why. Every time you fill out a form for a credit card, or a customer appreciation club, or even at a doctor's office, you're entering very personal and identifiable information. All that data usually ends up on a computer at some point. If that computer is compromised, then an attacker can steal those details.

What is Fast Flux Hosting?

Fast flux hosting (or fast-flux service networks), commonly utilized by malware bot herders and spammers, is a method used to hide servers or content behind a rapidly changing set of DNS records for a single domain name. This allows attackers to keep content online and avoid a single point of failure. Traditionally, once a malicious host is detected, an ISP can be contacted and the machine pulled offline. This means that phishing sites or bot command and control machines could be taken down as soon as they were identified.
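A toy model of the rotation makes the takedown problem obvious. The IP addresses below are drawn from a documentation range, and real flux networks also rotate NS records and use TTLs of just a few seconds:

```python
# Sketch: a fast-flux domain's A records rotate through a large pool of
# compromised hosts, so every fresh lookup points somewhere new and no
# single takedown kills the service. All IPs fabricated (203.0.113.0/24
# is the TEST-NET-3 documentation range).
import itertools

BOT_POOL = [f"203.0.113.{n}" for n in range(1, 251)]
rotation = itertools.cycle(BOT_POOL)

def resolve(domain, answers=3):
    """Return the next few 'A records' served for the flux domain."""
    return [next(rotation) for _ in range(answers)]

print(resolve("phish.example.com"))  # ['203.0.113.1', '203.0.113.2', '203.0.113.3']
print(resolve("phish.example.com"))  # a different trio on the very next query
```

Defenders therefore chase the domain registration rather than the hosts, since the pool of infected machines is effectively inexhaustible.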

botHunter Released

I've been reading about botHunter, which is a recently announced free bot net detection utility. botHunter is a new system designed by researchers at the Georgia Institute of Technology and the Computer Science Laboratory of SRI International. It is an interesting approach to detecting bot infection in local networks.

Now is the Time to Update Your Firmware

Embedded devices such as home routers are increasingly finding themselves in the cross-hairs of the black hat community. Because most users plug in these devices and forget about them, newly released vulnerabilities are unlikely to be addressed. The end user can hardly be blamed for this situation given the difficulty of applying updates and the inability of most embedded devices to communicate with users. Because black hats need compromised machines with reliable internet connections from which to commit computer crimes, embedded devices make excellent targets. Without proper updates, the cheapest component on your network could easily become the most dangerous.

Social Engineering via Social Networking

New social and business networking sites allow users to connect to colleagues and old friends. These connections, however, are visible to potential attackers. Leveraging these known networks of trust, an attacker could invoke the victim's relationships with third parties in order to piggy-back off of that trust. By providing details to a networking site you could be making a social engineering attack much easier to pull off.

Buying a Computer

My own suggestions for deciding what you want in a new computer and where to find one.

Holy Klez Batman!

A short advisory and examination of the Klez virus, including links to cleaning tools and further information.