Friday, April 30, 2010

Submitted to Black Hat and Defcon - Forensic Methodology

Whew, done with that.  Sam and I have submitted our Digital Forensics Methodology presentation to Black Hat and Defcon, and we are looking forward to a Vegas trip.  Vendor parties, fantastic presentations, booze, and gambling are coming our way.  I am not sure we will get accepted to Black Hat - having attended for a bunch of years, I know how strong their program is - but why not try.  It should be fun, and I have been contributing lately.

Another project that is poking back up: I finally got an update on my CD-ROM project.  Recovering broken and slashed CDs involves restoring structural integrity, cleaning the media so it can be read, using the right software to work through the multiple read errors, and - the part my project covers - manipulating the CD-ROM drive through the ATA command set and drivers to read at 1x speed and slower.
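
For the drive-speed piece, here is a minimal sketch of the idea on Linux, using the CDROM_SELECT_SPEED ioctl from linux/cdrom.h (this is not my actual project code - the device path is an assumption, and not every drive or driver honors the request):

#!/usr/local/bin/perl
# Minimal sketch: request 1x read speed from a CD-ROM drive on Linux.
# Assumes /dev/cdrom exists and that the drive/driver honors the request.
use strict;
use warnings;

my $CDROM_SELECT_SPEED = 0x5322;    # ioctl number from linux/cdrom.h

open(my $cd, '<', '/dev/cdrom') or die "open /dev/cdrom: $!";
ioctl($cd, $CDROM_SELECT_SPEED, 1) or die "ioctl failed: $!";   # 1 = 1x
close($cd);

The point of 1x and slower is to give the drive's error correction the best possible chance on marginal sectors.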

Hey - ask me questions, I'll do my best.

Monday, April 26, 2010

Forensic Tools - Constant debates

I get this every now and then - "what tools do you use?"  Meh, of course I am more about the process and using the right tool for the job, BUT I recognize the familiar-tool bias (you like what you know) and my personal bias toward the way I like to approach problems.  I like to check work from multiple tools and note that in my summary of findings.  So I am answering this from the primary-tool perspective.  With that said, here I go on the overall forensic tool kit.

   Overall forensic tool kit - X-Ways Forensics, combined with the $199 version of dtSearch.  I used to be almost 100% EnCase, then migrated to AccessData FTK, but now I am mostly X-Ways.  I feel it is as flexible as a tool can be, and I don't have to do a monolithic import to get things going.  I just mount the image read-only, start the dtSearch index, open the image in X-Ways, and start processing.  I have been using AccessData FTK as the backup, and for when I have multiple cases that need processing at the same time, and I still check my work with Carrier's Sleuth Kit.  I believe I use tools fairly agnostically, but I just have not needed to reach back to my older version of EnCase.  Also, I never get into flame wars about how your choice rocks and everyone else's is bad - I just put things into categories of the good and bad parts of using whatever tool you are talking about.

SIFT Workstation - I love version 2.0, and I have been warming to the idea of VM forensic kits with shared folders, which let you use a combination of Win32 and *nix tools without long copy times or loading external drives.

Oh, running late - follow up later

Friday, April 23, 2010

Rootkit Dissection

Following some links, I came across a pretty good article on a rootkit dissection, which falls squarely into the category of stuff I like to read - the process used to develop information.  Anyone who knows me personally knows I frequently drone on about "how" to process things and using critical thinking to solve issues and fully understand the problem | incident | root cause.

If you are not familiar with critical thinking, I believe it is the foundation for success at solving open-ended problems - problems that have many methods of deriving a solution, with varying degrees of success - such as digital investigations, hardware troubleshooting, or simply fixing your Windows installation problem.

I recommend reading the "simply fixing your Windows installation" link - uhh, it has in-depth troubleshooting and Sherlock Holmes quotes.

I think both links show the use of critical thinking and an understanding of the logic associated with solving complex issues.  I'll post some of my favorite moments in troubleshooting, both good (solved) and bad (made an ass out of myself).

Note, I have not read any of these books, so check the reviews.  I read books and wrote papers in college on critical thinking, but I really learned it from a dude named Garth, a mentor I had in high school.  He incorporated critical thinking with philosophy and social behavior, and I still vividly remember some of the conversations we had 28 years ago.  Here's to you, Garth - I am glad you refused to buy beer for an underage kid and instead changed my life.


Wednesday, April 21, 2010

Working Hard

Way too much going on - I've started a mandatory security training program and am spending a large portion of my week presenting and (hopefully) empowering individuals to make intelligent security decisions.

I've been working on a Defcon presentation covering a forensic methodology that is representative of the on-the-job training I give my forensic specialists.  Here is the abstract I have assembled, which helps clarify the issues I am looking to improve upon.


A new approach to Forensic Methodology and !!BUSTED!! case studies.
Imagine the following experiment: a unique case is given to three digital forensic analysts, and each is given the opportunity to engage the requester in order to develop the information needed to process the case.  Based on the information gathered, each of the three analysts is asked to provide an estimate of the time to complete the investigation and can proceed with up to 20 hours to process the case.  The analysts are then measured on the total findings, the time required to process the case, the initial information gathered, and the estimated time to process the case.  The results are expected to vary based on experience and individual characteristics, such as the organization, discipline, and attention to detail of each analyst.  Now imagine this same experiment but with only 8 hours to process the case, because that is the way it happens in real life.

David Smith and Samuel Petreski have developed a methodology that fits within the Analysis phase of one of the standard Digital Forensic Analysis Methodologies - PEIA (Preparation, Extraction, Identification, and Analysis) - to provide a structure for consistent results, better development of the requested goals, increased efficiency in fulfilling those goals, and an improved estimate of the time required to complete the request.

This methodology involves the generation and validation of case goals, the evaluation of methods used to achieve those goals, a structure for estimating the effectiveness, time required, and processing results of specific methods, and generalized organization and time management.  The primary goal of this methodology is to provide the structure and optimal path that allow a digital forensic examiner to perform an examination with a high level of efficiency and consistent results.

This presentation provides an introduction to the methodology and applies its key concepts to real, sanitized digital investigations, such as tracking down a suspected executive's adult Craigslist ad, performing an analysis on a compromised system involving social security numbers, and making a determination of intellectual property theft.

Should be fun.  BTW, I love this book:

Monday, April 19, 2010

Super Timeline - Rob Lee & crew

So by now you have probably heard about the Super Timeline from Rob Lee's SANS page - http://blogs.sans.org/computer-forensics/2010/03/19/digital-forensic-sifting-super-timeline-analysis-and-creation/.

Good stuff.  I don't know if you have tried log2timeline, but my first thought was wow, it would be great if it could go and find all of the artifact log files.  Well, they did that too - timescanner was added to search the drive and send the output to log2timeline, sweet!  From what I have read, the two will be combined in the future.

Rob Lee combined this with Carvey's registry-time Perl script and fls from The Sleuth Kit, and jacks it all together with old-school mactime.pl, also from The Sleuth Kit.  Rob makes putting this together easier by including these components in the SIFT Workstation (info found in the Super Timeline page).
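
To give a feel for the moving parts, here is a rough sketch of the filesystem half of that pipeline (the image path and output names are hypothetical, and the log2timeline/timescanner step is only stubbed in as a comment - see Rob's post for the real recipe):

#!/usr/local/bin/perl
# Rough sketch: build a body file from filesystem metadata with fls,
# then render a timeline with mactime (both from The Sleuth Kit).
use strict;
use warnings;

my $image = '/cases/suspect.dd';    # hypothetical image path
my $body  = '/cases/body.txt';

# Walk the filesystem and emit MAC times in body-file format
system("fls -r -m C: $image > $body") == 0 or die "fls failed: $?";

# log2timeline/timescanner artifact output (in body format) would be
# appended to $body here before rendering.

# Render the consolidated body file as a comma-delimited timeline
system("mactime -b $body -d > /cases/timeline.csv") == 0
    or die "mactime failed: $?";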

Ok, intro done...  I have run some tests against older cases and loved the results*.  It USED to be a lot of work to get log source 1 and log source 2, consolidate them, and review, then make a determination on whether log source 3 was needed.  This makes it much quicker and moves it up the SP index (SPI) appropriately, since the SPI is a combination of factors, including estimated time and estimated effectiveness in meeting the case goals.  SPI is an artifact from the digital forensic methodology I have been teaching my staff and formalizing into a presentation for Defcon 18 and Black Hat 2010.

*For legal purposes - I didn't find any data that changed any of my conclusions; it only enhanced the conclusions that I or my teams had generated.

Sunday, April 18, 2010

Defcon 18 and Black Hat

Anyone else getting excited (although it is really early) for Defcon 18 and Black Hat in Las Vegas?  I am!  I'm also working on a presentation with Sam Petreski on a new approach to forensic methodology, which I feel is really interesting.  I haven't presented at Defcon or Shmoocon in a couple of years, although I have good stuff that I am working on.  FOLLOW-THRU!

It gets away from the classic framework methodologies like:
Classic DOJ PEIA
Classic Brian Carrier
Fantastic whitepaper comparing methodologies
(Unfortunately all PDFs, so be careful with your patches)

Instead, it focuses on the analysis phase of the forensic specialist's work, from the initial information gathering to preparing to develop the report - i.e., when the "man" sits in front of the forensic PC loaded with tools and the images to examine.

Here is the abstract, which helps develop the methodology.

A new approach to Forensic Methodology and !!BUSTED!! case studies.
Imagine the following experiment: a unique case is given to three digital forensic analysts, and each is given the opportunity to engage the requester in order to develop the information needed to process the case.  Based on the information gathered, each of the three analysts is asked to provide an estimate of the time to complete the investigation and can proceed with up to 20 hours to process the case.  The analysts are then measured on the total findings, the time required to process the case, the initial information gathered, and the estimated time to process the case.  The results are expected to vary based on experience and individual characteristics, such as the organization, discipline, and attention to detail of each analyst.  Now imagine this same experiment but with only 8 hours to process the case, because that is the way it happens in real life.

David Smith and Samuel Petreski have developed a methodology that fits within the Analysis phase of one of the standard Digital Forensic Analysis Methodologies - PEIA (Preparation, Extraction, Identification, and Analysis) - to provide a structure for consistent results, better development of the requested goals, increased efficiency in fulfilling those goals, and an improved estimate of the time required to complete the request.

This methodology involves the generation and validation of case goals, the evaluation of methods used to achieve those goals, a structure for estimating the effectiveness, time required, and processing results of specific methods, and generalized organization and time management.  The primary goal of this methodology is to provide the structure and optimal path that allow a digital forensic examiner to perform an examination with a high level of efficiency and consistent results.

This presentation provides an introduction to the methodology and applies its key concepts to real, sanitized digital investigations, such as tracking down a suspected executive's adult Craigslist ad, performing an analysis on a compromised system involving social security numbers, and making a determination of intellectual property theft.


Interested in more?  Here are some books Amazon recommends - don't worry, I won't list a book that I haven't read and don't think is worth it.

Don't ever get called into a deposition or court without reading this book!  It is the bomb, and after understanding the concepts, it will be your go-to book when you have a follow-up.

Uh, yeah.  Another must-read.  I have it on my Kindle now for quick(er) reference as well.  I seem to absorb more each time I read it, and I really like the 2nd edition.

Saturday, April 17, 2010

How can you not love Bruce Schneier's Crypto-Gram?

I am not sure if you are getting this newsletter, but it is always a thrill a minute.  I can't say I agree with him on all of his positions, but they are always thought-provoking and well thought out.

http://www.schneier.com/crypto-gram-1004.html - link.

The highlight this month is his analysis of the story that "Facebook Chief Executive Mark Zuckerberg declared the age of privacy to be over".  Yes, I am pretty tired of hearing "users are idiots", so it was refreshing to read a position piece arguing that the moneymakers of the world are working hard to make it easy for you to lose your privacy.

Perl script for renaming music files

Ok, I used one of the free iPod backup programs and it copied all of the files with their native names, e.g. UBOT.m4a - great, now I have to write a Perl program to read the metadata and rename the files.

Quick and dirty, but here you go:

#!/usr/local/bin/perl
# Rename iPod-copied music files (e.g. UBOT.m4a) to "Artist-Title.ext"
# based on their embedded metadata.
use strict;
use warnings;
use MP3::Info;
use MP4::Info;

MP3s();
MP4s();

# Replace characters that are unsafe in filenames.
sub stripUnwanted
{
    my $filename = shift;
    return 'Unknown' unless defined $filename;
    $filename =~ tr{\\/}{-};           # path separators become dashes
    $filename =~ tr{*?}{X};            # wildcards become X
    $filename =~ tr{"><[]|:;,'=}{_};   # other unsafe characters become _
    return $filename;
}

sub MP3s {
    my @files = <*.mp3>;

    foreach my $file (@files) {
        my $tag = get_mp3tag($file) or next;   # skip files with no ID3 tag
        my $artist = stripUnwanted($tag->{ARTIST});
        my $title  = stripUnwanted($tag->{TITLE});

        print "Artist: $artist - Title: $title\n";
        rename($file, "$artist-$title.mp3")
            or warn "rename $file failed: $!";
    }
}

sub MP4s {
    my @files = <*.m4*>;

    foreach my $file (@files) {
        my $mp4 = MP4::Info->new($file) or next;   # skip unreadable files
        my $artist = stripUnwanted($mp4->artist);
        my $title  = stripUnwanted($mp4->title);

        print "$artist - $title\n";
        rename($file, "$artist-$title.m4a")
            or warn "rename $file failed: $!";
    }
}

Saturday, April 10, 2010

Back on track - working on lots of projects...

I'm back and have lots of good info to share.  I am working on my Defcon presentation and better password dictionary development.

That's it for now, but good stuff coming.