Thursday, August 5, 2010
Defcon 18 Presentation - DC Smith & Sam Petreski - Forensic Methodology
Wow, time has flown by... The presentation went great! Sam and I were really amazed at how many folks at DEFCON connected with our presentation. The conference was a bit crowded, and it was hard to get into the presentations you wanted to see, but you have probably already read all those reports.
Here is the presentation that Sam and I gave on July 30th. Please feel free to email us or let us know your thoughts!
Saturday, June 5, 2010
It is official...
Posted @ defcon.org today:
A NEW APPROACH TO FORENSIC METHODOLOGY - !!BUSTED!! CASE STUDIES
DAVID C. SMITH, GEORGETOWN UNIVERSITY AND HCP FORENSIC SERVICES
SAMUEL PETRESKI, GEORGETOWN UNIVERSITY AND REMOTE IT CONSULTING
Imagine the following experiment: a unique case is given to three digital forensic analysts, and each is given the opportunity to engage the requester in order to develop the information needed to process the case. Based on the information gathered, each of the three analysts is asked to provide an estimate to complete the investigation and may spend up to 20 hours processing the case. The analysts are then measured on their total findings, the time required to process the case, the initial information gathered, and the estimated time to process the case. The results are expected to vary with the experience and individual characteristics of each analyst, such as organization, discipline, and attention to detail. Now imagine the same experiment with only 8 hours to process the case, because that is the way it happens in real life.
David Smith and Samuel Petreski have developed a methodology that fits within the Analysis phase of one of the standard digital forensic analysis methodologies - PEIA (Preparation, Extraction, Identification, and Analysis) - to provide a structure for consistent results, better development of the requested goals, increased efficiency in fulfilling those goals, and an improved estimate of the time required to complete the request.
This methodology involves the generation and validation of case goals; the evaluation of methods used to achieve those goals; a structure for estimating the effectiveness, time required, and processing results of specific methods; and generalized organization and time management. The primary goal of this methodology is to define the structure and optimal path that allow a digital forensic examiner to perform an examination with a high level of efficiency and consistent results.
This presentation provides an introduction to this methodology and applies its key concepts to real, sanitized digital investigations, such as tracking down an executive's suspected adult Craigslist ad, analyzing a compromised system involving social security numbers, and determining intellectual property theft.
David C. Smith works as the CSO for Georgetown University and a co-owner of HCP Forensic Services, providing information security programs, digital forensics, and expert witness testimony. He has been in the technical field for over 20 years and enjoys engaging in complex technical problems.
Samuel Petreski works as a Senior Security Analyst for Georgetown University and an owner of Remote IT Consulting. Samuel has worked mostly in higher-ed, focusing on network architecture and administration as well as building and administering scalable network security solutions. He possesses over 10 years of experience in the IT field, working in very diverse environments.
Saturday, May 29, 2010
What's Up?
Been working a lot, getting very excited about our Defcon 18 presentation and tool release. We haven't heard back from Black Hat, but we always thought that was a really long shot. A better description of the work is that it is part methodology, part process, and part expert system.
The primary goals are to improve the initial request, develop better agreed-upon investigation goals, and improve time estimation. Then, in the analysis phase, it offers better guidance in choosing the optimal methods and a structure for time management. Fun, eh?
Between that and wrapping up about 6 cases with my two teams (HCP Forensics and GU InfoSec), it has been a crazy couple of weeks. I still have to complete the last two rounds of my testing for the Tableau TD1 as well!
Saturday, May 15, 2010
Tableau TD1 Forensic Imager Initial Review
Yea, I finally got paid for wrapping up some cases; the worst was 90+ days overdue and the best was 45+ days overdue, BUT the profit from these cases was earmarked to purchase new equipment. The first purchase, from ForensicPC.com, was the Tableau TD1 Forensic Imager. I priced it around and found I could have shaved $20 off the total price, but I would have had to wait on a full quote from a site that didn't have an online cart. I also purchased it with the Pelican 1450 case (other sites had a mark-up; here the case was free).
ForensicPC's ordering process was just okay. I submitted on a Saturday after depositing the check, and they processed the order on Monday. I got an email stating that I went from "order received" to "paid", but then didn't hear anything for 8 days. I wrote a note about the status and got an apology email saying that I should have gotten a message (maybe spam filtered) telling me about the delay on the TD1 and the case. Since I filter spam rather than delete it, I checked, and there was no message. I did get emails on the ship status and tracking, and it arrived yesterday - Whoo-hoo!
Ok, enough overhead on the story. I unpacked and inventoried everything and was impressed with the unit's size and features. I had previously used the Voom HardCopy II and noticed a few differences from what I was used to. First, speed. I ran it through some testing (full output spreadsheet to come when complete), and the speed was impressive at 6GB+ on my equipment with MD5 and SHA1. My initial tests were mostly for functionality rather than to quantify the speed, but I was happy right away with the overall speed for SATA disk-to-disk, disk-to-file, and wipe. Second, I like the setup and input of examiner and case info. I thought it might suck with slow typing, but since I am used to iPhones it wasn't that bad (I read that you can use a USB keyboard, but that is a future test).
Now a little of the not-thrilled-about / maybe-getting-used-to. The Voom HC2 had an NTFS format and could create a full-size disk-to-file image, e.g. an 80GB drive to an 80GB file. Sure, it had a funky quirk where once you mounted a Voom HC2 NTFS drive on any system it was no longer recognizable by the Voom, but I like having large files without a follow-up conversion. The TD1 creates FAT32 formats, and underneath, the TD1's structure seems to be based on "chunks", with a configurable chunk size. I processed some images and am not sure it will be a big deal for me. All my tools handle multi-file images, and the TD1 puts them in nice directories with the dates.
I updated the firmware first thing out of the box, and the process was pretty nice. I connected via a FireWire 400 port and ran some Tableau Windows software. The software saw my TD1 and recommended the firmware update. It ran without any issue, and I powered down, unplugged everything, and powered back up to reread the firmware. Tableau markets the ease of upgrade, and I would agree.
I should be able to post my validation, functionality, and speed results in the next couple of weeks. I've got to make more progress on Sam's and my Defcon presentation.
-Dave
Thursday, May 13, 2010
Defcon 18 Presentation
Good news: Sam and I got an announcement this morning that our presentation "A New Approach to Forensic Methodology - !!BUSTED!! case studies" has been accepted by Defcon. We are pretty excited, and we always love Vegas - it is the bomb.
The presentation is shaping up nicely, and Sam is working on the software component that will really demonstrate our practical methodology. Again, very excited. Buzz me if you want some up-front information, but I'll probably hold off on posting some of the more interesting details until we get most of the work behind us.
Ok, a completely different topic: I am loving the setup for my primary system at home. A quick review:
Intel i7 chip with custom cooling, overclocked to i7-965 speeds using the EasyTune app from the Gigabyte motherboard. Stress tested with Prime95, keeping the CPU / system temps under 80C at full load and 43C / 46C at typical load. Windows 7 64-bit, 8GB of memory, four 1TB drives, one 1.5TB drive, and eSATA for a Thermaltake BlacX dock.
Ok, the part I like: I have become a big fan of Sun VirtualBox. I can't put my finger on it, but my overall experience is that it seems less invasive than VMware and gives me everything I want. The VMs have 1GB of RAM and different numbers of CPU cores assigned. VMs include Developer XP, Ubuntu-64 (developer and workstation), Forensic XP, a SIFT Workstation imported from VMware, Dirty XP (for checking out dubious sites and software), and Georgetown XP. I also have separate malware XP and Ubuntu systems with additional protections.
Best news, it runs like a champ - I don't feel any pain when running VMs alongside AV / Secunia PSI. I can schedule snapshots and file them away. Da Bomb-bay!
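On the snapshot scheduling, here is a minimal sketch of the kind of wrapper you could drop into cron or Task Scheduler - it uses VBoxManage's standard snapshot syntax, and the VM names are just examples from the list above:
#!/usr/local/bin/perl
# Minimal sketch: take timestamped snapshots of a list of VirtualBox VMs.
# Assumes VBoxManage is on the PATH; the VM names are examples, not gospel.
use strict;
use warnings;
use POSIX qw(strftime);

my @vms   = ('Forensic XP', 'Dirty XP', 'Georgetown XP');
my $stamp = strftime('%Y%m%d-%H%M', localtime);

foreach my $vm (@vms) {
    # VBoxManage snapshot <vm> take <name> creates the snapshot
    system('VBoxManage', 'snapshot', $vm, 'take', "auto-$stamp") == 0
        or warn "snapshot failed for $vm\n";
}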
BTW, see you in Vegas for Defcon and Black Hat - I love the vendor parties!!!
Thursday, May 6, 2010
Facebook Arrays
I'm helping a friend out with a Facebook application, and I had to deal with an array-of-arrays export from an FQL multiquery. Sheeesh, fields and values mismatched all over the place! However, PHP's dynamic arrays make it a little easier with the following code.
# $multiout is the array delivered from the FQL multiquery
foreach ($multiout as $fqlset) { # Strip the wrapper array
    $messagebody = $fqlset['fql_result_set']; # Quote the key; a bare string constant here is a bug waiting to happen
    foreach ($messagebody as $messageArray) { # Strip the message vs. metadata
        foreach ($messageArray as $key => $value) { # Finally drop the 'record name' and keep the value
            $ordered_array[$key][] = $value;
        }
    }
}
At the end of the process, you have $ordered_array, keyed by field name, with the values combined across the multiquery's result sets.
Monday, May 3, 2010
Why I think a lot of online bloggers (potentially you) are idiots!
Yes, idiots - you know, filling out the ID-10-T form in triplicate. Yes, yes, I'll choose the dictionary's first listing, "an utterly foolish or senseless person", and not the psychology term "a person of the lowest order in a former classification of mental retardation, having a mental age of less than three years old and an intelligence quotient under 25". I don't think they are that bad.
Ok, as you might know, I am a fan of critical thinking (link to the wiki description), and it appears that more and more arguments rely on emotion rather than sound logic or reasoning. A little bit of everyone dies when we have nothing but emotional arguments to make points (that is supposed to be funny, because I didn't use any reasoning or logic and still tried to convince you of a point).
What happened to making points with reason to educate, pontificate, or discuss subjects? You create your counterpoints and summarize, and if your argument has merit, then you might convince someone of your point of view. I don't even care how lame it is or how much I disagree; I'll listen, read, and process.
Also, it used to be easy to avoid, because you could learn the fanatical conversations and steer clear of the subjects, like IT certification (google search), Microsoft vs. Novell, or Windows vs. *nix, and so on.
Final thoughts:
1. If you win an argument with emotional appeals, say with "You don't want our country nuked, do you?", aren't you just going to lose the argument to someone else with something similar? Say, "Fewer baby seals will get clubbed by saving electricity and going green all over."
2. Research it! What are your most valid points, and what are the best counterpoints for you to address?
3. Respect it! Treat people like idiots and they will either be pissed off or act like idiots.
-Dave
Friday, April 30, 2010
Submitted to Black Hat and Defcon - Forensic Methodology
Whew, done with that. Sam and I have submitted our digital forensics methodology presentation to Black Hat and Defcon, and we are looking forward to a Vegas trip. Vendor parties, fantastic presentations, booze, and gambling are coming our way. I am not sure we will get accepted to Black Hat, based on going for a bunch of years and knowing their program, but why not. Should be fun, and I have contributed lately.
Another project that is poking up: I finally got an update on my CDROM project. Recovering broken and slashed CDs involves restoring structural integrity, clearing the media for reading, having the right software to work through the multiple read errors, and - my part of the project - manipulating the CD-ROM drive with the ATA command set and drivers down to 1x speeds and slower.
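On Linux you can get partway there without touching the ATA layer at all; here is a minimal sketch, assuming a drive at /dev/cdrom and using the CDROM_SELECT_SPEED ioctl from linux/cdrom.h (the full project drives the ATA command set for finer control):
#!/usr/local/bin/perl
# Minimal sketch: force a CD-ROM drive down to 1x on Linux.
# Assumes a drive at /dev/cdrom; CDROM_SELECT_SPEED is 0x5322 in linux/cdrom.h.
use strict;
use warnings;
use Fcntl qw(O_RDONLY O_NONBLOCK);

my $CDROM_SELECT_SPEED = 0x5322;

# O_NONBLOCK lets us open the device even if the disc is unreadable.
sysopen(my $cd, '/dev/cdrom', O_RDONLY | O_NONBLOCK)
    or die "open /dev/cdrom: $!\n";
# Passing a plain number makes perl hand the value itself (not a pointer) to ioctl().
ioctl($cd, $CDROM_SELECT_SPEED, 1) or die "CDROM_SELECT_SPEED failed: $!\n";
print "Drive limited to 1x - slower reads are gentler on damaged media.\n";
close($cd);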
Hey - ask me questions, I'll do my best.
Monday, April 26, 2010
Forensic Tools - Constant debates
I get this every now and then - "What tools do you use?" Meh, of course I am more about the process and using the right tool for the job, BUT I recognize the familiar-tool bias (you like what you know) and my personal bias toward the way I like to approach problems. I like to check work with multiple tools and note that in my summary of findings. So I am answering this from the primary-tool perspective. With that said, here I go on the overall forensic tool kit.
Overall forensic tool kit - X-Ways Forensics, combined with the $199 version of dtSearch. I used to be almost 100% EnCase, then migrated to AccessData FTK, but now it's mostly X-Ways. I feel it is as flexible as a tool can be; I don't have to do a monolithic import to get things going. I just mount the image read-only, start the dtSearch index, open the image in X-Ways, and start processing. I have been using AccessData FTK as the backup and for when I have multiple cases that need processing at the same time, and I still check my work with Carrier's Sleuth Kit. I believe I use tools fairly agnostically, but I just have not needed to reach back to my older version of EnCase. Also, I never get into flame wars about how your choice rocks and everyone else's is bad - I just put things in categories of the good and bad parts of whatever tool you are talking about.
SIFT Workstation - I love version 2.0 and have been warming to the idea of VM forensic kits with shared folders that let you use a combination of win32 and *nix tools without large copy times or loading external drives. A sketch of the shared-folder setup follows.
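Wiring up the shared folder is a one-liner with VBoxManage; a minimal sketch, where the VM name and evidence path are hypothetical:
#!/usr/local/bin/perl
# Minimal sketch: attach a read-only evidence folder to a VirtualBox VM.
# The VM name and host path are hypothetical examples.
use strict;
use warnings;

system('VBoxManage', 'sharedfolder', 'add', 'SIFT Workstation',
       '--name', 'evidence', '--hostpath', '/cases/evidence',
       '--readonly') == 0
    or die "sharedfolder add failed\n";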
Oh, running late - follow up later
Overall forensic tool kit - X-ways Forensics, combined with the $199 version of DTSearch. I used to be almost 100% Encase, then migrating to Access Data FTK, but now mostly X-ways. I feel it is as flexible as it can be, I don't have to do a monolithic import to get thing going. I just mount the image read-only and start the DTindex and open in X-ways and start processing. I have been using Access Data FTK as the backup and when I have multiple cases that need processing at the same time, and I still check my work with Carrier's Sleuth Kit. I believe I use tools fairly agnostic, but I just have not needed to reach back to my older version of Encase. Also, I never get into flame wars about how your choice rocks and everyone else is bad - I just but things in a category of the good and bad parts of using whatever tool you are talking about.
SIFT workstation, I love version 2.0 and have been warming to the idea of VM forensic kits with shared folders that allow to use the combination of win32 and *nix tools without large copy times or loading external drives.
Oh, running late - follow up later
Friday, April 23, 2010
Rootkit Dissection
Following some links, I came across a pretty good article on a rootkit dissection, which falls squarely into the category of stuff I like to read - the process used to develop information. If you know me personally, you know I frequently drone on about "how" to process things and about using critical thinking to solve issues and fully understand the problem | incident | root cause.
If you are not familiar with critical thinking, I believe it is the foundation for success with open-set problems - problems that have many methods of deriving a solution, with various degrees of success, such as digital investigations, hardware troubleshooting, or simply fixing your Windows installation problem.
I recommend reading the "simply fixing your windows installation" link; uhh, it has in-depth troubleshooting and Sherlock Holmes quotes.
I think both links show the use of critical thinking and the logic associated with solving complex issues. I'll post some of my favorite moments in troubleshooting, both good (solved) and bad (made an ass out of myself).
Note, I have not read any of these books, so check the reviews. I read books and wrote papers on critical thinking in college, but I really learned it from a dude named Garth, a mentor I had in high school. He incorporated critical thinking with philosophy and social behavior, and I still vividly remember some of the conversations we had 28 years ago. Here's to you, Garth - I am glad you refused to buy beer for an underage kid and instead changed my life.
Wednesday, April 21, 2010
Working Hard
Way too much going on - I've started a mandatory security training program and am spending a large portion of my week presenting and (hopefully) empowering individuals to make intelligent security decisions.
I've also been working on the Defcon presentation of a forensic methodology that is representative of the on-the-job training I give my forensic specialists. Here is the abstract I have assembled, which helps clarify the issues I am looking to improve upon.
A new approach to Forensic Methodology and !!BUSTED!! case studies.
Imagine the following experiment: a unique case is given to three digital forensic analysts, and each is given the opportunity to engage the requester in order to develop the information needed to process the case. Based on the information gathered, each of the three analysts is asked to provide an estimate to complete the investigation and may spend up to 20 hours processing the case. The analysts are then measured on their total findings, the time required to process the case, the initial information gathered, and the estimated time to process the case. The results are expected to vary with the experience and individual characteristics of each analyst, such as organization, discipline, and attention to detail. Now imagine the same experiment with only 8 hours to process the case, because that is the way it happens in real life.
David Smith and Samuel Petreski have developed a methodology that fits within the Analysis phase of one of the standard digital forensic analysis methodologies - PEIA (Preparation, Extraction, Identification, and Analysis) - to provide a structure for consistent results, better development of the requested goals, increased efficiency in fulfilling those goals, and an improved estimate of the time required to complete the request.
This methodology involves the generation and validation of case goals; the evaluation of methods used to achieve those goals; a structure for estimating the effectiveness, time required, and processing results of specific methods; and generalized organization and time management. The primary goal of this methodology is to define the structure and optimal path that allow a digital forensic examiner to perform an examination with a high level of efficiency and consistent results.
This presentation provides an introduction to this methodology and applies its key concepts to real, sanitized digital investigations, such as tracking down an executive's suspected adult Craigslist ad, analyzing a compromised system involving social security numbers, and determining intellectual property theft.
Should be fun.
Monday, April 19, 2010
Super Timeline - Rob Lee & crew
So by now you have probably heard about the Super Timeline from Rob Lee's SANS page - http://blogs.sans.org/computer-forensics/2010/03/19/digital-forensic-sifting-super-timeline-analysis-and-creation/.
Good stuff. I don't know if you have tried log2timeline, but my first thought was: wow, it would be great if it could go and find all of the artifact log files. Well, they did that too - TimeScanner was added to search the drive and send the output to log2timeline. Sweet! From what I have read, the two will be combined in the future.
Rob Lee combined this with Carvey's registry-time perl script, fls from the Sleuth Kit, and jacked it all together with the old-school mactime.pl, also from the Sleuth Kit. Rob makes putting this together easier by including these components in the SIFT Workstation (info found on the Super Timeline page).
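If you haven't seen the body-file flow that mactime consumes, here is a minimal sketch of the Sleuth Kit half of it - the image name and partition offset are made up, and the Super Timeline page covers the log2timeline side:
#!/usr/local/bin/perl
# Minimal sketch of the Sleuth Kit half of a timeline: fls writes the
# body file, mactime renders it. Image name and offset are hypothetical.
use strict;
use warnings;

my $image  = 'suspect.dd';  # hypothetical raw image
my $offset = 63;            # hypothetical partition offset in sectors

# -m prefixes paths with the mount point, -r recurses, -o picks the partition
system("fls -m C: -r -o $offset $image > body.txt") == 0
    or die "fls failed\n";
# -b reads the body file, -d writes comma-delimited output
system("mactime -b body.txt -d > timeline.csv") == 0
    or die "mactime failed\n";
print "Timeline written to timeline.csv\n";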
Ok, intro done... I have run some tests against older cases and loved the results*. It USED to be a lot of work to pull log source 1 and log source 2, consolidate them, review, and then determine whether log source 3 was needed. This makes it much quicker, which moves the method up the SP Index (SPI) appropriately, since the SPI is a combination of factors including estimated time and estimated effectiveness in meeting the case goals. The SPI is an artifact of the digital forensic methodology I have been teaching my staff and am formalizing into a presentation for Defcon 18 and Black Hat 2010.
*For legal purposes - I didn't find any data that changed any of my conclusions; the new results only enhanced the conclusions that I or my teams had generated.
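To make the SPI idea a little more concrete, here is a minimal sketch - the scoring below (effectiveness per hour) and the method list are made-up illustrations, not the actual formula from the methodology:
#!/usr/local/bin/perl
# Minimal sketch of an SPI-style ranking: score candidate methods by
# estimated effectiveness against estimated time. The scoring and the
# method list are made-up assumptions, not the formula from the talk.
use strict;
use warnings;

my %methods = (
    # method            => [ estimated effectiveness (0-1), estimated hours ]
    'super timeline'    => [ 0.8, 1.5 ],
    'keyword search'    => [ 0.6, 3.0 ],
    'manual log review' => [ 0.7, 6.0 ],
);

sub spi {
    my ($eff, $hours) = @{ $_[0] };
    return $eff / $hours;   # effectiveness per hour; the real SPI has more factors
}

foreach my $m (sort { spi($methods{$b}) <=> spi($methods{$a}) } keys %methods) {
    printf "%-18s SPI %.2f\n", $m, spi($methods{$m});
}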
Sunday, April 18, 2010
Defcon 18 and Black Hat
Anyone else getting excited (although it is really early) for Defcon 18 and Black Hat in Las Vegas? I am! I'm also working on a presentation with Sam Petreski on a new approach to forensic methodology, which I feel is really interesting. I haven't presented at Defcon or Shmoocon in a couple of years, although I have good stuff that I am working on. FOLLOW-THRU!
It gets away from the classic framework methodologies like:
Classic DOJ PEIA
Classic B. Carrier
Fantastic whitepaper comparing methodologies
(Unfortunately all PDFs, so be careful with your patches)
But it instead focuses on the analysis phase of the forensic specialist, from the initial information gathering to preparing to develop the report - i.e., when the "man" sits in front of the forensic PC loaded with tools and the images to examine.
Here is the abstract which helps develop the methodology.
A new approach to Forensic Methodology and !!BUSTED!! case studies.
Imagine the following experiment: a unique case is given to three digital forensic analysts, and each is given the opportunity to engage the requester in order to develop the information needed to process the case. Based on the information gathered, each of the three analysts is asked to provide an estimate to complete the investigation and may spend up to 20 hours processing the case. The analysts are then measured on their total findings, the time required to process the case, the initial information gathered, and the estimated time to process the case. The results are expected to vary with the experience and individual characteristics of each analyst, such as organization, discipline, and attention to detail. Now imagine the same experiment with only 8 hours to process the case, because that is the way it happens in real life.
David Smith and Samuel Petreski have developed a methodology that fits within the Analysis phase of one of the standard digital forensic analysis methodologies - PEIA (Preparation, Extraction, Identification, and Analysis) - to provide a structure for consistent results, better development of the requested goals, increased efficiency in fulfilling those goals, and an improved estimate of the time required to complete the request.
This methodology involves the generation and validation of case goals; the evaluation of methods used to achieve those goals; a structure for estimating the effectiveness, time required, and processing results of specific methods; and generalized organization and time management. The primary goal of this methodology is to define the structure and optimal path that allow a digital forensic examiner to perform an examination with a high level of efficiency and consistent results.
This presentation provides an introduction to this methodology and applies its key concepts to real, sanitized digital investigations, such as tracking down an executive's suspected adult Craigslist ad, analyzing a compromised system involving social security numbers, and determining intellectual property theft.
Interested in more? Here are some books Amazon recommends - don't worry, I won't list a book that I haven't read and found worth it.
Don't ever get called into a deposition or court without reading this book! It is the bomb-bay, and once you understand the concepts, it will be your go-to book when you have a follow-up.
Uh, yea. Another must-read. I have it on my Kindle now for quick(er) reference as well. I seem to absorb more each time I read it, and I really like the 2nd edition.
Saturday, April 17, 2010
How can you not love Bruce Schneier's CryptoGram?
I am not sure if you are getting this newsletter, but it is always a thrill a minute. I can't say I side with him on all of his positions, but it is always thought-provoking and well thought out.
http://www.schneier.com/crypto-gram-1004.html - link.
Great this month is his analysis of the story that "Facebook Chief Executive Mark Zuckerberg declared the age of privacy to be over". Yes, I am pretty tired of hearing "users are idiots", so it was refreshing to read a position piece on how the moneymakers of the world are working hard to make it really easy for you to lose your privacy.
Perl script for renaming music files
Ok, I used one of the free iPod backup programs, and it copied all of the files with their native names, e.g. UBOT.m4a - great, now I have to write a perl program to read the metadata and rename the files.
Quick and dirty, but here you go:
#!/usr/local/bin/perl
# Rename exported music files based on their metadata tags.
use strict;
use warnings;
use MP4::Info;
use MP3::Info;

MP3s();
MP4s();

# Replace characters that are unsafe or annoying in filenames
sub stripUnwanted {
    my $filename = shift;
    return '' unless defined $filename;
    $filename =~ tr{\\/}{-};          # slashes become dashes
    $filename =~ tr{*?}{X};           # wildcards become X
    $filename =~ tr{"><[]|:;,'=}{_};  # the rest become underscores
    return $filename;
}

sub MP3s {
    foreach my $file (glob('*.mp3')) {
        my $tag = get_mp3tag($file) or next;  # skip files without a tag
        my $artist = stripUnwanted($tag->{ARTIST});
        my $title  = stripUnwanted($tag->{TITLE});
        print "Artist: $artist - Title: $title\n";
        rename($file, "$artist-$title.mp3");
    }
}

sub MP4s {
    foreach my $file (glob('*.m4*')) {
        my $mp4 = MP4::Info->new($file) or next;  # skip unreadable files
        my $artist = stripUnwanted($mp4->artist);
        my $title  = stripUnwanted($mp4->title);
        print "$artist - $title\n";
        rename($file, "$artist-$title.m4a");
    }
}
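Usage is as quick and dirty as the script: drop it into the directory of exported files and run it, and everything gets renamed in place. If I were polishing it, a dry-run flag and collision handling (two tracks with the same artist and title) would come first - don't point it at the only copy of your library.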
Saturday, April 10, 2010
Back on track - working on lots of projects...
I'm back and have lots of good info to share. I am working on my Defcon presentation and on better password dictionary development.
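For the dictionary work, the general idea is word mangling - expanding a base wordlist with common mutations. A minimal sketch, where the rules and the stand-in words are made-up illustrations:
#!/usr/local/bin/perl
# Minimal sketch of dictionary mangling: expand a base wordlist with a
# few common mutations. The rule set and words are made-up illustrations.
use strict;
use warnings;

my @words = qw(georgetown forensics defcon);  # stand-in base wordlist

foreach my $w (@words) {
    my %out = map { $_ => 1 } (               # the hash dedupes candidates
        $w,
        ucfirst($w),                          # Capitalized
        uc($w),                               # ALL CAPS
        $w . '1', $w . '123',                 # digit suffixes
        do { (my $l = $w) =~ tr/aeio/4310/; $l },  # basic leetspeak
    );
    print "$_\n" for sort keys %out;
}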
That's it for now, but good stuff coming.