
SANS DFIR Summit, Forensic4cast award, my presentations, now back to work!


The SANS Digital Forensic Incident Response Summit in Austin ROCKED! Rob Lee and all the SANS folks put on an awesome show.

SANS 508

For me it started with the new SANS 508 class. If you haven't seen the advertisements, check out "The APT is already in your network. Time to go hunting -- Learn how in the new training course SANS FOR508". All I can say is it's true. Here are a few reasons why:

  • Conducting APT investigations requires "outside of the box" thinkers, and 508 framed that picture well with a cutting-edge curriculum. For instance, you learn necessities that are not even supported in commercial products yet, such as Volume Shadow Copies. How are you going to mount those with EnCase or FTK??
  • I have experience teaching and know first-hand how difficult it is to create labs. It was obvious that months if not years of effort were put into the new 508 lab. Also, having had real-world experience conducting APT investigations, I can tell you that the labs are so real it's scary! No joke, 2 weeks later and I am still playing with the lab images provided.
  • Speaking of labs, almost every section in the course has a lab associated with it. So not only do you learn the concepts, you get to apply them hands-on. The labs aren't point-and-click like some other training providers'; these actually require thinking! I was also told that the labs build on each other throughout other SANS courses. For example, the malware you recover in the SANS 508 lab is the same malware you analyze in the SANS 610 - Reverse Engineering class.
  • Unlike other classes, where I have always been the first one to finish and solve all the problems, I can honestly say I was challenged in 508 (Yes, Rob Lee, I was paying attention between conference calls :-)). For me, the memory analysis (Volatility and Redline) section was the biggest learning curve. Advanced topics like these can be eye-openers to the fact that there is always room to improve skills and keep learning at any level.
  • I was most impressed by how all the section content (e.g. file systems, memory analysis, timeline analysis, etc.) came together. Every investigation starts the same way, or at least should: with an analysis plan. 508 did a great job explaining how and when to use the various tools/methods introduced from a tactical perspective.
  • Oh yeah, how can I forget .. We also got a copy of F-Response TACTICAL, a 64 GB thumb drive, and the book File System Forensic Analysis .. now that is awesome!!
Overall SANS 508 was an awesome class. I have to give a shout out to Alissa Torres who taught part of the memory analysis section and did a GREAT job. She also gave one of the best presentations at the summit, "Reasons Not to "StayinYourLane" as a Digital Forensics Examiner".

Also, the SANS @ Night presentation by Paul Henry on setting up VMware ESXi on Mac Minis was really different and cool. I might have to go buy a few Mac minis now..

SANS DFIR Summit

Now on to the summit part.. Having been to a lot of industry conferences, if you are a person who enjoys hard core DFIR and don't want to be annoyed by eDiscovery nuisances, the SANS DFIR summit is the premier place to learn, network and collaborate.

In my opinion, the networking and collaboration opportunities alone are worth attending for. By all means I have some good friends at home in Chicago, but they're not geeks. Sometimes all I want to do is talk DFIR. On that note, I did nothing but talk geek with folks (too many to list) whom I had never met in person before, as well as old friends. In fact, I think David Kovar, Tom Yarrish and I collaborated a little too much... we could keep a team of programmers (or maybe just Steve Gibson) busy for the next year with all the great ideas we cooked up. Speaking of Steve Gibson, thanks for being a great local host in your home town, Austin.

Stemming from a conversation with David Kovar and Rob Lee's panel, if I could give one suggestion for next year, it would be great to have some round-table discussions on various topics. For instance, bring representatives from Guidance, Access Data, Internet Evidence Finder, etc. into a room with the community and discuss, in an open forum, how we can standardize things such as timeline outputs and evidence file formats. Alternatively, perhaps have small break-off round-table discussions (focus groups) with experts leading them.. so you could have, say, Kristinn lead a break-off on log2timeline, and have a bunch of fans or interested users openly share thoughts, wish-list items, challenges, etc.

Oh yeah, if you haven't seen the Closing Remarks for the SANS DFIR Summit, they are a MUST WATCH!!!

Forensic4cast Award

My Forensic4cast Award!
I am happy to announce that I won a forensic4cast award last week -- for writing the "best forensic article of the year". For anyone who is not familiar, the article was titled Digital Forensics Sifting: Cheating Timelines with log2timeline and had an accompanying reference guide that could be downloaded.

Thank you to everyone who voted for me. It's great motivation to continue taking initiative. What's next? Vote "davnads" for prezident!

Also thank you Lee Whitfield for putting this all together!

My DFIR presentations at CEIC and SANS

I received positive feedback from a blog I posted a few months ago on Intellectual Property theft. So I decided to expand on this topic at Guidance Software's CEIC user conference. Ed Goings, Rick Lutkus, Dave Skidmore, and I organized a panel titled "Investigating Intellectual Property Theft". It turned out fantastic with our combined legal, corporate, and consulting perspectives. In fact, I was shocked we had people standing at the back of the room at 8 AM in Las Vegas -- I wasn't even sure if I would make it ;-) If you would like a copy of the presentation feel free to contact me.

Chad Tilbury, an AWESOME forensicator and SANS instructor, invited me to speak on his SANS DFIR Summit panel regarding "Building and Maintaining Digital Forensic Labs". I was excited to hear from people, including Ken Johnson (who blogged about it in "DFIR SUMMIT - Through the Eyes of a Summit Noob"), that they found this presentation valuable.

I also gave a SANS 360 talk on the tool I have been developing. This was recorded for viewing and my presentation can be found last (at the 1:06:38 mark). Sorry the sound quality is not so great, and the SANS laptop had technical difficulties (awk!) displaying the embedded video in my presentation. The actual embedded movie can be downloaded as well (there is no sound).



A slide from my SANS 360 talk
More to come on my tool soon -- In summary, if you are not familiar with Kristinn Gudjonsson's log2timeline, a framework for automatic creation of timeline data, it's a "go to" tool for any DFIR timeline analysis. If you have used the tool, you'll also know that the output for even just one computer can be a tremendous amount of data to review. Also, there is no method specifically designed for reviewing timeline data.

Therefore I created a proof-of-concept front end for log2timeline data output. It allows for easy filtering and review of timeline data. It is coded in Python (cross-platform) with a SQLite database backend and WX GUI. An example of its use is to aggregate timeline data from multiple hosts into one timeline to see lateral movement.
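
To give a flavor of how that works, here is a minimal sketch of the idea (illustrative only, with made-up file names -- not the actual tool code): load l2t CSV output from multiple hosts into one SQLite table, then review it in time order.

import csv
import sqlite3

conn = sqlite3.connect("supertimeline.db")
conn.execute("""CREATE TABLE IF NOT EXISTS log2timeline
                (date TEXT, time TEXT, source TEXT, sourcetype TEXT,
                 host TEXT, [user] TEXT, [desc] TEXT)""")

for path in ("host1_timeline.csv", "host2_timeline.csv"):  # hypothetical files
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # standard l2t CSV column names
            m, d, y = row["date"].split("/")      # MM/DD/YYYY -> YYYY-MM-DD
            iso_date = "%s-%s-%s" % (y, m.zfill(2), d.zfill(2))  # sorts correctly as text
            conn.execute("INSERT INTO log2timeline VALUES (?,?,?,?,?,?,?)",
                         (iso_date, row["time"], row["source"],
                          row["sourcetype"], row["host"], row["user"],
                          row["desc"]))
conn.commit()

# One combined timeline across hosts, e.g. to spot lateral movement.
for event in conn.execute("""SELECT date, time, host, source, [desc]
                             FROM log2timeline ORDER BY date, time"""):
    print(event)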

All SANS Summit presentations can be downloaded.

Now that my speaking-engagement, conference, and training budget is all dried up, I will get back to saving the world one megabyte a day :-)
 


Timeline Analysis - What's missing & What's coming..

If you missed my SANS 360 on timeline analysis...

What the heck is timeline analysis??  

Timeline creation and presentation is the concept of normalizing event data by time and presenting it in chronological order for review. This sequence of event data becomes a narrative, "a story" of events over a period of time. Furthermore, it can be used to put events into context, interpret complex data, and identify anomalies or patterns. The concept of timeline creation and presentation is widely used amongst many practices, including Digital Forensics and Incident Response (DFIR).

For DFIR purposes, timeline creation and presentation primarily consists of recursively scanning through a file system (or linearly through a physical or partition disk image) and extracting forensic artifacts and associated timestamp data. The data is then converted to a normalized, structured format in which it can be subsequently reviewed in chronological order.
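
As a toy illustration of that normalization step (the artifact data below is made up, and the format is not any particular tool's), two different artifact types can be mapped to a common shape and sorted:

from datetime import datetime

# Hypothetical artifacts, each with its own native timestamp.
registry_keys = [("2012-03-04 00:27:55", "RunMRU key last written")]
web_history   = [("2012-03-04 00:27:41", "Visited http://example.com/payroll")]

# Normalize to a common (timestamp, source, description) shape.
events  = [(datetime.fromisoformat(ts), "REG", d) for ts, d in registry_keys]
events += [(datetime.fromisoformat(ts), "WEBHIST", d) for ts, d in web_history]

# Chronological order is what turns raw artifacts into a "story".
for when, source, desc in sorted(events):
    print(when, source, desc)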

Creation and Filtering

A tool named "log2timeline", by Kristinn Gudjonsson, is an example of a framework for automatic creation of forensic timeline data. If you are interested in learning more about timeline creation and analysis using log2timeline, I suggest starting with Kristinn's list of resources or taking the NEW SANS 508 class (here's a review I authored based on my experience). The main purpose of log2timeline is to provide a single interface to parse various log files and artifacts found in evidence, such as file system data, Windows event logs, Windows registry last-written times, and Internet history. The data is then output to a structured format such as CSV, SQLite, or TLN.

After the timeline is created, it can be filtered using "l2t_process". This tool allows a user to "reduce" the size of the timeline by creating a subset of data responsive to certain keywords or time/date restrictions. For instance, a 5 GB timeline file could be filtered to 3 GB by running l2t_process using a date filter to only show events that occurred between 2009 and 2010.
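
The same reduction can be pictured in a few lines of Python (a sketch of the idea only, not a replacement for l2t_process; it assumes the standard l2t CSV layout, where the "date" column is MM/DD/YYYY):

import csv

with open("timeline.csv", newline="") as src, \
     open("timeline_2009-2010.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        year = int(row["date"].split("/")[-1])   # MM/DD/YYYY -> year
        if 2009 <= year <= 2010:                 # keep only the date range
            writer.writerow(row)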

Presentation

At the time of writing this there is no commercial or open-source tool specifically designed for DFIR professionals to review the output of log2timeline or forensic timeline data in general. Therefore, DFIR professionals are limited to using tools not specifically designed for forensic timeline data presentation such as Microsoft Office Excel, Splunk or grep. This limitation decreases productivity and increases the risk of error.

Some deficiencies of current presentation options
Microsoft Office Excel is a common method of reviewing forensic timeline data. Although Microsoft Excel is intuitive and robust, it has fundamental limitations. For example, the average output from log2timeline (based on a 320GB hard drive) is 5-10 million rows of data, equaling approximately 3-5GB. Microsoft Excel version 2010 has a row limitation of 1,048,576 rows and version 2003 has a limitation of 65,536 rows. This severely limits DFIR professionals to viewing parts ("mini-timelines") of the overall timeline, often based on filtering criteria (pivot points) such as date ranges, keyword searches, or source types. As a result, context can be taken away by not having the entire timeline to review. It also can make reviewing timeline data an iterative process by having to review multiple mini-timelines.

Slide from my SANS 360 talk
On November 19, 2011, Klein & Co published an article documenting how to empower Splunk to review timeline data. Splunk is a robust, enterprise-grade application that collects and indexes data from various data sources. Splunk will store both raw and rich data indexes in an efficient, compressed, filesystem-based datastore, with optional data signing and auditing to prove data integrity. However, Splunk is complicated to use as it requires knowledge of the Command Line Interface (CLI) and specific training on the tool. It is also difficult to generate reports and administer as a user.

grep, a CLI tool, is another option to parse and review forensic timeline data. However, for the average DFIR professional who is not familiar with the CLI, it can be a complicated and inefficient method.

The Need

A better #$%!$@ way to review timelines [period].

The goal of my first phase of development was to create a forensic presentation tool specifically for timeline data. This would be a robust Graphical User Interface application that does the following:
  • Import structured timeline data, such as a log2timeline CSV file, into a structured database. This would allow for fast indexed searches across large data sets.
  • Upon import, the application would allow the user to preserve source information. This will allow a practitioner to review data from multiple data sources in a SUPER timeline and easily correlate events across these different sources.
  • Subsequently, the forensic timeline data will be displayed for review in a Graphical User Interface (GUI) data grid similar to Microsoft Excel. It will have familiar features such as the ability to sort, filter, and color-code rows by column headings or values. For instance, a user could have the ability to import timeline data from 10 different hosts, filter to only show successful logons (based on evt log source types) between 2009 and 2010, and color-code the results by host to make the review process easy on the eyes :-)
  • Unlike Excel, make filtering transparent.. visually see and understand how the buttons you are pressing interact with the database and the results you are presented with -- a SQL query builder (see the sketch after this list).
  • The interface would also be intuitive to the extent a user could create user defined tags, comments, and bookmarks for the purpose of reporting, filtering and assisting review. For instance, a user could create the tag “evidence of IP theft” and subsequently select one or multiple rows in the data grid and associate them with this tag -- just like you can in eDiscovery!!
  • At any point, generate reports or export timeline data from the grid view. For example, export a filtered subset of data back into the CSV format to open in Excel or send to someone else.
  • Ability to create custom queries.. so the user is not limited by the GUI - think plugins!!!
  • Also, basic charting capability because "a picture can sometimes tell a thousand words".
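
To illustrate the query-builder idea from the list above, here is a minimal sketch (the table and column names are assumptions for illustration, and the filter values are made up -- this is not the actual application code). GUI selections become a visible, parameterized SQL query instead of hidden magic:

import sqlite3

def build_query(filters):
    """Turn {column: value} GUI selections into SQL plus bound parameters."""
    clauses, params = [], []
    for column, value in filters.items():
        clauses.append("%s = ?" % column)
        params.append(value)
    where = " AND ".join(clauses) or "1=1"
    sql = "SELECT * FROM log2timeline WHERE %s ORDER BY date, time" % where
    return sql, params

# e.g. the user clicked source type "EVT" and host "WORKSTATION-1"
sql, params = build_query({"sourcetype": "EVT", "host": "WORKSTATION-1"})
print(sql, params)   # surfacing the query is what makes filtering transparent

conn = sqlite3.connect("supertimeline.db")
for row in conn.execute(sql, params):
    print(row)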
The Solution
Let me start off by saying: does anyone know what it feels like to stare at code for 5 hours (on a Saturday afternoon when it's 80 degrees and sunny out, with no bathroom/food breaks, and all of your friends are at the beach) trying to figure out why your code is broken, then to find out it's because you're missing a single curly bracket somewhere? Well, that's been my life for the last 12 months since I started my coding project. If you don't believe me -- ask my friends, oh wait, I don't have any anymore - this tool has ruined my life :-)
Picture of new GUI with undockable panes for multiple monitor setups

If you have not had an opportunity to watch the recorded video (1:06:38 mark) of my SANS DFIR Summit 360 talk or review the slides, I introduced the proof-of-concept tool I have been coding. Here is a short video (no sound) of the tool in action (note this is the first release - the GUI has significantly changed since).

The tool consists of:
  • WX GUI front-end
  • Python code
  • SQLite backend
Shout out to my high school science teacher, Mr. Wilson, who introduced me to Python. I used Python because it's cross-platform. My development and testing platform is Windows 7. At the DFIR Summit, I gave Tom Yarrish a copy of my tool and within minutes he had it running on his MacBook Pro running OS X. Pretty cool..

You can see auto-highlighting by source type and POC charting here..
I will never understand why, in the year 2012, people still prefer to type things into green-and-black console windows. Therefore, I used WX as a GUI front-end. Why did I use WX? Simple: because it's the first thing that came up in my Google search for "Python+GUI+programming". In hindsight I wish Google had told me just to quit.

I also used SQLite3 as a back-end because A.) It's lightweight - no install required, B.) You know it's fast if high-frequency traders use it, and C.) It's scalable enough to review timeline data.
Overview of current process and development phases:
 


Overview of data flow:

Items in red are what I am working on in Phase 2.

When can you get it?
 
I currently have someone doing a code review. It will be posted VERY soon on the log2timeline-tools google code page -  http://code.google.com/p/l2t-tools/

As I stated in my SANS 360 talk, "it will be free to corporate and private but LE has to pay for this one.. you guys need to pay me back for all those parking tickets!" -- I might also post a donation page or something.. so I can buy myself a vacation.


Also, I really look forward to feedback, positive or negative, so I can improve and include thoughts in my future employer performance discussions, so I don't wind up becoming a Walmart greeter :-)


Timeline Analysis - More of what's coming..

So you're kicking back in your chair, with your feet up in the air, reviewing some timeline data in M$ Excel like a timeline bandit. You're filtering things, highlighting rows, making notes, and everything is just f$%ing fantastic.

Then out of nowhere.. your boss walks into the room!! He pulls a chair up next to you, kicks back, and sparks up some "conversational forensics". You enthusiastically tell him about all the amazing artifacts you have found in your timeline, and then use this great opportunity to ask him some really hard questions. The conversation takes a turn to hands on the keyboard as your boss looks over your shoulder...

Your boss asks, "pull up those 64 files on the screen that are highlighted as red in your timeline." There is an awkward pause in the conversation as you realize you forgot your EnCase dongle at home. Next you feel your hands getting sweaty as you know you don't have SIFT installed either. Just before you start crying in fear of humiliation, you remember that imdisk, a free image mounting utility, is installed on your computer.

As you take a deep breath of air and regain your composure, you're able to quickly mount the DD image in read-only mode. You start digging through the file system for the 64 files... Minutes go by and you finally find the first file. At this point you realize this process of looking up file paths in your timeline and opening files is a manual and time-consuming effort.. but you continue on because there is no way of automatically tying items in your timeline to logical files in the mounted disk image (even if you did have SIFT, EnCase, or some other fancy tool).

30 mins go by and you're still looking for the last few files.. You notice your boss's eyes are starting to close. The next thing you know he's sleeping in his chair. He wakes up 15 mins later and says he had this dream that Dav Nads came up with this idea on how to mesh timelines, kittens, and data from hard disk images all together... and he was right.. well, at least not about the kittens part :-)

-------------------------------------------------------------------

I hope you enjoyed my made up story. I am on vacation this week and REALLY bored without any DFIR going on. Lol. Anyways...

As alluded to in Timeline Analysis: The Hybrid Approach, there are many approaches to creating timeline data. Some prefer a "targeted" approach, which only presents specific artifacts on a timeline, and others prefer a more "kitchen sink" approach, where many artifacts are presented.

Regardless of your flavor, when it comes to reviewing timelines, I am sure you, like me, find yourself jumping between reviewing timelines (e.g. Excel, l2t_Review) and forensic applications. A few reasons I personally do this are to:
  • Gain a better understanding of the artifacts displayed in my timeline
  • Confirm the accuracy of my timeline data
  • Look at the contents of a file
What drives me bananas is the fact that I am constantly searching for artifacts in my forensic tools that I have highlighted in my timeline. Sometimes there is so much "back and forth" going on that I lose concentration and sight of the "big picture". It also does not make it easier if you don't have multiple monitors or a large screen.

So extending on Timeline Analysis - What's missing & What's coming I decided to brainstorm ideas to address this frustration:
  • Timelines contain file name and full path information for the source artifact - this is good!
  • You can mount disk images easily with imdisk or ftkimager - okay now I have access to the data where the source artifacts are stored
  • The absolute path / drive letter (e.g. C:\windows) in the timeline will not typically match that of your mounted disk image (e.g. E:\) - easy enough to hack a fix with some Python (see the sketch below)
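
For instance, a minimal sketch of that path fix (a hypothetical helper, not the actual l2t_Review code) just swaps the original drive letter for the mounted image's drive letter:

import ntpath

def remap_path(timeline_path, mount_drive):
    """Map a timeline path onto the drive letter of the mounted image."""
    drive, rest = ntpath.splitdrive(timeline_path)   # "C:", "\Windows\..."
    return ntpath.join(mount_drive + "\\", rest.lstrip("\\"))

print(remap_path(r"C:\Windows\System32\config\SAM", "E:"))
# -> E:\Windows\System32\config\SAM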

My next challenge was to determine a means to review files.. Initially it occurred to me I could open files with their default viewer, but as Corey Harrell (@corey_harrell) pointed out, that's not such a good idea because then you're exposed to client-side exploits tied to specific vulnerabilities in apps!

So I started searching for an open-source Python-based review module and came up dry. However, I did come across a REALLY cool Windows-based application called Universal Viewer that supports a slew of file types and modes including native, text, binary, and hex!

So as you can imagine, I incorporated all of these ideas into the Windows version (working on equivalent capability in other OS versions) of my l2t_R tool!!

Just four simple steps:

1.) Mount disk image with tool of choice (e.g. imdisk, ftkimager, encase)
 

2.) Specify in l2t_Review what drive letter is assigned to the mounted disk image


Select mounted image path


    3.) Invoke File Viewer by right-clicking on any line item in your timeline and selecting Open File Viewer


    4.) The File Viewer is automatically opened with the file. You can change the default view mode (native, hex, text, etc.) using settings. You can also specify in settings whether you want multiple instances of the file viewer to be opened simultaneously or not. So every time you open a new file, it will either open in the same instance or a new instance.

    File displayed in viewer in Hex mode. Can also view natively

    I will be posting details on how to download and start using l2t_R very soon!!! In the meantime, if there is something you REALLY could use this on, feel free to contact me for a beta version.

    -DAV NADS


    Dashboards, File Viewer, Hashing, and Date Plotter in l2t_Review #OMG

    In my recent blog post titled Timeline Analysis - More of what's coming.. I introduced a method using l2t_Review to bring timelines to life with source data.


    Given a mounted disk image of the evidence item you are reviewing and Universal Viewer installed, l2t_Review will allow you to view source data. By simply right-clicking on any file in the Data Grid pane and selecting File Viewer, the file will be opened in Universal Viewer. This file viewer supports over 12 views, including native, media player, text, and hex, and hundreds of file types. You can also specify in settings whether you want File Viewer to invoke a new instance of Universal Viewer every time a file is opened or reuse the same instance.


    Building on this existing capability..

    Many times there is a reason to hash a file, such as needing to check VirusTotal for a suspicious executable in your timeline. Now, by right-clicking on any file in the Data Grid View and selecting Hash File, a dialog window will appear with the hash value of the selected file. Pretty cool, eh? Down the road will be the ability to send it directly to VirusTotal.
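
    Under the hood the idea is simple; here is a minimal sketch (not the actual l2t_Review code, and the path is hypothetical) that hashes a file in chunks so large evidence files don't exhaust memory:

    import hashlib

    def md5_file(path, chunk_size=1024 * 1024):
        """MD5 a file incrementally, one chunk at a time."""
        md5 = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                md5.update(chunk)
        return md5.hexdigest()

    print(md5_file(r"E:\Windows\System32\evil.exe"))  # hypothetical file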

    Now let's look at two new visually stunning aspects...


    First is a feature built into the main UI, which displays all (not paged) data from the Data Grid View subsequent to filtering. The X axis represents the date and the Y axis represents the frequency of events that occurred on that date. This feature is particularly useful for identifying dates with high or low activity. The timeline can be manipulated by zooming in and out, and it can also be saved as an image.
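
    As a sketch of how such a date plot can be derived from the database (the table and column names here are assumptions for illustration, not necessarily the real schema):

    import sqlite3
    import matplotlib.pyplot as plt

    conn = sqlite3.connect("timeline.db")
    rows = conn.execute("""SELECT date, COUNT(*) FROM log2timeline
                           GROUP BY date ORDER BY date""").fetchall()
    dates, counts = zip(*rows)

    # X axis = date, Y axis = number of events on that date.
    plt.bar(range(len(dates)), counts)
    plt.xticks(range(len(dates)), dates, rotation=90, fontsize=6)
    plt.xlabel("Date")
    plt.ylabel("Event count")
    plt.tight_layout()
    plt.show()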

     
    Second is a feature I am really excited about, and it took me a really long time to do. Now there is the ability to view timeline data in an interactive dashboard subsequent to filtering. This allows you to understand visually what data types are being displayed in your timeline. If there is something that is specifically interesting to you, such as data from user "John", clicking on "John" in the pie chart will automatically redefine your results in the Data Grid View to only show data associated with the user "John". All pie charts are interactive in the sense that you can click on data points and filter the data. This is just the beginning as it relates to dashboards; expect a lot more down the road.
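
    The click-to-filter mechanics can be sketched roughly like this (same assumed schema as above, and only a toy stand-in for the real dashboard code -- in l2t_Review the callback would refresh the Data Grid View rather than print):

    import sqlite3
    import matplotlib.pyplot as plt

    conn = sqlite3.connect("timeline.db")
    rows = conn.execute(
        "SELECT [user], COUNT(*) FROM log2timeline GROUP BY [user]").fetchall()
    labels, sizes = zip(*rows)

    fig, ax = plt.subplots()
    wedges = ax.pie(sizes, labels=labels)[0]
    for wedge in wedges:
        wedge.set_picker(True)          # make each pie wedge clickable

    def on_pick(event):
        user = event.artist.get_label() # the wedge carries its label, e.g. "John"
        print("SELECT * FROM log2timeline WHERE [user] = '%s'" % user)

    fig.canvas.mpl_connect("pick_event", on_pick)
    plt.show()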

    I am sure some of you are sick of me saying it will be out soon.. honestly, anyone who has asked for a copy, I have sent them one. So just email me if you want a beta copy.. otherwise it will be out one of these days, after I can find a conference to drop it at. I was hoping to present at the open source forensics conference but never heard back from them.


    #DFIR things DavNads is Thankful for on Thanksgiving



    I hope everyone has a great Thanksgiving. I am going to attempt to deep-fry a turkey tonight, so I wanted to get a blog post up in case these are my last words! There's often discussion about how to get started in #DFIR or how to get to the next level for those already in the field. Therefore, I thought it would be relevant on this day to take a few minutes to write about some things I am thankful for that have helped me be successful in my #DFIR career.

    #Resources (aka weapons)

    I like to use the analogy that the #DFIR battleground is like a role-playing video game. A new game provides your character with the essentials, and through the course of your game, you accumulate weapons to build the capabilities of your character.

    In #DFIR it's not too much different. Knowing all the answers from the start is not conceivable, but knowing where to look for all the answers can be. Therefore, having an arsenal of weapons including blogs, white papers, tools, and even contacts is what enables me on a daily basis to provide answers to questions, solve problems, and prepare for that next battle with the "SASPDT" -- Sometimes Advanced, Sometimes Persistent, Definitely a Threat.

    I am confident that my arsenal of weapons is what has made me a valuable character on the #DFIR battleground, similar to certain video game characters. The only difference is the "SASPDT" can't steal the account credentials to ME, unlike those pesky video game characters. For this I am again thankful for my arsenal of forensic weapons (aka resources).

    #Challenges

    DFIR is not an easy career to "just get by" in. What makes it so difficult? Well, I think there are a few factors, including the constant changes in technology, process, and interpretation. One's ability to not only adapt to these changes but help shape them is what (in my opinion) separates the button pressers from the button builders. This notion of keeping "cutting edge" in the field can be challenging because it can require time, passion, research, and sometimes even the ability to develop. However, the reward of solving a challenge often outweighs the effort.

    Personally, I have not always been an "eager beaver" for challenges. I have found that a lack of confidence and belief in your abilities will keep one (including me) from even trying. One specific challenge that I will always be thankful for was when Ed, my boss, provided me with my first opportunity to respond to a suspected network intrusion. I'll never forget the conversation we had leading up to it, where I literally tried to convince him I was underqualified and the only thing I was prepared to do was fail. Despite my thoughts, he believed in my abilities and framed it in a way that gave me confidence to try and succeed. This taught me (1) not to be afraid to try something outside what I was comfortable with and (2) opened the door to an entirely new passion of mine -- network intrusions -- that would be unknown to me if it wasn't for facing a challenge in my career.

    Today, I am thankful that challenges are a fundamental part of everything I do. I enjoy waking up every day knowing I could face potential problems that there aren't solutions to. This gives me the energy and motivation to try to do something new or different, like changing the world one megabyte at a time! :-)

    #Role Models & Mentors

    Something I did early on in my career was not only identify select role models but identify what characteristic(s) made them role models to me. For instance, I have always looked up to all the SANS faculty (Rob, Paul, Hal, Chad, Alissa, etc..) as role models. Not so much for their "know how" but their unique abilities to articulate and communicate technical knowledge.. now that's something that, in my opinion, can be one of the most valuable skills. I have then relied on my mentors (Jim, Brian, Steve, J, etc.) to help guide me in following the footsteps of my role models.
    Thanks to all my technical and non-technical role models and mentors I have grown personally and professionally in my career in ways I could never accomplish individually. 

    #Community

    I am most thankful for an awesome #DFIR community. How many other communities are out there that have people and organizations so inclined to help others, contribute free tools, and advance capabilities? Also I have met countless new friends thanks to this career path.

    #Material things ;-)
    • RAM – Because the expensive tools don’t work without it.
    • SSD HDDS – So when the expensive tools crash my computer, I can reboot quickly!
    • New Log2timeline– Can you say super timeline analysis?
    • Volatility– When I thought I had enough to look at with hdds, now there’s even more with memory analysis.
    • Python– Because it's better than Perl.
    • VMware Fusion– Allows me to literally swap with 4 fingers between 5 different Operating Systems.
    • Dual 24" monitors– Helps me make up for productivity in other areas
    • DFIROnline and DFM– Webcasts and good reads
    • VSC toolset– Makes VSC analysis pretty easy!
    • Logicube Dossier– 5-7GB per minute 2 disk duplicator, need I say more?
    • TZworks stuff– Lots of great stuff. 
    • GitHub - Store all my code in the cloud.
    • SharePoint 2010 - Allows me to collaborate with teams on the same documents like Google docs.
    • Gizmodo.com - My favorite tech blog.
    • SANS 508 - I felt like this class really polished my skills. 
    • WFA Toolkit 3E - Great book and reference guide. Hope to have an iPad copy soon.
    • Sprint 4G LTE hotspot - Allows me to be connected anywhere just like I am in the office :)
    • ImDisk Virtual Disk Driver - great free image mounting tool
    • SQLite - Quick and dirty backend to little things here and there.
    • Dcode - Great decoder.
    • GREAT series of blog posts by Patrick Olsen
       
    Hopefully some of you share these appreciations and others find them resourceful. Now go eat some turkey or stand in line for something you don't need that's on sale!

    4n6time Release Notice


    After what feels like a year of “not having a life”… I am happy to announce 4n6time :-)

    4n6time, formerly "l2t_Review", is a free, cross-platform forensic tool for timeline creation and review. Since 4n6time is powered by Kristinn Gudjonsson's amazing plaso engine, formerly log2timeline, users can now create, with a mouse, a raw timeline storage file from a disk image. Once a timeline has been created, it can be output to a 4n6time database (SQLite). Using 4n6time, you can then start review with the ability to filter, highlight, sort, tag, bookmark, and search on common data fields. Also included are basic reporting features, as well as the ability to export subsets of data back into CSV and timeline storage files.

    Here are some highlights of 4n6time:

    • Timeline creation wizard
    • Robust filtering
    • Event tagging, bookmarking, and (auto)highlighting like eDiscovery tools
    • Interactive graphical representation of events
    • File viewing, hashing, and exporting via data source (i.e. linking timeline to disk image or mount point)
    • Basic reporting and charting
    • Appending timelines from multiple data sources (cross-host timeline analysis)
    • Ability to save work product back into timeline storage files

    For more information check out the work-in-progress User Guide, my blog, or go download an OSX or Windows binary from the Google Code page. Binaries for Linux and SIFT will also be released soon.






    My Windows 8 DFIR Reading List


    Below is my reading list for Windows 8 DFIR. I suspect it's only a matter of time until everyone sees a hard drive with Windows 8. If you have any other resources, feel free to drop a comment and I'll add them to the list.
    Windows 8: Important Considerations for Computer Forensics and Electronic Discovery

    Windows 8 Forensics - A First Look (ForensicFocusVideos)


    Forensic Artifact: Malware Analysis in Windows 8
    Windows 8 Forensics: USB Activity
    Champlain College Windows 8 Forensics 3 Part Series


    Windows 8 Forensics: Reset and Refresh Artifacts
    Windows 8 Forensic Guide

    Stay tuned: 4n6time and the future of timeline analysis...


    Melting snow, flash floods, and only a new 4n6time release ;-)


    So wherever Kristinn Gudjonsson lives, there are apparently flowers, blossoming trees, and a new plaso release. That must be really nice. In Chicago we still have melting snow, flash floods, and only a new 4n6time release ;-)
     

    For anyone who saw me speak at the HTCIA conference in Minnesota a few weeks ago, you know I am VERY excited about the new version of 4n6time (and some other soon-to-be-released tools to make your timelines epic!). Months of development and user feedback have been put into this release. There's really too much to list about "what's new", so here are a few of my favorite improvements:
    • Updated plaso engine to version 1.0.1-1 (alpha) – As Kristinn pointed out, the latest version of plaso has many new enhancements and features. Also included are 2 new parsers contributed by me (thank you Kristinn for the help), Symantec AV and Google Drive!
    • Control plaso with a mouse! – Create your timeline(s) using a simple yet comprehensive user wizard. Create a timeline from a disk image, mount point, directory, CSV file, or body file! Also take advantage of plaso’s amazing file filtering and pre-filtering capabilities.
    • Tabbing – Because one timeline is never enough you can now view and jump between multiple timelines (subsequent to filtering) in tabs within the data grid view.
    • VirusTotal integration – In addition to right-clicking on an event and viewing it with an external file viewer, MD5 hashing it, or exporting it, you can now check to see if it's a known file in the VirusTotal database (provided you have an internet connection).
    • Speed – The tool has more or less been completely refactored. It is 5x faster. This includes opening saved database files instantly (no more loading!).
    • GUI –  Enhanced User Interface, charting, filtering tricks, and reporting.
    • So much more!!!!
    It was almost a year ago, at the SANS DFIR summit, when Rob Lee gave me the opportunity to introduce 4n6time (then “l2t_Review”) to the community. I only had 360 seconds to show off the hundreds of hours of personal time I spent learning and developing the initial proof of concept.


    Almost a year later, I am overwhelmed by the response from the community. 4n6time has been nominated for the 2013 forensic4cast award for the “Computer Forensic Software of the Year” and there are hundreds of folks using the tool all over the world. This has made every minute working on the project all so worth it.


    As always, this project would not be possible without the existence of and contributions to timeline creation tools. Special thanks to Kristinn Gudjonsson, Joachim Metz, and others for development on log2timeline and now plaso. Also a special thanks to Eric Wong, who has been assisting me with development these days.


    You can download the latest Windows version of 4n6time (0.4) on Google Code. Note you do not need to request a new cert file if you are an existing user, you can simply transfer your old cert file over to the new version following the directions in the FAQ. The FAQ is also a useful place for other common questions and getting started information. If you are completely new to plaso and/or 4n6time you may also want to check out the article Kristinn and I co-authored in issue 15 of Digital Forensic Magazine.


    As always happy to answer any questions and look forward to receiving feedback as development starts on the next release.


    Thanks!!!


    -David Nides (@DAVNADS)

    New weapon, Emailtime!


    I often rely on timelines to tell the story. However, to do this effectively, it's imperative to understand how the story was constructed.

    Thanks to tools like log2timeline and plaso it's easy to create timelines! Like any tool, it's helpful to understand how these work. I am not implying you need to start brogramming, but you should at least learn the capabilities of the tools. This primarily requires understanding what input modules or parsers are available (and how they are invoked). If you're relying strictly on timelines for analysis, this knowledge should enable you to understand whether the "entire story" is being told.

    For instance, according to the timeline below, on March 4, 2012 at 00:28:17, a Windows Application (McAfee) Event Log entry was created. The description of this event states “The Scan was unable to scan password protected file 2011-W2.zip\\2011-W2.pdf. Scan engine version used is 5400.1158 DAT version 6498.0000.”

      
    Looking at the context of this event, I don't see any notable activity that could be attributable to the source of this event log entry. However, taking a step back from this timeline example, knowing what I am NOT seeing could be equally as important as what is shown…

    According to a 2012 Trend Micro report, Spear-Phishing Email: Most Favored APT Attack Bait, “91% of targeted attacks involve spear-phishing emails, reinforcing the belief that spear phishing is a primary means by which APT attackers infiltrate target networks.” Thus adding e-mail as a source in a timeline might be insightful.

    As displayed below, seconds before the event log was created, an e-mail was received. This e-mail contained the attachment “2011-W2.zip”.

    Now you probably want to know how e-mail magically appeared in the timeline above. At the SANS #DFIRSummit I introduced a new cmdline tool called Emailtime. The purpose of the tool is to create log2timeline CSV format timelines of PST files.

    The tool was written in Python and is packaged as an EXE for distribution. It requires you to first download the Developers version of Redemption as a dependency. Oh, and run the Redemption installer as Administrator.

    Special thanks to Steve Gibson (@stevegibson) the ninja for helping pull this tool together. Note the tool is super ALPHA/BETA/WHATEVER so use at your own risk. We look forward to bug reports and feedback. I already have a short list of “to do” items including adding time zone offset and MSG support but didn’t want it to hold back releasing any further.


    The usage of the tool is pretty simple:

    Usage:
    emailtime.exe -p <pst file> -e <output csv> -H <hostname> [-F <filter type(s)>] [-S <string>]

    Additionally, as shown in the examples below, it has some neat filtering capabilities. This allows you to target relevant e-mails more quickly, based on e-mails that contain keywords, attachments, and/or hyperlinks.

    Examples:

    Export all emails:
    emailtime.exe -p "c:\outlook.pst" -e "c:\report\output.csv" -H "mycomputer"

    Filter emails with hyperlinks only:
    emailtime.exe -p "c:\outlook.pst" -e "c:\report\output.csv" -H "mycomputer" -F hyperlink

    Filter emails with hyperlinks and attachments only:
    emailtime.exe -p "c:\outlook.pst" -e "c:\report\output.csv" -H "mycomputer" -F hyperlink attachments

    Filter emails containing string evil only:
    emailtime.exe -p "c:\outlook.pst" -e "c:\report\output.csv" -H "mycomputer" -S evil

    Given the output of Emailtime, a log2timeline CSV file, you can import it into a new 4n6time database for review (File > Create Database). Alternatively, you can append it to an existing timeline database to overlay it with other timelines (File > Append Database).



    EnCase via RDP (part 2)

    As you probably already know, Remote Desktop Protocol and EnCase Forensic do not play well together in Windows 7, Server 2008, etc. As posted a few years ago, there are workarounds, but none are perfect. Even buying the NAS licensing server has limitations.

    ...I spent weeks trying to figure out a true solution. Then randomly, out of nowhere, a co-worker one day sends an email to our team saying "Hey, if you ever have this problem with Encase and RDP .. just do this..." I was shocked, amazed, but more importantly it worked!

    Before you get started:
    • Note this program requires Administrative Rights to run!
    • Caution: it requires the user to re-login to the RDP session (the user is not logged out)
    • Modified from http://community.spiceworks.com/how_to/show/873 and http://community.spiceworks.com/scripts/show/190-disconnect-terminal-services-session-remotely
    • I don't have time to support this but feel free to leave comments and I can see if my co-worker is interested in answering questions there.
    Directions:

    1. Copy the text below into a text file
    2. If you have EnCase installed somewhere other than the default location, you’ll need to update the section starting at line 23.
    set encase_v6x32="C:\Program Files (x86)\EnCase6\EnCase.exe"
    set encase_v6x64="C:\Program Files\EnCase6\EnCase.exe"
    set encase_v7x32="C:\Program Files (x86)\EnCase7\EnCase.exe"
    set encase_v7x64="C:\Program Files\EnCase7\EnCase.exe"

    3. Save as "Start Encase.bat"
    4. Just double click "Start Encase.bat" after connecting via RDP to the workstation.

    Start Encase.bat:
    @echo off

    :: EnCase Starter from RDP Session
    :: Author: ALG
    :: DATE: 2013.03.06
    :: Purpose: Fixes issue of EnCase starting in Acquisition Mode when executed from RDP Session
    :: Caution: Requires User to Re-Login to RDP Session (user is not logged out)
    :: Modified from http://community.spiceworks.com/how_to/show/873
    :: and http://community.spiceworks.com/scripts/show/190-disconnect-terminal-services-session-remotely

    :WinVersion
    cls
    echo ## Defining Windows Version
    ver>"%temp%\ver.tmp"
    find /i "6.0""%temp%\ver.tmp">nul
    if %ERRORLEVEL% EQU 0 goto ADMIN
    find /i "6.1""%temp%\ver.tmp">nul
    if %ERRORLEVEL% EQU 0 goto ADMIN

    :MENU1
    title Choose EnCase Version to Start via RDP (Requires Reconnect to RDP Session)
    :: EnCase Installations (Update to Install Location)
    set encase_v6x32="C:\Program Files (x86)\EnCase6\EnCase.exe"
    set encase_v6x64="C:\Program Files\EnCase6\EnCase.exe"
    set encase_v7x32="C:\Program Files (x86)\EnCase7\EnCase.exe"
    set encase_v7x64="C:\Program Files\EnCase7\EnCase.exe"
    cls
    echo 1: EnCase V6 (32-Bit) [%encase_v6x32%]
    echo 2: EnCase V6 (64-Bit) [%encase_v6x64%] 
    echo 3: EnCase V7 (32-Bit) [%encase_v7x32%]
    echo 4: EnCase V7 (64-Bit) [%encase_v7x64%]
    echo ---------------------------------------
    echo Type EnCase Version ID (above) or Full Path to EnCase.exe
    echo Type R to refresh user list
    echo Type Q to quit
    echo.
    set input=R
    :: Prompt for Install
    Set /P input=
    if /I %input% EQU Q goto END
    if /I %input% EQU R goto USERS
    if /I %input% EQU 1 set input=%encase_v6x32%
    if /I %input% EQU 2 set input=%encase_v6x64%
    if /I %input% EQU 3 set input=%encase_v7x32%
    if /I %input% EQU 4 set input=%encase_v7x64%
    :: use a dedicated variable so the PATH environment variable is not clobbered
    set encase_path=%input%
    goto USERS

    :USERS
    title Users on Localhost
    cls
    qwinsta /server:localhost
    echo.
    echo Type Session ID of current RDP session
    echo Type R to refresh user list
    echo Type Q to quit
    echo.
    set input=R

    :: Prompt for Install
    Set /P input=
    if /I %input% EQU Q goto END
    if /I %input% EQU R goto USERS
    set session=%input%
    goto DISCON

    :DISCON
    title Disconnecting User
    cls
    tscon %session% /dest:console
    echo Log off in progress
    echo .
    goto STARTER

    :STARTER
    cls
    START /b "" %encase_path%
    exit

    :ADMIN
    cls
    cd %systemroot%\System32
    if /I %CD% EQU %systemroot%\System32 goto MENU1
    goto ERR1

    :ERR1
    title Error
    cls
    echo This program requires Administrative Rights to run!
    echo.
    pause
    goto END

    :END
    exit


    4n6time v.05 - anyone know how I get a tax write off on this???

    I've been super busy and actually forgot to announce that I posted 4n6time v.05 a few months ago. So here it is, boys and girls. As always, none of this would be possible without the tools that create timeline data (e.g. log2timeline, plaso) and the help of MANY people.

    Before I get into what's new, I would like to quickly reflect. 4n6time was introduced as a proof-of-concept application demo'ed at the 2011 SANS 360 Summit and has since grown a global user base. In 2013, 4n6time was nominated for the "tool of the year" award by forensic4cast (vote again this year!).

    I remember joking that 4n6time would be free to everyone except LE. A lot of people laughed at that joke. However, in hindsight LE is one of my primary motivators to continue to invest personal time and expenses in this project.

    Mid last year I received an e-mail stating 4n6time was used to help prosecute a murder case by presenting a complex set of data to a jury in a way they could understand. A few weeks later I received an email that 4n6time helped a family understand the facts leading up to a suicide. I get testimonial emails like this all the time from people.

    Hearing feedback that Davnads potentially impacted someone's life is surreal. It really is. Now if only I could figure out how to get a tax write-off on this??? Lol.

    The general feedback I get is that 4n6time does not make evidence available that other tools do not. It just makes evidence more readily accessible, presents it in a way that is logical, and makes telling the story easy with a mouse. In fact I think the download counts from last year speak for themselves. Although I suspect Kristinn would argue that the logs all point to Davnads downloading his own tool ;-)


    I guess the reason I am sharing this story is to encourage others to contribute to existing projects like plaso or to new projects. Everyone has to start somewhere and you never know where it will end up. I am also sharing this to thank people for the feedback. If it wasn't for the emails, challenge coins, patches, and other swag, I probably would have stopped investing in this project a long time ago.

    Now let's take a look at what's under the hood in 4n6time, v.0.5...
    • Contains latest "release" of plaso v.1.1.0 and dependencies. 
    • More intuitive timeline creation wizard with the ability to enable parser(s) visually, amongst other enhancements.
    • Ability to interact with all charts (e.g. click on source and update data grid view to only show source).
    • Mouse hover over "tool tips" on all major buttons.
    • Filter query preview (e.g. how many/types of results will be returned).
    • Filter pivoting in data grid view based on various time criteria.
    • Enhanced charting and reporting.
    • EVT ID look up / deeper VT integration.
    • More export to CSV options.
    • Every time data is added to the database, the user is prompted for an evidence number, which is used to differentiate multiple data sources in a timeline.
    • Advanced filtering.
    • Lots of GUI enhancements and better error handling.
    • Proof-of-concept MySQL back end - this adds a collaborative (server/client) review approach to timeline analysis. It also allows timelines to scale a lot more efficiently.

    Note: There is a beta Linux version (thanks to Kristinn Gudjonsson). This should be part of the new SIFT 3.0. The OSX version has not been compiled yet. I'll try to get this done in the next few weeks.



    4n6time v.06 - minor update

    I posted a new version of 4n6time for Windows only. Download link here:


    Not many significant changes. Below is a short summary.
    -Using latest plaso "release v.1.1.0" source code base dated early June. Also includes newer versions of the plaso dependencies dated as of early August. 
    -Lots of bug fixes and minor GUI tweaks.
    -Extended image support (consistent with plaso) for timeline creation. Note file interaction is only supported with raw disk images at the moment.
    -Enhanced timeline creation wizard (e.g. disk scanner implementation, parser selection gui, etc.)
    -New window/pane to monitor plaso timeline creation process. 
    -Lots of other little things, minor speed improvements, etc.

    To be honest, I did not do as much testing this time around as in previous releases, so I encourage feedback, issues, questions, bugs, whatever - just let me know. I just didn't want to delay the release any further.

    I'll also try to work with Kristinn, when he gets some time, to create a Linux / SIFT 3.0 release!