Week 33 – 2016


  • Michael Maurer updated EFetch to Beta 0.5. The update turns EFetch into a file analysis tool for log2timeline.
    Efetch 0.5 Beta is here! Now all I need is a couple beta testers…

  • Sarah Edwards at Mac4n6 has updated her MacMRU parser to support the ‘Most Recently Used’ artefacts for Microsoft Office for Mac 2011 and 2016. On a side note, I played around with it a little during the week, as I had a recentitems plist that contained both the old-style ALIAS and new BOOKMARK blobs. This also led me to find out that the script doesn’t support Unicode strings yet; Sarah said she hopes to add this functionality soon. Overall, the experience drives home the need to always verify your findings – in this instance, a quick check with a plist viewer highlighted that the items containing Unicode strings were causing the loops to finish early.
    Update to MacMRU Parser – Now with Microsoft Office Support!
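    For anyone wanting to poke at these blobs themselves, a minimal Python sketch of pulling the raw Alias/Bookmark data out of a recentitems-style plist is below. The key names (RecentDocuments, CustomListItems, Bookmark/Alias) are my assumption of the layout of com.apple.recentitems.plist – verify them against your own data in a plist viewer before relying on this.

```python
# Sketch: extract the raw Alias/Bookmark blobs from a recentitems-style
# plist before handing them off to a parser such as MacMRU.
# The key names below (RecentDocuments, CustomListItems, Bookmark, Alias)
# are assumptions based on com.apple.recentitems.plist; verify against
# your own data.
import plistlib

def extract_blobs(plist_bytes):
    """Return a list of (name, blob) tuples, one per recent item."""
    data = plistlib.loads(plist_bytes)
    items = data.get("RecentDocuments", {}).get("CustomListItems", [])
    blobs = []
    for item in items:
        # Older entries carry an 'Alias' blob, newer ones a 'Bookmark' blob.
        blob = item.get("Bookmark") or item.get("Alias")
        blobs.append((item.get("Name"), blob))
    return blobs

# Build a tiny sample plist in memory so the sketch is self-contained;
# the blob contents here are placeholder bytes, not real Alias/Bookmark data.
sample = plistlib.dumps({
    "RecentDocuments": {
        "CustomListItems": [
            {"Name": "report.docx", "Bookmark": b"book\x00blob"},
            {"Name": "notes.txt", "Alias": b"\x00\x00\x00\x03alias"},
        ]
    }
})
print(extract_blobs(sample))
```

From there, the extracted blobs can be fed to whichever Alias/Bookmark parser you trust – or eyeballed in a hex viewer, which is how the Unicode truncation issue above showed itself.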

  • X-Ways Forensics 19.0 Preview 9 was released during the week. This version adds the ability to view and preview files larger than 2 GB with the viewer component, increases the maximum number of search terms from 25 to 50, allows image files to have the same name in the same case, and adds the “extraction of Windows PowerShell events and their most important values from Windows event logs and output to the event list”.
    X-Ways Forensics 19.0 PR-9

  • GetData’s Forensic Explorer was updated to version 3.6.2.5646, fixing a bug in the bookmarks module’s gallery view.
    Forensic Explorer Download Page




  • This week’s episode of the Digital Forensics Survival podcast covers the OS X program File Juicer. File Juicer is an inexpensive file carving tool: drag a file onto the application and it will carve out what it can.
    DFSP # 026 – File Juicer

  • On this week’s Brakeing Down Security podcast, Bryan and Brian discussed hacker summer camp, travel security (although funnily enough, this came out before we found out the security incident wasn’t a security incident), and their CTF, explaining the various problems they ran into when setting it up and how to go about solving its challenges.
    2016-032-BlackHat-Defcon-Debrief, Brakesec_CTF_writeup, and blending


  • Michael Karsyan at Event Log Explorer has written a post describing the variety of filters available in the tool.
    Filtering all the way

  • Dan Pullega at 4n6k provided a short post on determining the PowerShell version from the Windows registry. Apparently, on Win10 systems there may be two different subkeys pertaining to two different versions.
    Forensics Quickie: PowerShell Versions and the Registry
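    The check Dan describes boils down to reading the PowerShellVersion value under each engine subkey of SOFTWARE\Microsoft\PowerShell. A minimal Python sketch of that lookup is below, run against a dict standing in for the registry so it works anywhere; on a live Windows box you would use the winreg module (or parse the offline SOFTWARE hive). The paths are those discussed in the post; the sample version strings are illustrative.

```python
# Sketch: enumerate installed PowerShell engine versions from the registry
# paths described in the post. A dict stands in for the registry here so
# the logic is demonstrable off-box; the version values are illustrative.

POWERSHELL_KEY = r"SOFTWARE\Microsoft\PowerShell"

# On Win10 both the "1" and "3" engine subkeys can be present,
# each with its own PowerShellVersion value.
simulated_registry = {
    r"SOFTWARE\Microsoft\PowerShell\1\PowerShellEngine": {"PowerShellVersion": "2.0"},
    r"SOFTWARE\Microsoft\PowerShell\3\PowerShellEngine": {"PowerShellVersion": "5.1.14393.0"},
}

def powershell_versions(registry):
    """Return {engine_subkey: version} for each engine subkey present."""
    versions = {}
    for subkey in ("1", "3"):
        path = rf"{POWERSHELL_KEY}\{subkey}\PowerShellEngine"
        values = registry.get(path)
        if values and "PowerShellVersion" in values:
            versions[subkey] = values["PowerShellVersion"]
    return versions

print(powershell_versions(simulated_registry))
# e.g. {'1': '2.0', '3': '5.1.14393.0'}
```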

  • Patrick Olsen at System Forensics has reversed the Alias version 3 format and provided a Python means of parsing it. Patrick also explains the various fields. Having recently been looking into various plists, there’s a wealth of information in the various Alias and Bookmark blobs, so having a scripted means of extracting it is very helpful. I look forward to the next post on Bookmark data.
    Reversing Mac Alias v3 Data Objects

  • Brian Baskin at Ghetto Forensics shares his solution for the Palo Alto Networks CTF.
    Running the Labyrenth: Unit 42 CTF

  • Adam at Hexacorn showed how adding a DLL to the “Terminal Server Client” registry subkey on Win10 will execute said DLL when a user connects to the computer remotely or the mstsc executable is run.
    Beyond good ol’ Run key, Part 44

  • Igor at Weare4n6 wrote an article on data acquisition from damaged hard drives. The article provides a brief overview of FTK Imager, Victoria, the EPOS Bad Drive Adapter, Atola Insight Forensic, and PC 3000 Portable. Using a combination of these tools, you are able to acquire data from drives exhibiting a number of different issues.
    Extracting Data From Damaged Hard Drives


  • Philippe Lagadec has created a custom Google search to help “find malware samples containing specific strings, filenames, hashes or other IOCs”.
    Malware Search

  • Adrian at Bit Therapy wrote two articles this week
  • eForensics Magazine posted a reverse engineering for malware analysis cheat sheet originally created by Paul Rascagnères.
    Reverse Engineering For Malware Analysis Cheat Sheet by @r00tsb

  • Luis Rocha at Count Upon Security provides an analysis of the Dridex loader using OllyDbg. Luis explains how to extract the unpacked sample to disk, and then the various steps he takes to examine the executable. He identifies the file as a PE file, hashes and cross-checks it against the various malware repositories, runs strings, and then analyses the PE headers with other tools. Luis determines that the executable uses some obfuscation and goes on to show readers how to decode the data to “get the list of libraries that are used by the binary”.
    Malware Analysis – Dridex Loader – Part I

  • Hasherezade returns to the Malwarebytes Labs blog with a technical analysis of the Shakti Trojan.
    Shakti Trojan: Technical Analysis

  • Similarly, Josh Reynolds at Cisco has a technical analysis of the CryptXXX ransomware.
    CryptXXX Technical Deep Dive

  • Sean Wilson at PhishMe has started a series examining the “Anti-Analysis techniques being included within Office macro and script files”. The first technique checks various settings, including the username and the number of files the user has recently accessed, before sending some GET requests. Sean concludes that this technique can be used to prevent automated analysis in sandboxed environments.
    Macro Based Anti-Analysis
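    The logic Sean describes amounts to a couple of environment heuristics before the payload runs. A rough Python sketch of the idea is below; the username list and threshold are illustrative assumptions, not values from the post (the real samples implement this in VBA/script).

```python
# Sketch of the macro's anti-analysis logic as described: inspect the
# current username and a count of recently accessed files, and bail out
# (skip the malicious payload) if the environment looks like a sandbox.
# The specific usernames and threshold below are illustrative assumptions.

SANDBOX_USERNAMES = {"admin", "sandbox", "malware", "virus"}
MIN_RECENT_FILES = 3  # real users usually have a recent-files history

def looks_like_sandbox(username, recent_file_count):
    """Return True when the environment resembles an analysis sandbox."""
    if username.lower() in SANDBOX_USERNAMES:
        return True
    # Freshly provisioned analysis VMs often have little or no
    # recently-used-file history.
    if recent_file_count < MIN_RECENT_FILES:
        return True
    return False

print(looks_like_sandbox("sandbox", 10))  # True
print(looks_like_sandbox("alice", 0))     # True
print(looks_like_sandbox("alice", 12))    # False
```

Cheap checks like these are exactly why well-instrumented sandboxes populate a realistic user profile before detonation.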


  • Jared Atkinson at Invoke-IR has a post describing the process of installing PowerShell onto a Mac. He explains that Microsoft has recently open-sourced the framework and through a bit of configuration tweaking he was even able to make it look like the Windows version.
    Installing PowerShell on OSX

  • Rob Lee and David Bianco wrote a white paper, published in the SANS Institute InfoSec Reading Room, that covers “three types of hypotheses and outlines how and when to formulate each of them”.
    Generating Hypotheses for Successful Threat Hunting

  • Brett Shavers shared his thoughts on the value of books in the forensics world. The various books that have been written serve as an invaluable guide to understanding how systems work and how artefacts are produced. One of the main takeaways from the post: “if you can’t find the book that you think should be written, then best start writing”.
    The Value of a Good Book in the Forensics World of Things

  • Patrick J. Siewert at Pro Digital Forensic Consulting has written a post on the benefits of giving your friendly neighbourhood forensicator enough time to perform their examination and provide the information you need for your case/investigation. Ultimately, if the examiner is given little time, they will usually provide little in the way of results. They will be able to talk very confidently about the topics they know, but anything that requires additional testing and research will have to be left as a to-do item. And when the request for assistance includes a phone or two, that can introduce a whole host of other problems requiring time, testing, training, etc.
    Sooner Rather Than Later… Please!

  • DFIR Guy at DFIR Training shares his thoughts on degrees in DFIR. It’s quite a long post, but the TL;DR is: get a degree in DFIR if you need it for some reason (i.e. to get past HR), or if someone else is paying. A degree in a related field – computer science, software engineering, or electrical engineering, for example – would probably be more beneficial, as they teach a lot of the core competencies that you can build your DFIR knowledge on. In Australia, the majority of DFIR degrees are at the postgraduate level; however, they don’t really endow people with all of the relevant skills required to start working from day dot. Ultimately, I think the best bit of advice, if you’re interested in getting into DFIR, is to get a few books and a few blogs (or just mine 🙂 ) and get reading and practising. As David Cowen mentioned a few weeks ago on the Forensic Lunch, you can conduct a fairly good forensic examination using completely free, open source tools.

  • The Forensic Focus Twitter chat was held during the week, but it was 5AM my time, so I wasn’t waking up for that. On another note, it would be great if the moderator would compile the discussions into a post, which I understand is a lot of work (which is why I’m not doing it myself).
    There were 10 questions asked of the community and quite a few responses. The initial questions were “What forensic software do you currently use?” and “Is ‘push-button’ forensics a big problem among software users?”, but the discussion also moved to vendor training and requests for improvements that practitioners would like to see in DFIR software.
    I wanted to provide a few answers here as, as mentioned earlier, 5AM is a bit early (and as you’ll see, 140 characters is too much of a constraint).
    Regarding the software that I use: I use a variety of different tools, ranging from free and open source to the paid suites. I decided a while ago that, whilst the paid tools are available, I would prefer to use the FOSS alternatives where possible. I find that I can modify these with my existing skills (I know enough Perl and Python to kluge together other people’s code to do what I want, test it, and make sure I understand what’s going on), and put things into a rational, coherent report. I’ve also found that with the FOSS tools, a majority of the developers are very agile and happy to help, provided you can give them test data to work with. I do this, and I’m sure other people do as well, but sometimes you will parse an artefact with a tool, it produces some sort of error, and instead of telling the developer you move on and find another tool that works. This leaves us with a large number of tools that each work for a subset of use cases.

    Moving on to what I would like to see in DFIR software: ideally, I would like a system that allows me to run a number of different tools across an artefact at once, preferably producing both each tool’s usual output and a timeline output. I could then review all of this information and compare and contrast what each tool provides, and where it fails. Currently, it’s a very manual process. The other thing I really want is a good way of creating small timelines from a variety of different parsers; I can see the programmatic solution, I’m just not as good a coder as I need to be to do what I want.

    Regarding push-button forensics: I’m in two minds about this. On one hand, working in law enforcement, we are provided with a number of different device types and are expected to be able to provide competent evidence for court. We examine DVRs, mobile phones, computers, tablets, etc. These devices may or may not be supported by tools. They come working, broken, password protected, or just new enough that nothing supports extraction or parsing. We rely quite heavily on the work of others to get our job done faster, as we often don’t have nearly enough time to do all of the background research for every job. Sometimes you have to give up looking for the password to the encrypted volume, or looking for that deleted file, because every minute you spend on that is a minute you’re not spending on another equally important matter. Sometimes getting 80% of the data is better than nothing when there’s a time constraint, especially with the increasing number of digital devices that people own and the increasing amount of storage available.
    On the other hand, push-button forensics usually only gets you some of the way there. You can triage a system quickly and get a good idea of what’s going on, but the automated tools usually miss or miscategorise things, often through no fault of their own. Usually, it’s because not every tool writer or vendor has seen every single possible iteration of the data. They do the best they can with what they’ve got. Sometimes tools that do almost what you need are released, and you have to go through and analyse the data manually. That’s where developing your coding and data manipulation skills comes in handy. No doubt a problem that you have had to work through manually can be scripted, and the developer will usually be happy to update their code to pass on your knowledge.
    Join The Forensic Focus Twitter Chat, Wednesday 17th August: Software

And that’s all for Week 33! If you think I’ve missed something, or want me to cover something specifically hit me up through the contact page or on the social pipes!
