Sunday, 13 December 2009

Navman S30 Satnav device

I have seen a number of different Navman models now, and each has subtle variations in how it must be processed. Although the underlying OS is Microsoft WinCE.NET 5.0 Core Version, I did not have to access this device via Mobile Device Centre. After applying some write blocking to my USB port the device was accessible as a mass storage device, allowing me to image it with Encase. I ended up with 114,759 sectors being imaged.



The next step I took was to track down and consult a manual for the device in order to establish its capabilities. I expect all devices to have saved Home, Favourites and Recent destinations and this device was no exception.



In addition however this device can save trip logs, pre-planned itineraries, pictures and be paired with a mobile phone.


Almost all the relevant data is stored within xml formatted files. Microsoft Excel 2007 is an excellent tool for examining these files and subsequently reporting on them. I use the Get External Data / From Other Sources / From XML Data Import option via the Data tab and allow Excel to sort out the formatting.



A good place to start is the file paths.xml stored at the root of the partition. This file details the location of some of the relevant files.

This is a more definitive list:

  • MyFavouriteLocations.xml - used to store the home location and favourites
  • MyRecentLocations.xml - used to store Recents and also Journey Starts
  • MyMultiStopLocations.xml - used to store saved multi stop journeys
  • MyRoute.xml - used to store the current journey, which is in effect the last journey. On the device I examined this file was deleted but recoverable
  • UserSettings.xml – used to store device settings including where the unit was turned off

When a user enters a new address the menu shows previously entered towns or cities, road names and postcodes. This data is stored in the following files:

  • DWRecentPocode.xml – Previously entered postcodes, most recent first
  • DWRecentRoad.xml – Previously entered road names, most recent first
  • DWRecentPlace.xml – Previously entered towns or cities, most recent first

There are also a number of presumed backup files containing the same (as far as I could see) xml formatted data:

  • MyFavouriteLocations_bak.xml
  • MyRecentLocations_bak.xml
  • MyMultiStopLocations_bak.xml

There are also two files that appear to be temporary files which were deleted but recoverable, containing xml formatted data:

  • MyRecentLocations_New.xml
  • MyMultiStopLocations_New.xml

All of the above-mentioned xml files are parsed very tidily using Microsoft Excel 2007. I use the same program to create an html version of the worksheet after a little tidying up. The longitude and latitude values need to be divided by 100,000. I populate a new column using the formula:

=HYPERLINK("http://maps.google.co.uk/maps?q="&(K3/100000)&"+"&(L3/100000)&"","Click here to view in Google Maps")

The cell K3 contains the Latitude and L3 the Longitude. The formula creates a clickable hyperlink to the Lat/Long in Google Maps.
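The same conversion can of course be scripted outside Excel. A minimal Python sketch (the sample coordinate values are invented for illustration, not taken from a real device):

```python
# Sketch: convert Navman fixed-point coordinates (value / 100,000)
# into decimal degrees and build a Google Maps link, mirroring the
# Excel formula above.

def navman_to_degrees(raw: int) -> float:
    """Navman xml files store lat/long multiplied by 100,000."""
    return raw / 100000

def maps_link(lat_raw: int, lon_raw: int) -> str:
    lat = navman_to_degrees(lat_raw)
    lon = navman_to_degrees(lon_raw)
    return f"http://maps.google.co.uk/maps?q={lat}+{lon}"

# Hypothetical raw values roughly corresponding to central London
print(maps_link(5150000, -12000))
# http://maps.google.co.uk/maps?q=51.5+-0.12
```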

There are one or two other files of interest:

  • destdata.dat - which contains the address used for the last navigated journey
  • gpslog.ini - detailing the location of trip log data
  • default_settings.xml - which in the FAVOURITES/ RECENTS/ MULTI-STOP section appears to detail the maximum number of favourites and recents
  • .pcd - on the unit I examined I could not locate a .pcd file; however, I understand from Andy Sayers that this file, if it exists, contains the phonebook from a paired mobile phone
  • Log001.log - again I did not see this file, but if it exists it contains GPS track logs

Because we have physical access there is also the possibility of recovering relevant data from unallocated clusters. I located records in unallocated using the keyword <lat>.

References
Sat Nav Examination Guidance Notes (Andy Sayers)
Navman S-Series (S30, S50, S70 & S90i) User Manual


Monday, 7 December 2009

Binatone X350 UK&ROI 2nd edition GPS

This device can be purchased very cheaply now from places like Asda and ebuyer. It runs Astrob Turbodog4 satellite navigation software within a Microsoft WinCE.NET 5.0 Core Version OS. Although I have not examined one, I believe a number of Navigo devices run similar software. It has an SD card slot, which was unpopulated in the one I looked at. The internal memory can be accessed, like many similar devices, via Mobile Device Centre in Vista, which makes available a volume entitled ResidentFlash. I disable writing to USB devices by modifying the registry (there are many utilities about to do this). Simply paste the text below into a text file, give it a .reg file extension, execute it and then reboot.

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\StorageDevicePolicies]
"WriteProtect"=dword:00000001

After copying and creating a logical evidence file of ResidentFlash I found three notable files within the MobileNavigator folder:

  • RecentDest.dat
  • FAV.DAT
  • SystemSet.dat

RecentDest.dat stores up to fifty of the recently navigated-to locations. These locations are stored in records 104 bytes in length. The first record starts at the first byte of RecentDest.dat, so by viewing the file in Encase with the view pane set to hex and dragging the view to show 104 bytes per line (assuming you have twin monitors), it is possible to see all the relevant data. Each location record stores Longitude and Latitude as 8-byte doubles, which unfortunately Encase does not natively decode. The data interpreter in Winhex can do this. The hex editor 0xED on a Mac can also do this but rounds to fewer decimal places than Winhex. So given a fully populated RecentDest.dat file you have one hundred doubles to decode. I turned to my friend Oliver Smith over at Cy4or who wrote me an enscript which parses out the records to a csv file. Email me with a brief note about who you are for a copy.
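The record-walking approach can be sketched in a few lines of Python. One assumption to flag: I have placed the two doubles at the start of each record, as they are in FAV.DAT; verify the actual offsets for your file against Figure 1.

```python
# Sketch: split RecentDest.dat into 104-byte records and decode the
# two 8-byte little-endian doubles in each.
# ASSUMPTION: the doubles sit at offset 0 of each record (as in
# FAV.DAT); adjust the offset if Figure 1 says otherwise.
import struct

RECORD_SIZE = 104

def parse_recent_dest(data: bytes):
    locations = []
    for off in range(0, len(data) - RECORD_SIZE + 1, RECORD_SIZE):
        record = data[off:off + RECORD_SIZE]
        # '<dd' = two little-endian 8-byte doubles
        lon, lat = struct.unpack_from('<dd', record, 0)
        locations.append((lat, lon))
    return locations

# One fabricated test record: longitude -2.25, latitude 53.48
sample = struct.pack('<dd', -2.25, 53.48) + b'\x00' * 88
print(parse_recent_dest(sample))  # [(53.48, -2.25)]
```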

Recent Destinations also log the time entered, so, very unusually, some meaningful time and date information may be extracted. Providing these times have been recorded whilst the device can see satellites, they are accurate and stored in the configured time zone. If a destination is entered when the unit cannot see the sky, and if the battery had previously been discharged, it appears that the recorded time and date will be soon after 00:00 hours on 1st January 2007.



I have decoded each 104 byte record as shown in figure 1


Figure 1 (click on image for larger version)




FAV.DAT contains user configured Favourites stored in 536 byte records. Once again the Longitude/Latitude are stored as eight byte doubles in the first sixteen bytes of each record.

SystemSet.dat appears to store the user's Home location; again the Longitude/Latitude are stored as eight-byte doubles.

Within the MobileNavigator folder there is a folder entitled Trace. This was empty in the one I looked at; however, the manual states:

The unit is capable of logging all positioning information received from the GPS satellites during navigation. It then uses this information to draw a track of the route on the map. This enables you to review the route information at a later time.

I imagine that should this feature be enabled a file of some sort will be stored in this folder.






References

http://binatonegps.com/gps/download/manual/X350II%20User%20Guide%20(Turbo%20Dog)%20-%2020080910.pdf
http://en.wikipedia.org/wiki/Double_precision_floating-point_format


Friday, 13 November 2009

Sony PSP internet history

A recent case resulted from an entry in a compromised web server log. The GET request included the string "Mozilla/4.0 (PSP (PlayStation Portable); 2.00)". Our suspect had used a PSP to do dodgy stuff and the PSP eventually came my way. I looked around but did not find a large amount of information; the most useful items were an Encase Message Board post and Chapter 9 of a book entitled Advances in Digital Forensics V, which I read via Google Books.

Sony PlayStation Portable handheld consoles have an inbuilt wi-fi adaptor and can therefore connect to the internet. The device utilises the Netfront browser. There are a number of different models and firmware versions. The one I looked at had a label indicating that it was a PSP1001. This site details the many different types available. A PSP1001 is known as a PSP Fat (as opposed to a PSP Slim). The one I looked at had version 4.05 firmware. These types of PSP have a small amount of internal NAND flash memory and a Memory Stick ProDuo flash media card.

As far as I can ascertain it is not possible to examine the internal NAND memory of devices beyond 1.5 firmware, because you would require hacked firmware and modified hardware to do it. The browser does store its cache in this area, but I believe by default only 512KB is used for this purpose. Some information can be derived from the internal memory via a manual exam. Essentially then, we are left with the Memory Stick ProDuo flash media card. Our Tableau USB write blocker would not recognise the card I had; however, I was able to image it using our Helix imaging box and Guymager. The card had a FAT16 file system and was examinable with Encase.


Files of interest
On the card I looked at only two files were of interest both in the folder \PSP\SYSTEM\BROWSER.

bookmarks.html contained what you would expect - user-created bookmarks
historyv.dat contained internet history

Scott Conrad, Carlos Rodriguez, Chris Marberry and Philip Craiger's paper within Advances in Digital Forensics V refers to two further files of interest, historyi.dat and historys.dat. I got my hands on a test PSP1001 with the same firmware as my suspect's (4.05) and in testing I was not able to populate these files with any data. The files existed, but I was not able to cause details of either Google searches or user-typed URLs to be stored in them. My suspect's card had an unpopulated historyi.dat file and no historys.dat file. As noted by Conrad et al, I found in testing that I could only cause writes to historyv.dat by shutting the browser down gracefully. Simply turning off the PSP without shutting down the browser did not commit that session's history to historyv.dat.

Structure of historyv.dat
The structure of historyv.dat is discussed by Conrad et al; however, they suggest that elements of the file are best decoded by introducing the data into a test PSP. For example, the date of each history entry could be ascertained this way. I would prefer to carry out a completely static examination if possible, not least because on my suspect's card I had recovered a number of history records in slack space, and a manual examination can be a little laborious. I have therefore decoded the records a little further, as shown below at Figure 2 and Figure 3. Each historyv.dat file is headed with 66 bytes of data starting with the string Ver.01. Within these 66 bytes are two further bits of plain text - NFPKDDAT and BrowserVisit. Immediately following BrowserVisit is the first history record. The most recent record is listed first, the oldest last. Each record can be located using a GREP expression to search for the header - in Encase \x03\x00\x01\x00 - see Figure 1 below. Records can be found in slack space and unallocated clusters.
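Outside Encase, the same header signature can be used to scan a raw dump programmatically. A minimal Python sketch:

```python
# Sketch: locate candidate historyv.dat records in a raw dump by
# searching for the 4-byte record header \x03\x00\x01\x00 - the same
# signature used in the Encase GREP expression.
import re

HEADER = re.compile(rb'\x03\x00\x01\x00')

def find_record_offsets(data: bytes):
    """Return the offset of every candidate record header."""
    return [m.start() for m in HEADER.finditer(data)]

# Fabricated data: two headers separated by filler bytes
blob = b'\x03\x00\x01\x00' + b'A' * 20 + b'\x03\x00\x01\x00' + b'B' * 8
print(find_record_offsets(blob))  # [0, 24]
```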

Click on image for larger version

Figure 1


Figure 2


Figure 3


A significant addition to the research of Conrad et al is the decoding of the date for each record. The date is recorded in the two bytes following the URL and is stored little endian. In Encase, sweep these two bytes, right click, select Go To and check Little-endian. The value is the number of days since the Unix epoch (1st January 1970). This web site provides a good date calculator.
IMPORTANT NOTE re dates: the dates stored are in accordance with the PSP's internal clock. The clock resets when the battery is exhausted. With the firmware I looked at, the reset date was 1st January 2008. This date is 13879 days from the Unix epoch. I speculate that the average user is unlikely to reset the date each time the battery exhausts, therefore I would expect to see a lot of dates in January 2008.
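The day-count conversion is easily scripted rather than looked up in an online calculator. A minimal Python sketch:

```python
# Sketch: decode the two-byte little-endian day count that follows
# the URL in each historyv.dat record into a calendar date.
from datetime import date, timedelta

UNIX_EPOCH = date(1970, 1, 1)

def psp_days_to_date(raw: bytes) -> date:
    days = int.from_bytes(raw, 'little')
    return UNIX_EPOCH + timedelta(days=days)

# 13879 days after the epoch is the firmware's battery-reset date
print(psp_days_to_date((13879).to_bytes(2, 'little')))  # 2008-01-01
```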

References and thanks
http://www.edepot.com/reviews_sony_psp.html
http://en.wikipedia.org/wiki/PlayStation_Portable#Web_browser
Forensic Analysis of the Sony Playstation Portable - Scott Conrad, Carlos Rodriguez, Chris Marberry and Philip Craiger
https://support.guidancesoftware.com/forum/showthread.php?t=33057&highlight=psp
http://www.computerforensicsworld.com/modules.php?name=Forums&file=viewtopic&t=654&highlight=psp

Thanks to Pete Lewis-Jones and Simon Maher for their help brain storming the date problem


Sunday, 1 November 2009

Garmin Streetpilot C510

I blogged earlier this year about the Garmin Nüvi 200 Sat Nav device and I have now had a crack at a Garmin Streetpilot C510.

The Streetpilot, like the Nüvi 200, stores waypoints in a file Current.gpx found within the Garmin folder. This folder is accessible when the device is connected to a computer because the device is designed to act as a mass storage device. It is probably worth expanding on what a waypoint is. Garmin's FAQs define them as follows:

Waypoints may be defined and stored in the unit manually, by taking coordinates for the waypoint from a map or other reference. This can be done before ever leaving home. Or more usually, waypoints may be entered directly by taking a reading with the unit at the location itself, giving it a name, and then saving the point.

Essentially, as far as both the Garmin devices discussed here are concerned, the waypoints recovered from Current.gpx are the user's favourites and home location. Apologies for teaching granny to suck eggs, but it is probably worth stating that waypoints are not Track Logs. Most Streetpilots and Nüvi 200s do not store any tracking information (there is an unsupported hack which allows the modification of some units' firmware to store tracking information).

As commented in my previous posting, and within the SatNav forensics forum over at Digital Detective, these Garmin devices do store other data not contained in Current.gpx. This data is the Recently Found locations, which are effectively the last fifty locations a user chose to navigate to (or at least look at on the device). Evidentially this data may be useful. Up to now a manual exam using something like a Fernico ZRT has been the answer. I have tried out a slightly different methodology.

Suggested Methodology for the examination of Garmin Streetpilot C510
(May work with other models)

  • On your Forensic Examination workstation run the Garmin USB drivers executable and work through to this screen

  • Connect Garmin sat nav to your Forensic Workstation and complete the USB driver installation (the sat nav must be displaying the hidden service mode - if it isn't it will act as a mass storage device)

  • Run G7toWin on your workstation (it does not need to be installed) and adjust the configuration to allow communication via USB

  • Within G7toWin via the menu bar select GPS/Download from GPS/ All
  • All available waypoints will display
  • Via File/Save As you can save the data to your filetype of choice (e.g. .gpx, .kml, .xml)
  • It is possible that one of the fields may contain an illegal character - in my testing the comment field did. I dealt with this in my exported kml and xml files with a decent text editor (PSPad) and the find and replace feature. Applications that support xml and Google Earth are not usually tolerant of any illegal characters/formatting.

Downloading of the waypoints is now taken care of. Next I want to deal with the Recently Found locations. I am going to suggest two approaches which, although relatively simple, I have not seen documented elsewhere. The version of the device you are using may dictate which approach you try.

  • Approach 1
  • You should still be at the Diagnostics Menu - press the Exit icon
  • Via the main navigation menu select Where to?/ Recently Found
  • You should now see the first five Recently Found locations
  • On your Forensics Workstation launch xImage, your device should appear in the Device field then click Next
  • Select Get Images from the GPS then click Next
  • Set Image Type to Screen Shot
  • Clicking Next will allow you to save a screen shot of the currently displayed screen on the device
  • Using this method you can quickly screenshot all the screens you would have photographed in a manual exam, after each screenshot click back to prepare for the next one

Approach 2 is more invasive; however, I think principle 2 of the ACPO guidelines applies.

  • Approach 2
  • You don't initially have to have your device connected to your workstation for this to work
  • On the device select Settings/ Display
  • In the Display menu enable Screen Shot
  • This will cause a small camera icon to appear in the top right of the display
  • Pressing this icon will cause a screen shot to be saved into the Garmin/scrn folder upon the device
  • Screen shot all the screens you would have photographed in a manual exam
  • Connect device to workstation as mass storage device and cut and paste screenshots from it

UPDATE RE GARMIN Nüvi 310

Artemus has been looking at a Garmin Nüvi 310. He tried Approach 1 above and found that to enter the diagnostics mode he had to push and hold the top right of the display (as opposed to the battery symbol). HOWEVER he then encountered a message asking if he wished to delete all user data, so I guess for Nüvi 310 Approach 1 is a no go. So he tried Approach 2. He enabled the Screen Shot feature however on this device no camera icon appears. Screen shots are created by pressing the power button. Screen shots are saved into a folder entitled Screenshot on the media card.


Monday, 26 October 2009

TIM

TIM is an acronym for Tableau Imager, which unsurprisingly is new imaging software developed by Tableau. Tableau promises astounding imaging speeds. Apparently it will be available in beta form any time soon. Given the quality of the Tableau write blockers I think this software is definitely worth watching out for. The latest info can be found here.


Sunday, 11 October 2009

Video triage revisited

Back in July 2009 I blogged about the potential of video triage. I was commenting on its effectiveness and had used a program written by John Douglas to explore what was possible. Mark Woan added a very interesting comment to that post, introducing a program he had written - Forensic Video Triage. I have now tried out a series of enscripts written by Oliver Höpli which aim to provide the same functionality as both John and Mark's programs.

Essentially all three approaches utilise a third party video playing and manipulation program to create and store thumbnails of frames at set intervals throughout a video clip. The investigator can then triage the video clip by reviewing the thumbnails as opposed to playing the video. The gallery feature in Encase for example makes reviewing the thumbnails a considerably quicker experience than playing the videos.

John's program utilises VLC, Mark's uses ffmpeg and Oliver's enscript calls upon mplayer for thumbnail creation. Each of these video utilities has inbuilt codecs and their capabilities may vary - in other words a video clip may play with one and not the others.

Oliver Höpli has integrated the process much more closely with Encase with his suite of enscripts, and for me this can only be a good thing. If you are an Encase shop the pre-processing is considerably reduced and the whole process is more seamless leading to greater productivity. The main enscript runs across selected (as in blue checked) movie files within your case and parses out thumbnails into a logical evidence file. Another enscript creates a folder structure within the Encase bookmarks tab based on the contents of the logical evidence file. Each video clip has a folder within bookmarks making it an easy process to review the thumbnails.

To get it all working the main enscript needs some configuration, which I found a little fiddly. Ahead of time you need to install a version of mplayer suitable for Windows; the installer I used was MPUI.2009-07-24.Full-Package.exe. This appears to have been superseded by MPUI.2009-10-12.Full-Package.exe which is available here (at least today - download locations seem to change quite often). Oliver directs you to the standard mplayer site which I found a bit difficult to navigate. Once mplayer is installed you need to configure the main enscript by editing it to include the location of mplayer.exe and the location of a suitably large temp directory. On my box the lines of the enscript are (note the double \\)

////////////////////////////////////////// Configuration ///////////////////////////

//Path to MPClassic.exe
mpclassic = "C:\\Program Files (x86)\\MPlayer for Windows\\mplayer";

//Tempfolder which will be used to extract the movies and create the thumbnails
expDir = "C:\\Temp";

//time interval between two frames
//films under 1 minute
OneU = 5;

//films between 1 and 5 minutes
FiveU = 10;

//films between 5 and 30 minutes
BetweenFiveAndThirty = 20;

//films over 30 minutes
ThirtyU = 30;

///////////////////////////////////////////////////////////////////////////////////
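For clarity, the interval selection expressed by the configuration above amounts to the following (the function and variable names here are mine, not Oliver's):

```python
# Sketch of the interval logic implied by the enscript configuration:
# shorter clips get more frequent thumbnails.

def thumbnail_interval(duration_seconds: float) -> int:
    """Return the seconds between captured frames for a clip."""
    minutes = duration_seconds / 60
    if minutes < 1:
        return 5    # films under 1 minute
    if minutes <= 5:
        return 10   # films between 1 and 5 minutes
    if minutes <= 30:
        return 20   # films between 5 and 30 minutes
    return 30       # films over 30 minutes

print(thumbnail_interval(45), thumbnail_interval(600))  # 5 20
```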

Oliver's enscript can be found in the Guidance Software Download Center and comes with a Readme which needs reading.

Sunday, 20 September 2009

Windows Photo Gallery

Windows Photo Gallery is built in to all Vista editions and allows the management of photographs and other pictures together with the ability to carry out a number of basic photo editing tasks. Two forensic artefacts of this program are discussed in this post.

Original Images Folder

The program allows users to revert to the original picture with one click should they not like the results of their editing. This feature provides investigators with a very useful artefact. When a picture has been edited the original unmodified version is stored at

%LOCALAPPDATA%\Microsoft\Windows Photo Gallery\Original Images

The file name of this original unmodified version is renamed - the relevant Microsoft Knowledge Base Article details the file name construction

When the original unmodified version of the image is saved, the image file is renamed by using a combination of a unique ID and the original file name. The unique ID is determined by the System.Image.ImageID file property. If there is no System.Image.ImageID file property value, a GUID is created. The following is the new file name construction:
'{' + unique ID + '}' + '-' + file name
The following is an example of a renamed original file:
{198EB054-44E6-441e-87C8-9B29C5198DE6}-image1.jpg

To illustrate this I have edited and renamed the Windows sample picture Toco Toucan.jpg (quite apt considering the forthcoming Arthur's day) using Windows Photo Gallery



The Original Images folder is created the first time a picture is edited with the application and is a hidden folder. From a forensic point of view we might need to identify the edited picture, which may have been renamed. We can locate the edited picture by searching for the unique ID referred to above. Essentially, take the original file name:

{1F7BA35C-33F2-499E-92A1-0FBE9477C8CA}-Toco Toucan (in my example)

and strip it down to

1F7BA35C33F2499E92A10FBE9477C8CA

This value is embedded within metadata stored within the edited file known as an XMP Message block and also in one further location. Using FTK Imager we can see this value stored in the two locations within the edited file (click on screenshots to see a larger version)


In the second screenshot part of the XMP message block can be seen. The editing application is also detailed.
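Deriving the search value from the renamed file can be scripted. A minimal Python sketch:

```python
# Sketch: derive the hex search string from an Original Images
# file name by stripping the braces and hyphens from the GUID.

def guid_search_string(original_name: str) -> str:
    guid, _, _ = original_name.partition('}')
    return guid.strip('{').replace('-', '')

name = '{1F7BA35C-33F2-499E-92A1-0FBE9477C8CA}-Toco Toucan.jpg'
print(guid_search_string(name))
# 1F7BA35C33F2499E92A10FBE9477C8CA
```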

Pictures PD4

Windows Photo Gallery stores metadata about the pictures indexed by it in a database file Pictures.PD4 at the location

C:\Users\YourUser\AppData\Local\Microsoft\Windows Photo Gallery.

Tim Coakley's Simple Carver Suite contains a program, Windows Photo Gallery Viewer, to parse this file. I have found that substituting a test Pictures.PD4 file (in a Vista VM, let's say) with your suspect's Pictures.PD4 file can produce some meaningful results. I found that the best results are achieved when the test Windows Photo Gallery is set to display tiles view. Note that a blog discussing the transfer of Pictures.PD4 files from machine to machine suggests that the test machine's Volume Serial Number needs to match that of the suspect's. This can be done with the Windows Sysinternals utility Volume ID v2.0.

References
http://support.microsoft.com/default.aspx/kb/944370
http://blogs.msdn.com/pix/archive/2006/08/16/702780.aspx
http://www.adobe.com/devnet/xmp/pdfs/XMPSpecificationPart3.pdf
http://aaron-kelley.net/blog/2008/03/migrating-vistas-windows-photo-gallery-database/







Monday, 17 August 2009

Vista Volume Shadow Copy issues

Volume shadow copies in Vista are often the elephant sat in the corner in many cases. We know they exist and we know they can contain lots of data, but we often choose to ignore them.
A recent case required some keyword searches and an examination of picture files. At the completion of the keyword search most of the hits were within files with names similar to
{bab9c293-d150-12dc-a44f-021d253da909}{3708876a-d176-4f38-b7bb-05036c6bb821}


The view pane within Encase 6.14 displayed the contents in a nice light blue colour, which I now know is a new feature in 6.14 to indicate the contents of uninitialised files. The files were all located within the System Volume Information folder on the root of the volume and are the Vista Volume Shadow Copies. By default 15% of the capacity of the volume is allocated by Vista to store these copies. The C4P graphics extractor enscript carved most of the notable pictures out of the shadow copies also.
At this stage I have known examiners report their findings alluding to the fact that the notable artefacts are within the file {bab9c293-d150-12dc-a44f-021d253da909}{3708876a-d176-4f38-b7bb-05036c6bb821}. In most cases I think you need to drill down further. In order to do this I mounted my Vista image with Encase PDE and used Liveview 0.7b to create a working VM using VMWare Workstation 6. Having logged into my suspect's account I ran a command prompt as administrator and entered the command
vssadmin list shadows /for=c:\

This provided a nice list of available shadow copies. Having selected one I entered the command (updated 13th Jan 2010)

mklink /d c:\shadow_copy7 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7\


This created a symbolic link in the root of C which in Windows Explorer at any rate appears exactly like a shortcut to a folder. Clicking on it produced the error message shown below

I believe this error was probably generated by a permissions issue (SEE UPDATE BELOW); however, I was not able to overcome it, and Rob Lee over at Sans Computer Forensics suggests this methodology does not work. I think Jimmy Weg, however, has had some success with a program written by Dan Mares - VSS.exe. I therefore turned to ShadowExplorer version 0.4.382.0. This program allows the user to view the contents of Volume Shadow Copies that exist on any volumes within the installed system. The contents are displayed in an Explorer-like view allowing the user to export any file or folder to an export directory. I exported the user profile I was interested in to an export directory. Unfortunately it seems that only the Last Written date is preserved in this process and all other time stamps are stripped. I then tried to copy this export directory out of the VM to my workstation and encountered errors (probably due to files within the profile with illegal Windows file names). To overcome this I zipped up the export directory and copied the zip out of the VM. Once unpacked I then added the exported folders into Encase as single files and created logical evidence files from them.
Having done this I was able to resolve most of my keyword search hits and pictures to actual files as opposed to being simply within a volume shadow copy.

UPDATE 13th January 2010
The issue I had with the mklink command was due to a missing \ and not the trailing slash referred to in some comments below. The correct command is

mklink /d c:\shadow_copy7 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7\




Sunday, 2 August 2009

You wait all day for a bus then two come along at once...

Probably not an entirely accurate title, but I came across two enscripts the other day, both of which are aimed at quickly triaging the results of a comprehensive Internet History search. Users of this functionality within Encase version 6 will know that you can often be faced with reviewing hundreds of thousands of entries on the records tab. Many times all you need is evidence of user-inputted search terms. There are conditions available to start sorting the wheat from the chaff; however, it is difficult for these conditions to be totally focussed due to the variation in url formation. This is where both enscripts come in, as they are both designed to parse the actual search term used from a variety of search engine urls.

Searchterms V 1.1 parses out the search term used and where possible the time and date it was carried out into note bookmarks. The enscript has been written to support a claimed 145 separate search engines.

Internet Search Term Finder parses out unique search terms to Log Record bookmarks and stores the term along with its associated url. The script is in fact an Enpack so it is difficult to determine exactly how it works, however it seems to base its search on elements from the query url. A neat feature is that it is configurable, allowing the addition of a new prefix (to the query string) to cater for a different or new search engine.
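To illustrate the general technique both scripts rely on (not their actual implementation), here is a minimal Python sketch that reads a search engine's query-string parameter; the two engine entries in the lookup table are illustrative only:

```python
# Sketch: pull the search term out of a query url by reading the
# engine's query-string parameter. Real engines vary in which
# parameter carries the term (q, p, query, ...), hence the table.
from urllib.parse import urlparse, parse_qs

QUERY_PARAMS = {
    'www.google.co.uk': 'q',
    'search.yahoo.com': 'p',
}

def extract_search_term(url: str):
    parsed = urlparse(url)
    param = QUERY_PARAMS.get(parsed.netloc)
    if param is None:
        return None  # engine not in the table
    values = parse_qs(parsed.query).get(param)
    return values[0] if values else None

print(extract_search_term('http://www.google.co.uk/search?q=forensic+triage'))
# forensic triage
```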

Within an XP Pro SP3 VM I carried out a series of searches utilising the Firefox v3.0.11, Internet Explorer v8, Opera v9.64 and Safari v4.0.2 browsers. I ran the Search for internet history Comprehensive search option within Encase 6.14 and established that all my searches had been parsed into the records tab, with the exception of those carried out with Safari v4.0.2. It turns out that Encase 6.14 does not support parsing internet history from this version of Safari.

I then ran both enscripts and can report that both parsed out my test search terms from the records tab. The results can be viewed within bookmarks. For me the output of the Internet Search Term Finder is preferable; it usefully creates a Log Records bookmark which allows the easy export of results into a spreadsheet. Both successfully hit the spot in respect of allowing users to quickly review search terms within internet history.

Update 15 Sept 2009
Dan Fenwick has kindly updated his Internet Search Term Finder (to v1.1.1). The script can now remove duplicates and separate the results by device. Even more useful - thanks Dan.


Wednesday, 22 July 2009

Link Files within System Restore Points

A recent case involved the download of a contraband file and I was asked to establish what happened just before the download in order to try and establish who was responsible. This scenario is fairly commonplace and I usually start with a timeline analysis of the file system activity around the event in question. An invaluable enscript for this is Geoff Black's Timeline Report, which can also be found within the Guidance Software Enscript Resource Center. The html report produced by this script is particularly cool.

In my case my analysis showed there were a number of link files in a number of system restore points, all created at a time and date just before the download. They were all named in the form A000XXXX.lnk (XXXX being a variable number) and I could see from a rough and ready examination of the data that they all pointed to one particular file saved on the user's desktop. As these link files were stored within restore points, the first hurdle was to establish each link file's original name and path. This information is stored within the changelog files of each respective restore point. Manually searching through this file for the restore point file name (e.g. A000XXXX.lnk) will reveal the file's original path. There used to be an enscript for parsing the changelog files but it was written for version 5; however, I was able to track down a version that worked in version 6 at Paul Bobby's excellent blog/web site (this enscript can also be found within the Guidance Software Enscript Resource Center). The changelog files contain a lot of information and all I really needed was the original filename and path - the script's output may be a little bit of an overkill*. Another utility out there is the Mandiant Restore Point Analyzer. I used this utility to determine the original paths and file names.

All of the link files related to one link file stored within a user's Recent folder. In my case the file that the link file pointed to was created and stored upon the desktop, thus causing the initial link file to be created. Over a period the target file was opened and additional information written into it, causing the link file to be updated each time. Harry Parsonage's paper on link files illuminates this further.

I copied the link files out of my image and loaded them into Sanderson Forensics Linkalyzer. This program decodes and displays the contents of link files in a grid much like a spreadsheet (I was going to post a screenshot but sanitising the contents became too much of a pain) and very quickly allowed me to see that the target file was being regularly accessed and modified. Because the target file's size is also stored within the link file I could also see that the file was growing over time. The program produces good reports and has many other abilities beyond the scope of this blog post, but in short I thoroughly recommend it.

Now as far as my case was concerned the target file was clearly linked to the suspect and it proved worthwhile delving into those restore point link files.

References
http://computerforensics.parsonage.co.uk/downloads/TheMeaningofLIFE.pdf
Forensic Analysis of System Restore Points in Microsoft Windows XP

*Now what would be really useful is an enscript that simply parsed out the original path and filename of only all user selected (blue checked) files sitting in restore points. I would envisage the output to be a three column csv file - Current Filename, Original Path and Filename, Restore Point Creation Date.
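In the meantime, the manual search through the changelog described above can be roughed out in a few lines of Python. To be clear, this is a blunt UTF-16LE string scan and not a parser of the actual change.log record format, so treat anything it returns as a pointer for manual verification; the minimum string length and the A000XXXX naming pattern are my own assumptions.

```python
# Blunt sketch: scan a restore point changelog for UTF-16LE strings that
# look like restore point file names (A000XXXX.ext) or original paths.
# This is NOT a proper change.log parser - it just greps for strings.
import re

def utf16_strings(data: bytes, min_len: int = 6):
    # contiguous runs of printable ASCII characters encoded as UTF-16LE
    pattern = re.compile(b'(?:[\x20-\x7e]\x00){%d,}' % min_len)
    for match in pattern.finditer(data):
        yield match.group().decode('utf-16-le')

def candidate_names(data: bytes):
    # keep strings matching the A000XXXX naming scheme or containing
    # a backslash (i.e. probable Windows paths)
    return [s for s in utf16_strings(data)
            if re.match(r'A\d{7}\.', s) or '\\' in s]
```

Running candidate_names over the raw bytes of each changelog lists the A000XXXX names and path strings it contains; pairing them up is still a manual step.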


Friday, 10 July 2009

Video Triage

Paul Sanderson's VidReport has been referred to here and there lately. C4M also is regularly brought up in conversations I have with people (such an interesting life I lead). Triage is certainly the flavour of the month right now. So I thought it worth writing a few lines about my recent experiences of triaging videos.

I have often voiced the opinion that reviewing a couple of hundred video files in a case is not that big a deal and on that basis I have not been too keen on using C4M. Anyhow I've just had a case with about 170 video clips to review and thought it would be a good case to try out the video triage approach. My normal approach is to use VLC as a file viewer in Encase to preview each video. This took about an hour and a half (some of the videos were quite good ;-) ).

I then used John Douglas's video triage program (which I think he supplies free to LE) to review the same video clips. To use this program you copy out the clips you wish to review and point the program at the folder containing them. It processes each clip by taking a screen capture at a configurable interval and putting each screen capture into a subfolder named after the video's file name. Once the program has processed all the clips you will have a subfolder for each one, containing the screen captures. I then simply dragged the folders into Encase as single files and previewed the contents of each folder in gallery view. I previewed all the clips in fifteen minutes. My scepticism of video triage was clearly unfounded.


Thursday, 18 June 2009

Bing in-line video previews

The new Microsoft search engine Bing has been in the news lately.

One of the facilities it provides is a video search which in itself is old hat, however the results page features in-line video previews. A user can turn safe search off and perform a search which results in a screen of thumbnails of the located videos. Hovering the mouse over a thumbnail results in a short preview of the video being played within the thumbnail.

The thumbnail videos are cached as FLV (Flash video) files. The interesting feature is that the URL host of the FLV files in my early tests was ts3.images.live.com, the ts3 part being variable. Microsoft process the video using Smart Motion Preview technology, producing in effect a trailer of the most relevant parts. On or about 12th June 2009 Microsoft began to serve all explicit video smart motion previews from ts4.explicit.bing.net. The ts4 part is variable.

These in-line video previews allow contraband material to be viewed without leaving a significant footprint. With Bing, at least, the search query is saved within the browser's internet history and the smart motion preview is cached as an FLV file with the word explicit helpfully added into the cached item's URL.

The video search at ask.com also provides in-line video previews however these previews seem to be streamed - another kettle of fish altogether!


Friday, 29 May 2009

USB Prober

From time to time the subject of linking USB flash drives to a particular PC crops up. A week or so ago I saw a post on the Guidance boards touching on this subject and chipped in with a link to a paper referencing Harlan Carvey's original research in this area. The nub of this issue is that many USB flash drives have a unique device serial number which is recorded into the registry of Windows boxes that have hosted said flash drive.

When investigating this issue, establishing a USB flash drive's device serial number may be achieved by utilising a utility such as UVCView. In our lab we use the Tableau T8 USB write blocker to do this. When checking out the subject again prior to posting to the thread on the Guidance boards referred to above I discovered that my MacBook Pro also has a utility that can establish a USB flash drive's device serial number. The utility is an application called USB Prober which is installed as part of the Xcode developer tools (which can be found on the separate DVD along with the Mac OS disc for those that have a Mac).
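As an aside for those working from a Linux analysis box rather than a Mac: the Linux kernel exposes each attached USB device's serial number through sysfs, so it can be read with no vendor utility at all. A minimal sketch follows; the base path defaults to the real sysfs location and is a parameter only so the function can be exercised on test data.

```python
# Read USB device serial numbers from Linux sysfs.  Each attached USB
# device appears as a directory under /sys/bus/usb/devices; those that
# report an iSerialNumber have a 'serial' attribute file.
from pathlib import Path

def usb_serials(base: str = '/sys/bus/usb/devices') -> dict:
    serials = {}
    for dev in Path(base).glob('*'):
        serial_file = dev / 'serial'
        if serial_file.is_file():
            serials[dev.name] = serial_file.read_text().strip()
    return serials
```

On a live system print(usb_serials()) lists bus addresses against serial numbers; devices without a serial simply have no serial file and are skipped.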

To use USB Prober for this purpose the Mac needs to be configured so that it does not mount the USB flash drive. To do this, disk arbitration needs to be turned off. In Leopard the Terminal command is:

sudo launchctl unload /System/Library/LaunchDaemons/com.apple.diskarbitrationd.plist

Once disk arbitration is turned off, simply launch USB Prober (via Spotlight is the quickest way) and drill down to the device serial number.

References
http://www.macosxforensics.com/Technologies/DiskArbitration/DiskArbitration.html
http://scissec.scis.ecu.edu.au/conference_proceedings/2007/forensics/23_Luo_Tracing_USB_Device_artefacts_on_Windows_XP.pdf
http://developer.apple.com/documentation/MacOSX/Conceptual/OSX_Technology_Overview/Tools/Tools.html


Tuesday, 5 May 2009

Helix Imaging PC

When we upgrade our Forensic Workstations we cascade the older machines onto administrative and imaging tasks. One particular ex-Forensic Workstation had supported a tape drive for a year or two but was now about to become totally redundant. Instead of suffering this fate I decided to dedicate it to running Helix. The box itself is a Supermicro chassis sporting a Supermicro X6-DAL-TG motherboard, twin Xeon Nocona 3.4 GHz processors, 2GB RAM and a hot swap drive bay.

I had read Andre Ross's blog post Installing Helix 2008R1 and Jess Garcia's How to install Helix to Disk webpage and decided that installing to hard disk was the way to go.

The process I followed to do this successfully (guided by Andre Ross's post in the main) was:

  1. Equip box with an unformatted wiped hard disk - using a partitioned (with ext2 and linuxswap) disk caused the installation routine to hang.
  2. Boot box to Helix 2008R1 CD and commence installation by going to System->Administration->Install
  3. At the point the installer hangs (Who are you screen) click cancel and then quit
  4. Commence installation routine again and create a user - I called mine Helix
  5. Configure Network Adaptor to connect to the internet via System->Administration->Network
  6. Launch Update Manager via System->Administration->Update Manager and update all packages.
  7. Applications->Forensics & IR->Root Terminal
    :~# apt-get install smbfs
    :~# apt-get install winbind

Part 1 of the job is done. A little bit of configuration is needed to make the machine more usable in its main role as an imaging machine. I am not a Linux guru so apologies for the Janet and John approach for those that are. Also my imaging machines are in a secure environment and not normally connected to the internet, so I felt relaxing security a little would be OK.

Relaxing Security

  1. System->Administration->Login Window
    On the Security tab you may wish to enable Automatic Login for the Helix user
  2. Applications->Forensics & IR->Root Terminal
  3. :~# nano /etc/sudoers
  4. Use arrow keys to scroll to end of file then type
    Helix ALL=(ALL) NOPASSWD: ALL
    (presuming helix was the name of the user account you created, if not substitute helix with the name of your account)
  5. Type CTRL+o to save then press enter then type CTRL+x to exit nano text editor. The syntax is critical - if sudoers is messed up your OS may not boot. The reason for this edit is that most of the applications we wish to use need to run as root, but user accounts do not have root privileges. This is overcome by using the sudo command, which periodically requires you to enter a password - a pain. Editing the sudoers file as shown above removes the requirement to enter a password when sudo is used.
  6. By default there are three icons in the panel (like the Windows Quick Launch toolbar) on the taskbar at the top of the desktop (Firefox, Help and Terminal). Right click on Terminal and select Remove from Panel.
  7. Access Applications->Forensics & IR->Root Terminal in the menu and right click and select Add to Panel

Imaging Applications

I work in an Encase shop so I am going to concentrate on applications that image to EWF format (aka E01 files). There are currently two applications installed that do this - Linen and EWFacquire.

Linen

Linen needs some configuration to run from the shortcut Applications->Forensics & IR->Linen. This shortcut (I think the proper linux terminology is launcher) runs a script called sl in /usr/bin. sl needs editing.

  1. Applications->Forensics & IR->Root Terminal (or click on Root Terminal in the Panel)
  2. :~# nano /usr/bin/sl
  3. Use nano to delete the line
    cp /cdrom/IR/bin/linen /usr/local/bin
  4. Type CTRL+o to save then press enter then type CTRL+x to exit nano text editor.

At this stage Linen does not reside in /usr/local/bin - we need to put an up to date copy there.

  1. On a Windows box where Encase version 6 is installed copy the Linen file from the root Encase folder within Program Files to a thumb drive.
  2. On the Helix box copy Linen from the thumb drive to /usr/local/bin as follows:
  3. Launch root terminal from panel on task bar and mount your thumb drive by clicking on its icon on the task bar and selecting Mount
  4. :~# cp /media/sdc1/linen /usr/local/bin (where sdc1 is your thumb drive)

Linen should now be launchable via the menu. But in true Windows style I created a desktop shortcut by right clicking the Linen menu item and selecting add launcher to desktop.


EWFacquire

EWFacquire is installed and will run from the root terminal. This program is part of the libewf project. The syntax is

ewfacquire /dev/sdb

where /dev/sdb is the drive to be imaged. Again I created a desktop shortcut by:

  1. Right clicking on the desktop and selecting Create Launcher
  2. Change the type to Application in Terminal
  3. Set the name appropriately
  4. In the command box type sudo /usr/bin/ewfacquire /dev/sdb
  5. Click OK


It is probably worth noting that you would not want to launch EWFacquire from the desktop launcher unless you had established the path of each drive by typing fdisk -l into the root terminal.


Guymager

Guymager is another imaging tool that utilises libewf. It is controlled from a GUI and is a desirable addition to our imaging tools. I intend to do a mini review of it along with the steps I have carried out to validate it in a forthcoming blog post. It is not included on the Helix CD but can be installed into our hard disk installation.

  1. Launch a Root Terminal
  2. :~# nano /etc/apt/sources.list
  3. Use arrow keys to scroll to end of file then type deb http://apt.pinguin.lu/i386 ./
  4. Type CTRL+o to save then press enter then type CTRL+x to exit nano text editor.
  5. Whilst still connected to the internet, type in the root terminal
  6. :~# apt-get update
  7. :~# apt-get install guymager smartmontools hdparm libewf-tools

Once the process is completed guymager can be launched from a root terminal. Again I created a desktop shortcut by:

  1. Right clicking on the desktop and selecting Create Launcher
  2. Change the type to Application in Terminal
  3. Set the name appropriately
  4. In the command box type sudo /usr/bin/guymager
  5. Click OK

Guymager utilises a configuration file - guymager.cfg. For my setup I wanted to make some changes. The program advises that changes should be made to local.cfg, however I did not have much success with this. I edited guymager.cfg with nano:

  1. Launch a Root Terminal
  2. :~# nano /etc/guymager/guymager.cfg
    and modify entries to the following
  3. Language='en'
    EwfFormat=Encase5
    EwfCompression=Best
    EwfSegmentSize=1500
  4. and in the Table LocalDevices area add a new line beneath the line of ------------
    containing the serial number of the hard disk drive where Helix is installed
    e.g. '1ATA_Maxtor_6B300S0_B605MV0H'
    The best way to establish the serial no. is probably with Guymager itself.
  5. Many other changes can be made as documented within guymager.cfg
  6. Type CTRL+o to save then press enter then type CTRL+x to exit nano text editor.


Adepto


Although Adepto does not image to EWF files I know some people use it. Some changes need to be made to get it to work.

  1. Launch a File Browser with root permissions by launching a root terminal and typing nautilus
  2. Use the file browser to navigate to /home/helix (helix being the name of the user account I created during the installation routine - if you used another account name navigate to /home/theAccountNameYouUsed )
  3. Right click or use the edit menu to create a folder then name it Adepto
  4. Double click Adepto and create a subfolder within Adepto called Logs
  5. Right click on Logs and Make Link
  6. Right click on the resulting Link to Logs and Cut
  7. Navigate to /usr/local/adepto and paste your link file
  8. Right click on the existing Logs file and delete
  9. Rename Link to logs to logs

    Adepto should work now.

Some Networking Stuff

In our lab we image to a file server running Microsoft Windows Server 2003. When I have used the Helix CDs in the past it was always a pain to image to an attached hard drive then transfer the image to the file server later. I wanted the Helix Imager to image direct to our file server and be part of our Windows Workgroup.

To do this:

  1. via System->Administration->Network configure to connect to your internal network
  2. on the windows file server create a share (I called mine Helix) and create a user named Helixuser (having done this you can apply appropriate security to this user at the Windows end)
  3. Create a mount point to the windows share by:
  4. Launch a Root Terminal
  5. :~#mkdir /media/helix
  6. :~# nano /etc/nsswitch.conf
    modify (add wins prior to dns) the following line to read

    hosts: files mdns4_minimal [NOTFOUND=return] wins dns mdns4

    Type CTRL+o to save then press enter then type CTRL+x to exit nano text editor
  7. :~# nano /etc/fstab
  8. Append the line below to the end of the fstab file

    //server/Helix /media/helix cifs username=user,password=*,iocharset=utf8,file_mode=0777,dir_mode=0777 0 0

    where server is your server name, Helix is the name of your Windows share, helix is the name of the linux mount point, user is the name of an account on your Windows server and * is substituted for whatever your password is.
  9. Type CTRL+o to save then press enter then type CTRL+x to exit nano text editor
  10. :~# mount -a
  11. Configure the way the Helix Imager box is recognised within our Windows Workgroup
  12. at the root terminal :~# nano /etc/samba/smb.conf
  13. Within the global settings area modify entries to the following
    workgroup = THENAMEOFYOURWORKGROUP
    server string = %h

Now that a mount point has been created to your Windows share, specifying /media/helix as the path to image to in Linen, EWFacquire or Guymager will output the image to the Windows file server.


Saturday, 25 April 2009

Tableau T9 Firewire write-blocker

Most forensic practitioners prefer to use hardware write-blockers over software ones. However, when the device you wish to image has only a FireWire interface the choice has been limited: hardware write-blockers for FireWire didn't exist. Now, somewhat late in the day, Tableau have introduced the Tableau T9. This write-blocker will allow you to image FireWire external storage drives as well as Apple Macs booted into target disk mode. Given the increase in Macs submitted to our lab I can see the T9 becoming very useful. Data Duplication will sell the T9 in the UK for around £240.


Wednesday, 22 April 2009

Facebook revisited and other chat related stuff

My blog post about facebook chat generated a lot more email than usual.

In particular Jad Saliba wrote about a program he has written to search for and report on Facebook chat. Jad's program is called Internet Evidence Finder and essentially, at this time, it searches for Facebook chat, Facebook pages, Yahoo chat and MSN chat. Jad points out that the program may be useful in a non-Encase shop and I agree. In fact it will be useful anywhere, as it did a very good job.

I have had some fun testing it today and found that it parses all the messages that my two previously documented methods had found. I used the program by mounting the drive image I wished to search with Encase PDE and then running the program across the mounted drive. On my box the search ran at a speed of about 27 MB/sec. The resulting spreadsheet was nicely formatted and gave the Physical Sector of each hit. Jad's program is freeware and can be found at http://www.jadsoftware.com.

With respect to MSN chat and the other chat clients, Jad's website deals with what can be achieved. In the testing I am running right now with MSN a large number of false positives have been found, however this is probably the nature of the beast.

Now before someone mentions tool validation my view is that I don't validate my tools - I validate my results. Generally I do this with dual tool verification as in the example above.

Till next time...


Wednesday, 8 April 2009

Adobe Flash Player Local Shared Objects

The value of cookies and other internet history related artifacts is well known. Not as widely commented on are the Local Shared Objects created by Adobe Flash Player. They have a .sol file extension and, on the Vista box I am looking at at least, they are stored at:


\Users\your user\AppData\Roaming\Macromedia\Flash Player\#SharedObjects\

These Local Shared Objects are data files that can be created on a computer by visited websites and in many respects are similar to cookies. It appears however that the conventional forensic software I use to analyse internet history ignores these files (I use Netanalysis and Encase v6 Comprehensive Internet History search).

To parse .sol files into a more readable form I use the SharedObject Reader plugin of FlashDevelop3.

References
http://www.adobe.com/products/flashplayer/articles/lso/
http://en.wikipedia.org/wiki/Local_Shared_Object


Friday, 3 April 2009

Garmin nüvi 200 Sat Nav device

This device has two memory chips hard wired onto the internal pcb, therefore the only regular means of accessing this memory is via the USB port. These sat nav devices will act as mass storage devices when connected via USB. I imaged one whilst connected to a Tableau USB write blocker. Please note that the time the device is switched on is recorded within current.gpx referred to below.

There are few human readable files most notably current.gpx. This file contains the users home location and user selected favourites along with the location of a number of Garmin offices. If a user saves a favourite from a location on a map the favourite will be entitled 001, 002 and so on.

There are a number of ways to investigate the contents of current.gpx. Effectively it is an XML formatted file, which I review using Microsoft XML Notepad 2007. You can also use a utility such as EasyGPS or open the file with Google Earth.

To report the contents of current.gpx I use Microsoft Excel 2007. In order to do this successfully change the file extension to xml and use the xml data import facility (Data/ From Other Sources/ From XML Data Import) allowing Excel to create the schema. You will end up with a nicely formatted table.
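As a quick alternative to the Excel import, the waypoints can be pulled straight out of current.gpx with Python's standard library. The sample fragment below is invented for illustration but follows the standard GPX 1.1 layout (waypoints as wpt elements with lat/lon attributes and a name child), not output from a real unit.

```python
# Parse GPX waypoints (name, latitude, longitude) with the stdlib.
import xml.etree.ElementTree as ET

sample = '''<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1">
  <wpt lat="51.5007" lon="-0.1246"><name>Home</name></wpt>
  <wpt lat="51.5055" lon="-0.0754"><name>001</name></wpt>
</gpx>'''

# GPX 1.1 files use a default namespace, so every lookup needs it
ns = {'gpx': 'http://www.topografix.com/GPX/1/1'}
root = ET.fromstring(sample)
waypoints = [(wpt.find('gpx:name', ns).text,
              float(wpt.get('lat')), float(wpt.get('lon')))
             for wpt in root.findall('gpx:wpt', ns)]
print(waypoints)
# -> [('Home', 51.5007, -0.1246), ('001', 51.5055, -0.0754)]
```

For a real file substitute ET.parse('current.gpx').getroot() for the sample string.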

Recently Found locations unfortunately do not appear to be saved within the user accessible memory.

The hidden service menu of the device can be accessed by turning the unit on and then holding a finger on the battery symbol on screen for 10 seconds. Once in this menu it is possible to interface with the device via USB without it behaving as a mass storage device. Garmin USB drivers are required to do this. I am not sure whether this will be useful forensically at any stage.

UPDATE
A later post relating to a StreetPilot C510 may be of some help.

References
http://en.wikipedia.org/wiki/GPS_eXchange_Format
Download nuvi 200 manual
http://www.techonline.com/showArticle.jhtml?articleID=210601020


Tuesday, 24 March 2009

Seagate Barracuda Firmware problems

As has been widely reported elsewhere many Seagate Barracuda hard disk drives have faulty firmware that causes them to effectively freeze.

In our lab we have 12 of the affected Seagate Barracuda 7200.11 1TB drives and 50 Seagate ES.2 1TB drives. Two of the 7200.11 have failed with symptoms that suggest that faulty firmware was the cause. I have flashed the firmware of all of the affected working drives and hope that I can look forward to a long fault free period.

One of the failed 7200.11 drives was one third of a Raid 0 array in one of our forensic workstations. This workstation needed an OS reinstallation (onto a separate single drive) and temporarily some data on the OS drive (normally backed up to the Raid) only existed on the Raid. During subsequent software installations a number of reboots were required triggering the firmware bug and a failed raid. Oh bother!

Normally data on our Raid 0 arrays is backed up but due to the aforementioned OS reinstall there was a small amount of data that was lost. I therefore had to find a way to unfreeze the locked Seagate Barracuda. A considerable trawl of the internet led me to this post. Gradius2 details a fix involving a significant amount of down and dirty electronics and low level hard drive programming. Not having all the necessary adapters/serial cables etc led me to call Disk Labs. They quoted £1000 - that was a no go then. Luckily Hugh Morgan at Cy4or successfully repaired the drive broadly following Gradius2's fix for significantly less. I reintroduced the drive to the two other drives in the Raid 0 array and Bob's Your Uncle!


Sunday, 22 March 2009

Monetizing Helix

The forensics community has benefitted from the free Linux forensic distro Helix3 for some time. This distro was developed by Drew Fahey and distributed via e-fense.com (archived Helix 3 website). I suppose, like many free things, how you support and develop it when you are not making money from it became an issue for e-fense. I was under the impression that a revenue stream was available via Helix3 training courses (run by CSI Tech in the UK). I know that both Nick Furneaux and Jim Gordon were very busy with these courses, and having attended one myself, I thought they were a great success.

Anyhow it seems that training provision wasn't enough. In late 2008 e-fense invited Helix forum members to make donations. Unsurprisingly take-up wasn't that great. This resulted in a slightly hectoring email from e-fense announcing that Helix3 was now only available to those who subscribed for access to their forum. The subscription is around US$20 per month. So be it, but as someone who has already paid circa US$1000 for a training course for a product I cannot now download without a subscription, I am left feeling slightly disappointed.

Nothing stands still in this arena however. I have posted in the past about WinFE and some subsequent comments led me to a Grand Stream Dreams blog post written by Claus Valca. He referred to two free forensic Linux distros:

Perhaps one of these is the new helix?

It seems one or two others have commented on the same subject - they are not planning to subscribe either.

I noticed a bit late in the day that there is an extensive thread over at Forensic Focus about this issue also.


Friday, 20 March 2009

Facebook Chat Forensics

Background
Facebook has a built in instant messaging facility which has grown in popularity along with the Facebook social networking site itself. Many cases involve potential grooming offences in which the use of instant messaging needs to be investigated.

The instant messaging facility creates a number of artefacts which are easily found and which I know have been commented on elsewhere. The purpose of this blog post is to suggest a methodology to automate the discovery and reporting of Facebook messages.

For those who have not looked at this area in detail yet messages are cached in small html files with a file name P_xxxxxxxx.htm (or .txt). These messages can be found in browser cache, unallocated clusters, pagefiles, system restore points, the MFT as resident data and possibly other places. It is possible for the messages to be cached within the main Facebook profile page (although I have never seen them there - the main facebook page does not seem to be cached that often).

An example of a message is shown below:

for (;;);{"t":"msg","c":"p_1572402994","ms":[{"type":"msg","msg":{"text":"Another Message","time":1237390150796,"clientTime":1237390150114,"msgID":"3078127486"},"from":212300220,"to":1123402994,"from_name":"Mark PPPPPP","to_name":"Richard XXXX","from_first_name":"Mark","to_first_name":"Richard"}]}

The bulk of the message is in fact formatted as JavaScript Object Notation, normally referred to as JSON. JSON is a text-based and human-readable way of representing data structures. The timestamps are 13 digit Unix timestamps that include milliseconds - they can be divided by 1000 to get a standard Unix timestamp.
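As a rough sketch of what any parsing route has to do, the example message above can be decoded with a few lines of Python: strip the leading "for (;;);" (widely reported to be an anti-JSON-hijacking prefix), parse the JSON, and divide the millisecond timestamp by 1000.

```python
# Minimal sketch: decode one cached Facebook chat message of the form
# shown above.
import json
from datetime import datetime, timezone

raw = ('for (;;);{"t":"msg","c":"p_1572402994","ms":[{"type":"msg",'
       '"msg":{"text":"Another Message","time":1237390150796,'
       '"clientTime":1237390150114,"msgID":"3078127486"},'
       '"from":212300220,"to":1123402994,"from_name":"Mark PPPPPP",'
       '"to_name":"Richard XXXX","from_first_name":"Mark",'
       '"to_first_name":"Richard"}]}')

# strip the anti-hijacking prefix, then parse the remainder as JSON
payload = json.loads(raw[len('for (;;);'):])
for m in payload['ms']:
    text = m['msg']['text']
    # 13-digit timestamp includes milliseconds; divide by 1000
    when = datetime.fromtimestamp(m['msg']['time'] / 1000, tz=timezone.utc)
    print(f"{m['from_name']} -> {m['to_name']} | "
          f"{when:%Y-%m-%d %H:%M:%S} | {text}")
# -> Mark PPPPPP -> Richard XXXX | 2009-03-18 15:29:10 | Another Message
```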

Although keyword searches will find these messages they are difficult to review, particularly if you are only interested in communication between selected parties. Having found relevant hits you then have to create a sweeping bookmark for each one. For these reasons I use the following methodology.

Suggested Methodology

  • Create a Custom File Type within the Encase Case Processor File Finder module entitled Facebook Messages using the Header "text":" and the footer }]} making sure GREP is not selected.

  • Run the file finder with the Facebook Messages option selected.
  • When the file finder completes you will have a number of text files in your export directory (providing there are messages to be found).
  • These text files are in the form of the example above. They do not have Carriage Return and Line Feed characters at the end of the text. We need to remedy this by utilising a DOS command at the command prompt.
  • At the command prompt navigate to the directory containing your exported messages (please note Encase creates additional sub directories beneath your originally specified directory).
  • Then run the following command:
    FOR %c in (*.txt) DO (Echo.>>%~nc.txt)
    This command adds a Carriage Return and Line Feed to the end of the extracted message.
  • Next we want to concatenate the message text files into one file using the command at the DOS prompt: copy *.txt combined.txt
  • Alternatively create (or email me for) a batch file that executes these two commands direct from windows.
  • An additional file combined.txt will be created in your export directory.
  • Launch Microsoft Excel and instigate the Text Import Wizard, specifying Delimited with the Delimiter being a comma and the text qualifier set to the double quote character (").
    Put the data into your worksheet (or cell J3 of my pre-formatted worksheet).
  • All that's needed now is to tidy up the worksheet with some Excel formulas, the full details of which can be found within my example pre-formatted worksheet. The formula to process the time values (which are Unix time stamps) is =(RIGHT(K2,13))/1000/86400+25569 where K2 is the cell containing the source time data.
  • Perform a sanity check and remove obviously corrupt entries.
  • After applying a data sort filter you can sort by time or user.

  • The spreadsheet also allows you to de-duplicate the found messages. In my recent case over half the recovered messages were duplicates. In Excel 2007 these duplicate (rows) are easily removed (Data/DataTools/Remove Duplicates). In Excel 2003 an add-in called The Duplicate Master will do this for you.
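For anyone wanting to sanity check the time formula used in the worksheet, here is the same arithmetic in Python. 25569 is the Excel (1900 date system) serial number for 1 January 1970, and the timestamp is the one from the example message earlier in the post.

```python
# Cross-check of the worksheet formula (RIGHT(K2,13))/1000/86400+25569:
# milliseconds -> seconds -> days since the Unix epoch, plus the Excel
# date serial for 1 Jan 1970.
from datetime import datetime, timedelta, timezone

EXCEL_SERIAL_1970 = 25569  # Excel serial number for 1 January 1970

def excel_serial(ms_timestamp: int) -> float:
    return ms_timestamp / 1000 / 86400 + EXCEL_SERIAL_1970

def serial_to_datetime(serial: float) -> datetime:
    days = serial - EXCEL_SERIAL_1970
    return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(days=days)

serial = excel_serial(1237390150796)  # timestamp from the example message
print(serial_to_datetime(serial).strftime('%Y-%m-%d %H:%M:%S'))
# -> 2009-03-18 15:29:10
```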

Further Thoughts
Non-Encase users may be able to use an alternative file carver (e.g. Blade) to carve out the messages. I am sure that the header and footer could be refined a bit to reduce false positives, however for me the ratio of legitimate versus false positives is OK. UPDATE 22nd April 2009 - non-Encase users may wish to look at my more recent post.

I have the pre-formatted spreadsheet in template form. Please email me for a copy (with a brief explanation of who you are - thanks).

To further investigate the data you recover you may wish to check out http://www.facebook.com/profile.php?id=xxxxxxx. Just substitute the xxxxxxx with the User IDs you recovered.

Enscript Method
I have collaborated with Simon Key and now have an enscript to parse out JSON objects including messages. It outputs to a CSV spreadsheet and in my tests parsed 160GB in about an hour. It might not be as tolerant of corrupt strings as the method detailed above. The script will only run in version 6.13 or newer. I have a template that tidies up the formatting of the CSV - email me if you want a copy.

References and thanks
http://coderrr.wordpress.com/2008/05/06/facebook-chat-api
http://video.google.com/videoplay?docid=6088074310102786759
http://json.org/
http://en.wikipedia.org/wiki/JSON
http://www.wilsonmar.com/datepgms.htm#UNIXStamp

Thanks to Glenn Siddall for sparking my interest and providing me with some notes of his research.
Thanks to Mark Payton for his assistance in researching this.



Thursday, 19 March 2009

The need for speed

We are lucky in our lab that our workstations are upgraded on a regular basis so once in use we don't often make many changes.

The most important bits of my current spec are as follows:

  1. Supermicro X7-DWA-N fitted into a Supermicro CSE-733TQ-645 chassis
  2. Two Intel Xeon X5482 processors
  3. 16GB DDR2 800 Ram
  4. Western Digital 300GB VelociRaptor10,000 rpm hard drive for the OS
  5. 3 x 1TB Samsung HE103UJ Spinpoint F1 hard drives in a RAID 0 array
  6. Microsoft XP 64 bit
  7. 256MB ATI FirePro V3700 GPU

At one time our primary forensic software Encase would max out the processor when carrying out pretty much any process. With the advent of multi-core dual processors we aim to max out one core (which on my box is 13% CPU utilisation in Task Manager). As processors get faster and faster I have noticed that often the CPU core is not maxing out. Something else is slowing us down!

We store our Encase evidence files on the Raid 0 array (and just before someone posts a comment about the lack of data resilience etc., the way our lab is set up all the data on my Raid 0 array is mirrored elsewhere). We do this for speed and capacity. When Encase (and most other forensic utilities for that matter) is processing it has a voracious appetite for data - just look at the Read Bytes value in Task Manager. The multi-core processors allow us to run other forensic programs (FTK, NetAnalysis, HstEx etc.) alongside Encase - we can even run other instances of Encase - and because we can, we do. The net result of all these programs running is that they compete to read data from the Raid 0 array in my case (and from wherever you store yours in yours); once your data storage is maxed out things slow down. It follows then that performance can be increased by having faster data storage.

One way to achieve this would be faster hard drives. We use SATA hard drives for capacity reasons and to an extent cost. SAS hard drives are faster but don't provide the capacity. So as things stood, three hard drives in a Raid 0 array was the best that could be done. I decided to see how I could make some improvement.

Currently the three hard drives (and the OS drive) connect to the Intel ESB2 raid controller integrated on the motherboard. Conventional wisdom would have it that adding a fourth hard drive to the Raid 0 array would speed things up.

HD Tach details an average sequential read speed of around 200 MB/s for a three drive array utilising the default stripe size (128kb) with NTFS formatted with the default cluster size.



Adding a fourth drive slowed the sequential read speed to around 180 MB/s.



I tested a variety of different stripe sizes and aligned the partitions, but came to the conclusion that the Intel ESB2 controller just does not scale up to four drives very well. The arrays were created via the utility accessed through the controller BIOS during boot up. This utility is very basic and does not allow much configuration. Intel also provides a Windows utility called Intel Matrix Storage Console. When running this utility I found that by default Volume Write-Back Cache was disabled. Enabling it made a significant improvement.




Conventional wisdom has it that a hardware Raid controller would improve performance over the Intel ESB2, and in my testing this seems to be the case. I used an Areca 1212 PCI-E raid card and achieved a sequential read speed of over 600 MB/s.




This array has four 1TB sata hard drives with a 64kb stripe, is partition aligned at 3072 sectors and has one NTFS volume with the default cluster size. Using Syncback to write to the array from our file server across a copper gigabit ethernet network produces some pretty impressive network utilization stats.