How To

This page is devoted to helpful tips and instructional videos on how to use free utilities and commercial software for eForensic Collection, eDiscovery, and Litigation Review, subject to this Disclaimer.

Java App Extract LDAP Information from Google Apps Directory

Tool to Extract LDAP info from Google Apps Directory.

Get the app here:

It needs the following 5 arguments in the given order.

1. Data needed – “a” for both org unit data and user data, “u” for only user data, and “o” for only org unit data.

2. Email – Email Id of the admin user for the domain

3. Password – Password of the admin user for the domain

4. Domain name – Name of the domain

5. Directory name – Directory where the files are to be stored

Two files, Users.csv and OrgUnits.csv will be created. If those files already exist in the given directory, they will be deleted. If deletion is not successful, the tool will prompt users to delete them manually and continue.

E.g. (with placeholder values): java -jar AppSyncTool.jar a admin@example.com password example.com "f:/new folder/"

Note: For this tool to run, the API access needs to be enabled in the admin console.
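The tool itself is a Java jar, but the five-argument interface above is simple to sketch. The Python fragment below is an illustration only, not the AppSyncTool source; all names and placeholder values are hypothetical:

```python
VALID_MODES = {"a", "u", "o"}  # a = org units + users, u = users only, o = org units only

def parse_args(argv):
    """Check the five positional arguments, in the order the tool expects."""
    if len(argv) != 5:
        raise ValueError("expected 5 arguments: mode, email, password, domain, directory")
    mode, email, password, domain, directory = argv
    if mode not in VALID_MODES:
        raise ValueError("first argument must be 'a', 'u', or 'o'")
    return {"mode": mode, "email": email, "password": password,
            "domain": domain, "directory": directory}

# Mirrors the example command line above (placeholder values).
example = parse_args(["a", "admin@example.com", "password", "example.com", "f:/new folder/"])
```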

How to Install and Run on Windows:

1) Make sure Java is installed

2) Run cmd.exe from the Start menu

3) Add the directory containing java.exe to the path:

path=%path%;C:\Program Files (x86)\Java\jre7\bin

4) Run the program as shown in the example above

eForensic Collection (with logging)

This section explains a free self-help utility that can help keep a proper chain of custody when clients choose to do a self-collection, by keeping forensically sound log files.

robocopy (comes free with Windows 7) – with GUI (similar in features to robocopy, but not based on it)

See our blog post here for a free utility that inventories all of the drives on a Windows machine.


Quick Tutorial video on YouTube

Note that SafeCopy from PinPoint Labs uses the robocopy engine to preserve directory and file name time stamps, and metadata (author, modification date, security and auditing attributes).

Simply cut and paste the commands below into a command prompt window (cmd.exe), which comes with Windows 7.

Perform a complete copy with defensible logs

The example below copies all of the files from the source drive (e: in this example), including recursive sub directories, and logs the results into the current directory in filecopy.txt. To enhance performance, the command kicks off 128 simultaneous copying threads (/MT:128). The /TEE option shows the output on the console in addition to the logging performed.

robocopy e: . /COPY:DAT /E /V /TS /FP /BYTES /ETA /R:1000000 /W:30 /LOG:filecopy.txt /MT:128 /TEE
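To make clear what the log captures, here is a rough Python sketch of the same idea: a recursive copy that preserves timestamps (like /COPY:DAT) and appends one line per file with modified time, byte size, and full source path (like /TS /FP /BYTES). The function name is our own, and this is only an illustration, not a substitute for robocopy's logging:

```python
import datetime
import os
import shutil

def copy_with_log(src, dst, log_path):
    """Recursively copy src into dst, preserving timestamps, and append
    one log line per file: modified time, size in bytes, full source path."""
    with open(log_path, "a", encoding="utf-8") as log:
        for root, _dirs, files in os.walk(src):
            rel = os.path.relpath(root, src)
            target = dst if rel == "." else os.path.join(dst, rel)
            os.makedirs(target, exist_ok=True)
            for name in files:
                src_file = os.path.join(root, name)
                shutil.copy2(src_file, os.path.join(target, name))  # copy2 keeps mtime
                st = os.stat(src_file)
                stamp = datetime.datetime.fromtimestamp(st.st_mtime).isoformat()
                log.write(f"{stamp}\t{st.st_size}\t{src_file}\n")
```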

Make a File Listing Only

This option will make a listing of all files, but will not copy them.

robocopy e: . /COPY:DAT /E /L /V /TS /FP /BYTES /ETA /R:1000000 /W:30 /LOG:filecopy.txt /MT:128

Take Inventory of Directories

This is helpful for documenting which workstations/custodians backed-up data, or data on a file server, belongs to. The output can easily be put into a spreadsheet for making comments and assigning the custodian who was in charge of a particular source of data.

The options below take a listing from the d: drive, omit individual file names from the listing, and provide a concise summary of the most important directories (5 levels deep, which can be changed with /LEV). Output is put into the current directory in directorylist.txt.

robocopy d: . * /NFL /L /S /E /COPY:DAT /LEV:5 /R:1000000 /W:30 /LOG:directorylist.txt
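As a cross-check of the /LEV:5 behavior, here is a small Python sketch (our own helper, unrelated to robocopy) that inventories directories down to a fixed depth:

```python
import os

def list_dirs(root, max_depth=5):
    """Return directory paths up to max_depth levels below root, roughly
    what robocopy's /LEV:5 with /NFL summarizes."""
    found = []
    root = os.path.abspath(root)
    base_depth = root.rstrip(os.sep).count(os.sep)
    for path, dirs, _files in os.walk(root):
        depth = path.rstrip(os.sep).count(os.sep) - base_depth
        if depth >= max_depth:
            dirs[:] = []  # stop descending past the depth limit
            continue
        for d in dirs:
            found.append(os.path.join(path, d))
    return found
```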

Robocopy with hash verification

One of the limitations of Robocopy is that it uses CRC (cyclic redundancy checking) for file integrity verification, as opposed to MD5 or SHA-1 hash values that can be logged and forensically preserved. To overcome this problem, there is another utility from Microsoft called FCIV (File Checksum Integrity Verifier). There is also a BETA version of a Windows PowerShell script which combines the functionality of both Robocopy and FCIV into a single script, and you can find the source code for that script here

For a simple batch file, you can use a script like this:


Microsoft File Integrity Verification

@echo off

:: ##### Set these variables before running the batch file ###############
set Project_Alias=AB561
set Asset_Tag=KO_334455
set Filter=IncludeFilter.txt
set Source_Path=\\server\LTshare\Test
set Dest_Path=F:\AB561
set Log_Name=Source_Log.txt
:: The log file is created first and the source information is entered.
:: After the SHA1 hash values are calculated they are appended to the log.
:: IncludeFilter.txt contains the file extensions that will be used as a filter for the source.
(
echo: ===============================
echo: Source Hash - New Log Entry
echo: ===============================
echo: Source - Collection Specifics:
echo: Project Alias ..........%Project_Alias%
echo: Asset Tag ..............%Asset_Tag%
echo: Filter .................%Filter%
echo: Source_Path ............%Source_Path%
echo: System Date.............%date% %time%
echo: Computer Name...........%computername%
echo: User Logged On..........%USERNAME%
echo: User Domain ............%USERDOMAIN%
echo: SHA1 Hash Values Source Path/File Name
echo: ======================================= ===========================
) >> %Dest_Path%\Logs\%Log_Name%
:: Example: robocopy /njs /njh /ndl /ns /nc /l C:\temp c:\dummy
:: C:\temp is the source and c:\dummy is a dummy destination call so RoboCopy will not error out
echo Hash using the following values:
echo: Project Alias ..........%Project_Alias%
echo: Asset Tag ..............%Asset_Tag%
echo: Filter .................%Filter%
echo: Source Path ............%Source_Path%
echo: Destination Path........%Dest_Path%\Logs\%Log_Name%
echo If this is correct:
%Dest_Path%\Scripts\robocopy /njs /njh /ndl /ns /nc /l %Source_Path% c:\dummy
echo Calculating hash values for %Source_Path%
for /f "usebackq tokens=* delims= " %%a in (%Filter%) do (
fciv.exe -add "%Source_Path%" -type "%%a" -r -sha1 | find "" >> "%Dest_Path%\Logs\%Log_Name%"
)
echo -----The Calculation IS COMPLETE--------
echo %Dest_Path%\Logs\%Log_Name%
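For readers who want the same SHA-1 log outside of Windows, here is a minimal Python sketch of what FCIV's -r -sha1 pass produces: a recursive walk that hashes files matching an extension filter and appends hash/path lines to a log. The function and parameter names are our own illustration, not part of FCIV or the batch file above:

```python
import hashlib
import os

def sha1_log(source_path, extensions, log_path):
    """Recursively hash files under source_path whose extension is in
    `extensions` (e.g. {".txt", ".doc"}), appending "<sha1>  <path>"
    lines to the log file."""
    with open(log_path, "a", encoding="utf-8") as log:
        for root, _dirs, files in os.walk(source_path):
            for name in files:
                if os.path.splitext(name)[1].lower() not in extensions:
                    continue
                full = os.path.join(root, name)
                h = hashlib.sha1()
                with open(full, "rb") as f:
                    # Read in chunks so large evidence files do not fill memory.
                    for chunk in iter(lambda: f.read(65536), b""):
                        h.update(chunk)
                log.write(f"{h.hexdigest()}  {full}\n")
```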


eForensic PST Processing (with logging)

Free utility from the Library of Congress to process PSTs

This utility is good for extracting email messages and attachments into individual files, instead of doing it manually.   It also keeps a log file which notes any errors or problems extracting each email message from the .PST file.   The metadata and email body are stored in a separate file from the attachments.   We are working on some enhancements to this utility to make it easier to use, and will update this section in Q1 2013 with these changes.
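Reading a .PST directly requires specialized libraries, so as an illustration of the same extract-and-log pattern, here is a Python sketch using the mbox format, which the standard library can read natively. This is our own example, not the Library of Congress utility:

```python
import mailbox
import os

def extract_mbox(mbox_path, out_dir):
    """Write each message body to its own file, logging any per-message
    extraction errors instead of aborting the whole run."""
    os.makedirs(out_dir, exist_ok=True)
    errors = []
    for i, msg in enumerate(mailbox.mbox(mbox_path)):
        try:
            body = msg.get_payload(decode=True) or b""
            with open(os.path.join(out_dir, f"msg_{i:05d}.txt"), "wb") as f:
                f.write(body)
        except Exception as exc:  # a bad message is logged, not fatal
            errors.append(f"message {i}: {exc}")
    with open(os.path.join(out_dir, "extract_log.txt"), "w", encoding="utf-8") as f:
        f.write("\n".join(errors))
    return len(errors)
```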



DtSearch Desktop ($300)

This program runs on your desktop, and is an extremely high-performance indexing and search utility at a very reasonable price. It is great for ingesting large amounts of data (hundreds of gigabytes, even terabytes) and making it searchable. This is useful for generating reports that show the number of responsive files, and the search hits for each, when testing a keyword search plan. It is able to show the number of hits on a per-document basis, and on an individual email basis even for .PST files. There is a limitation for archive (.zip and .rar) files in that hits show up as responsive to the entire archive, as opposed to the individual files in it. Also, there is no de-duplication functionality, which can be considered a positive in that it significantly shortens the time required to perform indexing and searching.
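For small test sets, a per-keyword report of this kind can be mocked up in a few lines. The sketch below is our own and has nothing to do with dtSearch's engine or index format; it simply counts responsive files and total hits per term:

```python
import re

def hit_report(paths, keywords):
    """For each keyword, count responsive files and total hits across
    the given plain-text files (case-insensitive literal matching)."""
    report = {}
    for kw in keywords:
        pattern = re.compile(re.escape(kw), re.IGNORECASE)
        responsive, hits = 0, 0
        for path in paths:
            with open(path, "r", encoding="utf-8", errors="ignore") as f:
                n = len(pattern.findall(f.read()))
            if n:
                responsive += 1
                hits += n
        report[kw] = {"responsive_files": responsive, "total_hits": hits}
    return report
```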

Free Near Duplicate Viewer

Windiff.exe is available free from Microsoft. Organize your near duplicates in a single directory and let this utility highlight the differences.
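For a quick programmatic pass before opening WinDiff, Python's standard difflib can score how similar two documents are. A sketch (the threshold and function names are our own choices):

```python
import difflib

def similarity(a, b):
    """Similarity ratio in [0.0, 1.0]; near duplicates score close to 1.0."""
    return difflib.SequenceMatcher(None, a, b).ratio()

def near_duplicate_pairs(texts, threshold=0.8):
    """Return index pairs of documents whose similarity meets the threshold."""
    pairs = []
    for i in range(len(texts)):
        for j in range(i + 1, len(texts)):
            if similarity(texts[i], texts[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```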