Security Basics mailing list archives

Re: Hard Drive Forensics Question


From: "Morgan Reed" <morgan.s.reed () gmail com>
Date: Tue, 7 Oct 2008 08:45:49 +1100

On Sat, Oct 4, 2008 at 3:17 AM, Razi Shaban <razishaban () gmail com> wrote:
> Perhaps it would be a good idea to copy+paste+delete a few very large
> random files on there (occupying 99.5% of the drive) a few times, just
> in case. If he feels the random data files would appear suspicious,
> copy the largest files on the drive a few times. This will help make
> it considerably more difficult to recover any data that may have been
> on the hard drive.

A single large file (or a number of large files) that does not consume
the entirety of the unallocated space will not necessarily help. File
system drivers generally try very hard to avoid fragmentation (I'm not
sure about the allocation strategy in HFS+, though), which means you
will likely miss the blocks in the middle of the pre-existing data.
Those are the blocks most likely to contain fragments of the files in
question, although given their location on disk they are also more
likely to have been overwritten in the intervening six months anyway,
hence my original remark.

You are better off either creating a large number of files the same
size as the allocation unit on the disk (i.e. if you have 32KB blocks,
make the files 32KB each), or filling the entirety of the free space
with random data:

# dd if=/dev/urandom of=/randomblob bs=32k; rm -f /randomblob

(Note that dd exits non-zero once it hits "No space left on device",
so the cleanup is chained with ";" rather than "&&", which would never
actually run the rm.)
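For the first option, here is a minimal sketch assuming a POSIX shell.
The 100-file cap and the temp directory are purely for safe
demonstration; on a real volume you would drop the cap, point TARGET at
the filesystem you want to scrub, and simply loop until dd fails:

```shell
# Sketch of the many-small-files approach: write one allocation-unit-
# sized (here 32 KB) random file after another until the volume fills.
# Capped at 100 files in a temp directory for demonstration only.
TARGET=$(mktemp -d)
i=0
while [ "$i" -lt 100 ]; do
    i=$((i + 1))
    # dd exits non-zero once the disk is full; stop at that point
    dd if=/dev/urandom of="$TARGET/fill.$i" bs=32k count=1 2>/dev/null || break
done
echo "wrote $(ls "$TARGET" | wc -l) filler files"
rm -rf "$TARGET"   # deleting them frees the (now overwritten) space again
```

Files of allocation-unit size can land in the fragmented gaps between
existing data that a single large contiguous file would skip over.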

If you were to do the same with /dev/zero you would still achieve the
desired result; however, all of the vacant space being filled with
zeros prior to a forensic exam would tend to look suspicious to the
investigator. Completely random data would too, but to a much lesser
degree, since I'd generally expect to see things like file headers
floating about in unallocated space.

