Forensics Walkthrough (DefCon CTF 2008 Qualifiers)
This category is always lots of fun. It required a good deal of attention to detail, creative thinking, and a wide knowledge base. Professional tools can sometimes be a serious help, especially when dealing with disk images and archives.
100: delete your FAT
The output of "file" and "hexdump" shows this to likely be a FAT disk image:
$ file forensics100-b564a4eca4e1ba6e5c4f626bdd1bcedd 
forensics100-b564a4eca4e1ba6e5c4f626bdd1bcedd: x86 boot sector
$ hexdump -C forensics100-b564a4eca4e1ba6e5c4f626bdd1bcedd | head -n 1
00000000  eb 3c 90 4d 53 44 4f 53  35 2e 30 00 02 02 02 00  |.<.MSDOS5.0.....|
After mounting it and examining the contents, we see a ton of tiny text files, all with various phrases in them:
$ sudo mount -o loop forensics100-b564a4eca4e1ba6e5c4f626bdd1bcedd /mnt/floppy
$ tree /mnt/floppy
/mnt/floppy
|-- Recycled
|   |-- desktop.ini
|   `-- info2
|-- System Volume Information
|   `-- _restore{78AFAA6E-2D47-43CE-90B8-D2663E0AEB92}
`-- ken
    `-- sho
        `-- to
            |-- AVPNFmgZbN
...
            `-- yxHvjLtvHO

6 directories, 91 files
$ cat /mnt/floppy/Recycled/info2
 :\ken\sho\to\XXXXXXXXXX���w��D:\ken\sho\to\KEyFFiLLeED:\ken\sho\to\XXXXXXXXXX ���y��D:\ken\sho\to\KeyYFiLLeE
$ cat /mnt/floppy/ken/sho/to/yxHvjLtvHO
wound around the axle bug
This was starting to look a lot like the 2006 Forensics 300 question. While that one included compression, this question seemed to try to hide the deleted file contents by making it hard to find the needed phrase. We took a different approach: anything listed in "strings" that isn't in the ken/sho/to files was probably the key phrase:
$ strings -a forensics100-b564a4eca4e1ba6e5c4f626bdd1bcedd | while read line; \
    do grep -q "$line" /mnt/floppy/ken/sho/to/* || echo "$line"; done
MSDOS5.0
NO NAME    FAT16   3
|8N$}$
...
AVPNFM~1
CLSID={645FF040-5081-101B-9F08-00AA002F954E}
:kenshotoXXXXXXXXXX
D:kenshotoXXXXXXXXXX
kentucky ham
XIFXTC~1
...
The "kentucky ham" phrase sure stands out.
200: Compress your compression with some compression compresses
It all started innocently enough:
$ file forensics200-20c6d7dee480b31b28802f2c3b951313
forensics200-20c6d7dee480b31b28802f2c3b951313: LHarc 1.x/ARX archive data [lh0]
"Ah, an LHA archive. No problem, I can uncompress that."
$ sudo apt-get install lha
...
$ lha x ../forensics200-20c6d7dee480b31b28802f2c3b951313 # 1
key     - Melted   :  ooooo
$ file key
key: PPMD archive data
What the hell is that? Luckily, Debian and Ubuntu have a pile of packaged archivers:
$ apt-cache search ppmd
ppmd - fast archiver program with good compression ratio
$ sudo apt-get install ppmd
...
$ mv key key.ppmd
$ ppmd d key.ppmd # 2
Fast PPMII compressor for textual data, variant I, Jun 21 2006
           key:   9386 >   9156, 8.20 bpb, used:  0.1MB, speed: 915 KB/sec 
$ file key
data
Dang, up to us now.
$ hexdump -C key | head -n 8
00000000  44 43 54 c3 00 08 29 00  00 00 00 00 00 23 8c 00  |DCT...)......#..|
00000010  00 00 01 00 00 23 91 00  00 00 00 14 04 00 03 6b  |.....#.........k|
00000020  65 79 00 00 04 d7 3b 6f  12 01 00 04 d7 3b 6f 12  |ey....;o.....;o.|
00000030  00 23 91 1a 02 6b 65 79  00 00 b1 07 28 00 00 00  |.#...key....(...|
00000040  00 80 6d 23 00 00 bb 38  64 11 ee 81 6d 23 00 00  |..m#...8d...m#..|
00000050  60 ea 2b 00 22 0b 01 02  10 00 02 64 bd 6d 3b 48  |`.+."......d.m;H|
00000060  bd 6d 3b 48 00 00 00 00  00 00 00 00 00 00 00 00  |.m;H............|
00000070  00 00 00 00 00 00 6b 65  79 2e 61 72 6a 00 00 42  |......key.arj..B|
Hm. No idea what DCT is, but it seems to store file contents in the clear, since we can read both "key" and "key.arj". Maybe ARJ isn't picky about leading junk in front of its header...
$ sudo apt-get install arj
...
$ mv key key.arj
$ arj x key.arj # 3, 4
ARJ32 v 3.10, Copyright (c) 1998-2004, ARJ Software Russia. [10 Dec 2007]

Processing archive: key.arj
Archive created: 2008-05-26 19:11:09, modified: 2008-05-26 19:11:09
Extracting key                         OK        
     1 file(s)
$ file key
key: 7-zip archive data, version 0.2
$ sudo apt-get install p7zip
...
$ mv key key.7z
$ p7zip -d key.7z # 5
p7zip Version 4.57 (locale=C,Utf16=off,HugeFiles=on,4 CPUs)

Processing archive: key.7z

Extracting  key

Everything is Ok

Size:       8771
Compressed: 8951
$ file key
key: bzip2 compressed data, block size = 900k
$ mv key key.bz2
$ bunzip2 key.bz2 # 6
$ file key
key: data
$ hexdump -C key | head -n 8
00000000  00 e9 55 43 4c ff 01 1a  00 00 00 01 2d 07 00 04  |..UCL.......-...|
00000010  00 00 00 00 20 46 00 00  20 46 1f 8b 08 08 bc 6d  |.... F.. F.....m|
00000020  3b 48 00 03 6b 65 79 00  15 97 89 7f d5 54 fe fe  |;H..key......T..|
00000030  cf c9 be 27 dd 6f f7 74  01 0a 14 08 58 a0 40 29  |...'.o.t....X.@)|
00000040  69 41 a8 80 18 10 b0 0a  23 61 11 0a b2 44 a8 5a  |iA......#a...D.Z|
00000050  64 f1 dc 7b 0b 14 28 18  14 b1 2a 3a 61 fd 56 40  |d..{..(...*:a.V@|
00000060  27 8c 8c 76 b4 3a b9 80  d0 41 d4 20 38 56 65 98  |'..v.:...A. 8Ve.|
00000070  b0 88 55 19 8d 23 62 9d  41 fd f5 f7 27 9c f3 fa  |..U..#b.A...'...|
UCL? Don't know that one yet.
$ apt-cache search ucl
libucl-dev - Portable compression library - development
libucl1 - Portable compression library - runtime
...
$ apt-cache showsrc libucl1 | grep ^Binary
Binary: libucl1, libucl-dev
$ sudo apt-get install libucl1 libucl-dev
...
$ dpkg -L libucl-dev libucl1 | grep bin 
$
Hmm, no binaries to actually use UCL. But looking more closely:
$ dpkg -L libucl-dev libucl1
...
/usr/share/doc/libucl-dev/examples/Makefile
/usr/share/doc/libucl-dev/examples/uclpack.c.gz
...
$ cp /usr/share/doc/libucl-dev/examples/Makefile /usr/share/doc/libucl-dev/examples/uclpack.c.gz /usr/share/doc/libucl-dev/examples/portab.h .
$ make uclpack
gzip -d uclpack.c.gz
gcc -O2   -c -o uclpack.o uclpack.c
gcc -lucl  uclpack.o   -o uclpack
$ mv key key.ucl
$ ./uclpack -d key.ucl key # 7

UCL data compression library (v1.03, Jul 20 2004).
Copyright (C) 1996-2004 Markus Franz Xaver Johannes Oberhumer
http://www.oberhumer.com/opensource/ucl/

uclpack: block-size is 262144 bytes
uclpack: decompressed 8296 into 8262 bytes
$ file key
key: gzip compressed data, was "key", from Unix, last modified: Mon May 26 19:11:08 2008
$ mv key key.gz
$ gunzip key.gz # 8
$ file key
key: compress'd data 16 bits
$ mv key key.Z
$ uncompress key.Z # 9
$ file key
key: RAR archive data, v1d, os: Unix
$ sudo apt-get install rar
...
$ mv key key.rar
$ rar x key.rar # 10

RAR 3.71   Copyright (c) 1993-2007 Alexander Roshal   20 Sep 2007
Shareware version         Type RAR -? for help


Extracting from key.rar

Extracting  key                                                       OK 
All OK
$ file key
key: Microsoft Cabinet archive data, 6474 bytes, 1 file
$ sudo apt-get install cabextract
...
$ mv key key.cab
$ cabextract key.cab # 11
Extracting cabinet: key.cab
  extracting key

All done, no errors.
$ file key
key: rzip compressed data - version 2.1 (6316 bytes)
$ sudo apt-get install rzip
...
$ mv key key.rz
$ rzip -d key.rz # 12
$ file key
key: Squeezed (apple ][) data
Now that's going to take some more research. After digging around for a while, we found an Apple ][ unsqueezer.
$ wget http://www.apple2.org.za/mirrors/ftp.gno.org/unix.tools/usq.310.tar.Z
$ tar Zxf usq.310.tar.Z
$ (cd sciibin; make)
...
$ mv key key.sq
$ ./sciibin/usq key.sq # 13
$ file key
key: ARJ archive data, v11, slash-switched, original name: , os: Unix
$ mv key key.arj
$ arj x key.arj # 14
ARJ32 v 3.10, Copyright (c) 1998-2004, ARJ Software Russia. [10 Dec 2007]

Processing archive: key.arj
Archive created: 2008-05-26 19:11:08, modified: 2008-05-26 19:11:08
Extracting key                         OK        
     1 file(s)
$ file key
key: bzip2 compressed data, block size = 900k
$ mv key key.bz2
$ bunzip2 key.bz2 # 15
$ file key
key: compress'd data 16 bits
$ mv key key.Z
$ uncompress key.Z # 16
$ file key
key: 7-zip archive data, version 0.2
$ mv key key.7z
$ p7zip -d key.7z # 17

7-Zip (A) 4.57  Copyright (c) 1999-2007 Igor Pavlov  2007-12-06
p7zip Version 4.57 (locale=C,Utf16=off,HugeFiles=on,4 CPUs)

Processing archive: key.7z

Extracting  key

Everything is Ok

Size:       3636
Compressed: 3761
$ file key
key: Microsoft Cabinet archive data, 3636 bytes, 1 file
$ mv key key.cab
$ cabextract key.cab # 18
Extracting cabinet: key.cab
  extracting key

All done, no errors.
$ file key
key: RAR archive data, v1d, os: Unix
$ mv key key.rar
$ rar x key.rar # 19

RAR 3.71   Copyright (c) 1993-2007 Alexander Roshal   20 Sep 2007
Shareware version         Type RAR -? for help


Extracting from key.rar

Extracting  key                                                       OK 
All OK
$ file key
key: data
$ hexdump -C key | head -n 1
00000000  00 e9 55 43 4c ff 01 1a  00 00 00 01 2d 07 00 04  |..UCL.......-...|
$ mv key key.ucl
$ ./uclpack -d key.ucl key # 20

UCL data compression library (v1.03, Jul 20 2004).
Copyright (C) 1996-2004 Markus Franz Xaver Johannes Oberhumer
http://www.oberhumer.com/opensource/ucl/

uclpack: block-size is 262144 bytes
uclpack: decompressed 3501 into 3467 bytes
$ file key
key: rzip compressed data - version 2.1 (3381 bytes)
$ mv key key.rz
$ rzip -d key.rz # 21 (halfway done!)
$ file key
key: ARC archive data, uncompressed
$ sudo apt-get install arc
...
$ mv key key.arc
$ arc x key.arc # 22
Extracting file: key
$ file key
key: data
$ hexdump -C key | head -n 8
00000000  44 43 54 c3 00 08 29 00  00 00 00 00 00 0c de 00  |DCT...).........|
00000010  00 00 01 00 00 0c e3 00  00 00 00 14 04 00 03 6b  |...............k|
00000020  65 79 00 00 04 8d 15 52  79 01 00 04 8d 15 52 79  |ey.....Ry.....Ry|
00000030  00 0c e3 1c 5e 2d 6c 68  30 2d bf 0c 00 00 ac 0c  |....^-lh0-......|
00000040  00 00 63 11 bb 38 20 01  03 6b 65 79 cd 82 55 05  |..c..8 ..key..U.|
00000050  00 50 a4 81 07 00 51 00  00 00 00 07 00 54 bb 6d  |.P....Q......T.m|
00000060  3b 48 00 00 1f 8b 08 08  bb 6d 3b 48 00 03 6b 65  |;H.......m;H..ke|
00000070  79 00 0d d3 07 3b 15 0c  1b 00 e0 c7 28 92 91 bc  |y....;......(...|
Well, LHa doesn't seem to skip DCT headers as readily as ARJ does, so we can compare where the LHa headers start in each file and trim off the DCT header instead:
$ hexdump -C forensics200-20c6d7dee480b31b28802f2c3b951313 | head -n 1
00000000  1c 76 2d 6c 68 30 2d d0  24 00 00 bd 24 00 00 64  |.v-lh0-.$...$..d|
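Rather than counting bytes in the hexdumps by hand, grep can report the signature offsets directly (a quick sketch; the offsets below are read straight off the hexdumps above):
$ grep -abo -- '-lh0-' forensics200-20c6d7dee480b31b28802f2c3b951313 # first hit at offset 2
$ grep -abo -- '-lh0-' key # first hit at offset 53, so 53 - 2 = 51 bytes to skip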
$ dd if=key of=key.lha bs=1 skip=51 # 23
3299+0 records in
3299+0 records out
3299 bytes (3.3 kB) copied, 0.00964225 s, 342 kB/s
$ rm key
$ lha x key.lha # 24
key     - Melted   :  oo
$ file key
key: gzip compressed data, was "key", from Unix, last modified: Mon May 26 19:11:07 2008
$ mv key key.gz
$ gunzip key.gz # 25
$ file key
key: Squeezed (apple ][) data
$ mv key key.sq
$ ./sciibin/usq key.sq # 26
$ file key
key: PPMD archive data
$ mv key key.ppmd
$ ppmd d key.ppmd # 27
Fast PPMII compressor for textual data, variant I, Jun 21 2006
           key:   2324 >   2239, 8.30 bpb, used:  0.0MB, speed: 2239 KB/sec    
$ file key
key: ARJ archive data, v11, slash-switched, original name: , os: Unix
$ mv key key.arj
$ arj x key.arj # 28
ARJ32 v 3.10, Copyright (c) 1998-2004, ARJ Software Russia. [10 Dec 2007]

Processing archive: key.arj
Archive created: 2008-05-26 19:11:07, modified: 2008-05-26 19:11:07
Extracting key                         OK        
     1 file(s)
$ file key
key: rzip compressed data - version 2.1 (2042 bytes)
$ mv key key.rz
$ rzip -d key.rz # 29
$ file key
key: 7-zip archive data, version 0.2
$ mv key key.7z
$ p7zip -d key.7z # 30

7-Zip (A) 4.57  Copyright (c) 1999-2007 Igor Pavlov  2007-12-06
p7zip Version 4.57 (locale=C,Utf16=off,HugeFiles=on,4 CPUs)

Processing archive: key.7z

Extracting  key

Everything is Ok

Size:       1961
Compressed: 2042
$ file key
key: compress'd data 16 bits
$ mv key key.Z
$ uncompress key.Z # 31
$ file key
key: LHarc 1.x/ARX archive data [lh0]
$ mv key key.lha
$ lha x key.lha # 32
key     - Melted   :  o
$ file key
key: ARC archive data, packed
$ mv key key.arc
$ arc x key.arc # 33
Extracting file: key
$ file key
key: data
$ hexdump -C key | head -n 4
00000000  44 43 54 c3 00 08 29 00  00 00 00 00 00 05 a3 00  |DCT...).........|
00000010  00 00 01 00 00 05 a8 00  00 00 00 14 04 00 03 6b  |...............k|
00000020  65 79 00 00 04 3a 00 c2  0a 01 00 04 3a 00 c2 0a  |ey...:......:...|
00000030  00 05 a8 52 61 72 21 1a  07 00 cf 90 73 00 00 0d  |...Rar!.....s...|
$ dd if=key of=key.rar bs=1 skip=51 # 34
$ rm key
$ rar x key.rar # 35

RAR 3.71   Copyright (c) 1993-2007 Alexander Roshal   20 Sep 2007
Shareware version         Type RAR -? for help


Extracting from key.rar

Extracting  key                                                       OK 
All OK
$ file key
key: data
$ hexdump -C key | head -n 1
00000000  00 e9 55 43 4c ff 01 1a  00 00 00 01 2d 07 00 04  |..UCL.......-...|
$ mv key key.ucl
$ ./uclpack -d key.ucl key # 36

UCL data compression library (v1.03, Jul 20 2004).
Copyright (C) 1996-2004 Markus Franz Xaver Johannes Oberhumer
http://www.oberhumer.com/opensource/ucl/

uclpack: block-size is 262144 bytes
uclpack: decompressed 1535 into 1501 bytes
$ file key
key: CDC Codec archive data
Stumped again. Searching around for CDC, we found references to DQT, which seemed to dead-end into VMS and IBM tape backups. Finally, in some random Google hit, we saw something referring to it as "squeezed" data, so we tried our unsqueezer on it:
$ mv key key.sq
$ ./sciibin/usq key.sq # 37
$ file key
key: bzip2 compressed data, block size = 900k
$ mv key key.bz2
$ bunzip2 key.bz2 # 38
$ file key
key: Microsoft Cabinet archive data, 485 bytes, 1 file
$ mv key key.cab
$ cabextract key.cab # 39
Extracting cabinet: key.cab
  extracting key

All done, no errors.
$ file key
key: gzip compressed data, was "key", from Unix, last modified: Mon May 26 19:11:07 2008
$ mv key key.gz
$ gunzip key.gz # 40
$ file key
key: PPMD archive data
$ mv key key.ppmd
$ ppmd d key.ppmd # 41
Fast PPMII compressor for textual data, variant I, Jun 21 2006
           key:    367 >   3472, 0.85 bpb, used:  0.0MB, speed: 3472 KB/sec    
$ file key
key: ASCII text
$ cat key # 42!
#     #                                             
#     #  ####  #    #    #    #   ##   #    # #   # 
#     # #    # #    #    ##  ##  #  #  ##   #  # #  
####### #    # #    #    # ## # #    # # #  #   #   
#     # #    # # ## #    #    # ###### #  # #   #   
#     # #    # ##  ##    #    # #    # #   ##   #   
#     #  ####  #    #    #    # #    # #    #   #   
                                                    
                                                                               
 ####  #      #  ####  #    #  ####     #####   ####  ######  ####     # ##### 
#    # #      # #    # #   #  #         #    # #    # #      #         #   #   
#      #      # #      ####    ####     #    # #    # #####   ####     #   #   
#      #      # #      #  #        #    #    # #    # #           #    #   #   
#    # #      # #    # #   #  #    #    #    # #    # #      #    #    #   #   
 ####  ###### #  ####  #    #  ####     #####   ####  ######  ####     #   #   
                                                                               
                                                                  
#####   ##   #    # ######    #####  ####      ####  ###### ##### 
  #    #  #  #   #  #           #   #    #    #    # #        #   
  #   #    # ####   #####       #   #    #    #      #####    #   
  #   ###### #  #   #           #   #    #    #  ### #        #   
  #   #    # #   #  #           #   #    #    #    # #        #   
  #   #    # #    # ######      #    ####      ####  ######   #   
                                                                  
                                    
#####  ####     ##### #    # ###### 
  #   #    #      #   #    # #      
  #   #    #      #   ###### #####  
  #   #    #      #   #    # #      
  #   #    #      #   #    # #      
  #    ####       #   #    # ###### 
                                    
                                                          
 ####  ###### #    # ##### ###### #####      ####  ###### 
#    # #      ##   #   #   #      #    #    #    # #      
#      #####  # #  #   #   #####  #    #    #    # #####  
#      #      #  # #   #   #      #####     #    # #      
#    # #      #   ##   #   #      #   #     #    # #      
 ####  ###### #    #   #   ###### #    #     ####  #      
                                                          
                      
##### #    # #  ####  
  #   #    # # #      
  #   ###### #  ####  
  #   #    # #      # 
  #   #    # # #    # 
  #   #    # #  ####  
                      
                                                          
######  ####  #####  ###### #    #  ####  #  ####   ####  
#      #    # #    # #      ##   # #      # #    # #      
#####  #    # #    # #####  # #  #  ####  # #       ####  
#      #    # #####  #      #  # #      # # #           # 
#      #    # #   #  #      #   ## #    # # #    # #    # 
#       ####  #    # ###### #    #  ####  #  ####   ####  
                                                          
                                                               
 ####  #    #   ##   #      #      ###### #    #  ####  ###### 
#    # #    #  #  #  #      #      #      ##   # #    # #      
#      ###### #    # #      #      #####  # #  # #      #####  
#      #    # ###### #      #      #      #  # # #  ### #      
#    # #    # #    # #      #      #      #   ## #    # #      
 ####  #    # #    # ###### ###### ###### #    #  ####  ###### 
And so, this becomes the first question to actually follow the Jeopardy-style "Give the question to this answer" rules.
300: Are you paying attention?
$ file forensics300-75efe5f1166aee0cb57060030e5f0325
forensics300-75efe5f1166aee0cb57060030e5f0325: Zoo archive data, v2.10, modify: v2.0+, extract: v1.0+
$ sudo apt-get install zoo
...
$ mv forensics300-75efe5f1166aee0cb57060030e5f0325 forensics300-75efe5f1166aee0cb57060030e5f0325.zoo
$ zoo l forensics300-75efe5f1166aee0cb57060030e5f0325.zoo

Archive forensics300-75efe5f1166aee0cb57060030e5f0325.zoo:
Length    CF  Size Now  Date      Time
--------  --- --------  --------- --------
   38145   0%    38145  21 May 08 21:58:28-2   adrianna.jpg
...
  210471   0%   210471  21 May 08 21:58:28-2   zhang.jpg
--------  --- --------  --------- --------
 2842401   0%  2842401    34 files
------------
There is 1 deleted file.
If you didn't notice the "There is 1 deleted file" warning, this turned into a massive red herring as you examined the jpg images for hours, trying to find hidden meaning. If you did happen to see it, the answer was quick:
$ mkdir foo; cd foo
$ zoo xd ../forensics300-75efe5f1166aee0cb57060030e5f0325.zoo
Zoo:                 -- extracted
Zoo:  adrianna.jpg   -- extracted
...
Zoo:  zhang.jpg      -- extracted
$ rm *.jpg
$ mv * key
$ file key
key: Vim swap file, version 7.1
People familiar with Vim will recognize this as a Vim swap file; you can recover from it normally (a sketch follows the strings output), or just use "strings" on it:
$ strings key
b0VIM 7.1
root
qualsdev.kenshoto.com
~root/checkouts/ctf08/quals/forensics/zoomg/images/key.txt
3210#"! 
roar! what a hottie!
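For a proper recovery rather than "strings", Vim's recovery mode should work too (a sketch, untested here):
$ vim -r key
Then write the restored buffer out with ":w key.txt".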
400: NTFSWTF?
$ file forensics400-75658daf9d7f77630d71424c83ee5753
forensics400-75658daf9d7f77630d71424c83ee5753: bzip2 compressed data, block size = 900k
$ mv forensics400-75658daf9d7f77630d71424c83ee5753 forensics400-75658daf9d7f77630d71424c83ee5753.bz2
$ bunzip2 forensics400-75658daf9d7f77630d71424c83ee5753.bz2
$ file forensics400-75658daf9d7f77630d71424c83ee5753
forensics400-75658daf9d7f77630d71424c83ee5753: x86 boot sector
$ hexdump -C forensics400-75658daf9d7f77630d71424c83ee5753 | head -n 1
00000000  eb 52 90 4e 54 46 53 20  20 20 20 00 02 01 00 00  |.R.NTFS    .....|
This is an NTFS image, so we can use the Sleuth Kit's Autopsy tool to examine it.
$ sudo apt-get install autopsy
...
$ mkdir /tmp/woot
$ cp forensics400-75658daf9d7f77630d71424c83ee5753 /tmp/woot
$ autopsy -d /tmp/woot &
Then we can browse to http://localhost:9999/autopsy and open a case, open the image, select it as a partition, and start browsing the file system.
Within the File Analysis interface of Autopsy, we traverse directories similar to those in Forensics 100 and see a file, KkEYyfiIle, in c:\ken\sho\to. The timestamps and file sizes are zeroed out, which seems suspicious, and the file's MFT entry pointer of 33 shows up as invalid when clicked.
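The same exploration works from the command line with the Sleuth Kit's fls, for those avoiding the GUI (a sketch; deleted entries are flagged with "*"):
$ fls -r forensics400-75658daf9d7f77630d71424c83ee5753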
Searching for "KkEYyfiIle.txt" in the Keyword Search turns up 4 hits, none of which appear interesting until the fourth one. Experience has taught us that Kenshoto likes to base64-encode things, and the last bit of cluster 85725 looks like some base64-encoded data. Clicking the Next button shows the last half of the interesting string. (Certainly base64 now, based on the character set and trailing "=".)
Instead of copying, pasting, and removing spaces, we switch to the Data Unit view, where we enter 85725 for the cluster and 2 for the number of clusters we want. When we click View, we see the default ASCII view of the clusters. Scrolling over, we see the string we are interested in, "VEhFIEFOU1dFUiBJUzogRm8ocmVuc2ljcykgc2hlZXppZSBteSB3ZWVuaWU=", which we decode.
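For reference, the Data Unit view has a command-line equivalent in the Sleuth Kit (blkcat, named dcat in older releases), which dumps raw clusters directly (a sketch):
$ blkcat forensics400-75658daf9d7f77630d71424c83ee5753 85725 2 | strings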
$ echo 'VEhFIEFOU1dFUiBJUzogRm8ocmVuc2ljcykgc2hlZXppZSBteSB3ZWVuaWU='|openssl enc -d -base64 ; echo
THE ANSWER IS: Fo(rensics) sheezie my weenie
500: Alpha for the lose
During the Quals, 1@stPlace didn't finish Forensics 500, so it is with great thanks that we present the Forensics 500 walk-through from Brandon Enright, which uses his PNG processing tool:
The problem was this image:
PNG allows for text chunks, so I spent about 10 hours examining every byte of the PNG for strangeness. Once I could account for every byte of the original file and the role it played in the image, I was satisfied that the message had to be in the image itself or in the way the image was encoded.
I noticed early on that the alpha mask was hiding random-looking noise:
My original thought for the purpose of the random noise was that it kept the image size large. The row filters are very poor at compressing noise, and zlib is even worse. It then occurred to me that there is one row filter that would _never_ be used if each of your rows had a lot of random noise: AVG. This made me think that the row filter bytes themselves contained the message, so I constructed a matrix of them (a sketch for extracting these bytes yourself follows the matrix):
my @filters = qw/
1 2 1 4 0 0 1 1 4 4 0 1 0 4 0 1 2 4 1 2 1 4 1 4 1
1 1 1 1 1 1 1 1 4 4 1 2 2 1 2 1 2 2 4 4 1 4 2 1 4
1 2 2 4 1 4 4 1 2 4 4 4 1 4 4 2 4 2 1 4 2 4 4 2 4
2 2 4 4 1 4 1 1 1 2 2 4 2 2 4 4 2 4 2 4 1 2 4 1 2
4 4 1 4 0 2 4 1 1 1 4 4 4 1 4 4 4 2 4 2 1 2 1 2 0
1 2 2 2 4 4 2 1 4 2 4 4 4 2 2 4 1 2 2 2 2 4 2 2 2
2 2 2 2 2 4 4 2 2 2 4 2 2 2 2 2 2 2 1 1 4 2 1 2 1
4 1 1 4 1 1 4 4 2 4 1 2 4 0 1 2 1 1 1 1 2 1 0 1 1
4 4 1 1 1 1 4 1 1 4 1 2 2 2 4 2 1 2 2 4 2 4 2 4 2
2 2 2 2 4 4 4 2 2 4 4 4 2 4 2 2 2 4 4 4 4 4 4 2 4
4 4 2 2 4 1 4 2 2 4 4 4 4 4 4 4 4 2 4 4 4 2 2 4 2
2 4 2 2 4 2 4 2 2 4 2 4 4 4 4 4 4 2 4 4 4 4 4 4 4
4 4 2 2 4 4 4 4 2 2 2 4 2 2 2 2 2 4 4 2 4 2 2 2 4
2 2 4 2 4 2 2 4 2 2 2 4 4 1 1 4 4 1 4 1 1 4 4 4 4
2 2 2 2 1 4 4 1 2 4 2 4 2 1 1 4 2 0 2 0 1 0 4 0 0
1 0 4 2 1 1 2 1 1 4 4 4 1 2 2 2 1 2 1 1 1 4 1 1 4
4 1 1 1 1 1 2 2 4 1 4 2 4 0 1 1 2 1 4 1 1 4 1 4 1
1 4 1 4 4 1 2 2 2 2 2 2 2 4 2 1 4 2 1 2 2 2 2 4 2
2 1 4 2 2 2 2 4 1 4 1 1 1 4 4 4 4 4 4 4 4 2 2 0 4
2 2 4 4 0 4 4 0 2 4 2 4 2 4 1 2 1 2 2 1 2 4 1 1 2
2 2 0 1 4 4 4 2 4 2 4 2 2 2 4 2 1 2 2 2 1 4 2 4 2
4 2 4 2 4 2 1 4 1 2 2 1 4 2 4 2 0 2 2 0 0 4 1 0 2
0 2 2 0 0 1 0 2 1 2 
/;
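For the curious, the filter bytes can be recovered without special tooling. A minimal sketch, assuming an 8-bit non-interlaced RGBA PNG and a hypothetical filename; every decompressed scanline is prefixed with its filter byte:
#!/usr/bin/perl
use strict;
use warnings;
use Compress::Zlib;

open my $fh, '<', 'f500.png' or die $!;          # hypothetical filename
binmode $fh;
my $png = do { local $/; <$fh> };

my ($width) = unpack 'N', substr($png, 16, 4);   # width lives in IHDR
my $idat = '';
my $pos = 8;                                     # skip the 8-byte PNG signature
while ($pos < length $png) {
    my ($len, $type) = unpack 'N a4', substr($png, $pos, 8);
    $idat .= substr($png, $pos + 8, $len) if $type eq 'IDAT';
    $pos += 12 + $len;                           # length + type + data + CRC
}

my $raw = uncompress($idat);                     # IDAT payloads form one zlib stream
my $stride = 1 + $width * 4;                     # filter byte + RGBA pixels (assumed)
for (my $i = 0; $i < length $raw; $i += $stride) {
    print ord(substr $raw, $i, 1), ' ';
}
print "\n";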
I tried a shitload of different operations on this matrix including treating each filter as a 2-bit number and trying to construct ASCII that way.
I stepped back from this line of reasoning and realized that the row filters matched very nicely with the thickness of the green of each Kenshoto character in the image. I nulled out all the row filters and got this image:
I noticed two things after doing this: the row filters did not contain a message, and the color saturation of the random noise in the original image had been bumped up.
This led me to look into the random noise itself.
The first thing I did was make a new image that skipped all the alpha != 0 places:
I then turned the random data into a byte stream. I tried every conceivable operation on this byte stream. One operation (XOR chain) was slightly more interesting than the rest:
XOR made the image look _more_ random because it destroyed the saturation boost.
This made me decide to run some statistics on the data. I was operating on a per-channel basis and found nothing, so I moved to an inter-channel basis. Most of my statistics showed no correlation between channels:
This made me decide that if there was information in the noise, it would be pixel-based. I computed the expected pixel collision count (birthday paradox) to be 418.5 duplicate pixels. The actual count was 816 duplicates.
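(For n random pixels drawn uniformly from m possible values, the expected number of duplicates is n - m*(1 - (1 - 1/m)^n); a figure like 418.5 falls out once you plug in the noise-pixel count and the size of the color space.)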
At first I was ready to explain this increase in duplicates as the result of the saturation boost in the noise. I decided to plot the duplicates anyway:
One pixel was repeated 20 times. This is statistically nearly impossible. I wrote a program to highlight just those pixels and produced this image:
My first thought for the spacing was Morse code, but that led nowhere. I then decided to look at the spacing numerically:
Y coord: 237; diff: 237; char: -
Y coord: 253; diff: 16 ; char: P
Y coord: 268; diff: 15 ; char: O
Y coord: 284; diff: 16 ; char: P
Y coord: 306; diff: 22 ; char: V
Y coord: 326; diff: 20 ; char: T
Y coord: 328; diff: 2  ; char: B
Y coord: 343; diff: 15 ; char: O
Y coord: 355; diff: 12 ; char: L
Y coord: 369; diff: 14 ; char: N
Y coord: 395; diff: 26 ; char: Z
Y coord: 404; diff: 9  ; char: I
Y coord: 414; diff: 10 ; char: J
Y coord: 434; diff: 20 ; char: T
Y coord: 455; diff: 21 ; char: U
Y coord: 471; diff: 16 ; char: P
Y coord: 479; diff: 8  ; char: H
Y coord: 498; diff: 19 ; char: S
Y coord: 500; diff: 2  ; char: B
Y coord: 514; diff: 14 ; char: N
The diff is the spacing between marked pixels. I noticed the diffs were all in the range 1-26, so I decided to map them to A-Z as chars.
That didn't produce anything too interesting.
I decided to try ROT-N on the output, though, and found that ROT-25 produced this:
ONOUSANKMYHISTOGRAM
Problem solved.
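For reference, ROT-25 is just a shift back by one letter, so the decode is a one-liner on the chars from the table above (minus the leading "-"):
$ echo POPVTBOLNZIJTUPHSBN | tr 'A-Z' 'ZA-Y'
ONOUSANKMYHISTOGRAM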
A post-quals realization is that if "A" is 1, then counting the number of pixels between each marked pixel gets you immediately to the key phrase without the ROTation.
