Lab 4 - Midterm Solutions and Using SNNS

This lab is worth 10 points.
Due: Friday February 8, 2008 at 5pm

Part 1: Midterm Solutions

The midterms will be returned at the start of the lab. The solutions will be discussed.

Part 2: Using SNNS

The Stuttgart Neural Network Simulator (SNNS) is a neural network toolkit provided by the University of Stuttgart and the University of Tübingen. It provides both a simple graphical user interface (an X-Windows application) and a command-line interface. Since the lab machines in 414 do not have X-Windows capabilities, we will be using the command-line interface.

SNNS has been installed on Helios in /ai/snns. The executables are located in /ai/snns/bin, and a PDF of the user manual is located in /ai/snns/doc. Because the lab machines do not have X-Windows capabilities, you cannot view the manual directly on Helios; download the PDF to your lab machine and view it there. A large set of examples is provided in /ai/snns/examples.

There are four primary types of example files: the network definition (.net), the training data (.pat), the configuration for the graphical tool (.cfg), and the script for the command-line tool (.bat). We will be looking at the encoder example in particular for this lab.

The encoder neural network is what is known as an 8-3-8 network because it has 8 input nodes, 3 hidden nodes, and 8 output nodes. The expected input is an 8-bit binary string with exactly one 1 bit and seven 0 bits, and the expected output is identical to the input. The challenge is that the network must squeeze this information through only three hidden nodes, so it cannot simply do a 1-to-1 mapping from input to output; it has to learn a compact encoding of which input bit is on (three hidden units are just enough, since 2^3 = 8). The following image summarizes the 8-3-8 problem:

[Figure: visualization of the 8-3-8 encoder network]
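
Concretely, each training pattern pairs a one-hot input with an identical target output. Sketched here in the style of an SNNS .pat file (the pattern number and exact formatting are illustrative, not copied from the actual encoder.pat):

# Input pattern 3:
0 0 1 0 0 0 0 0
# Output pattern 3:
0 0 1 0 0 0 0 0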

To test this network, log into Helios and create a subdirectory for this lab. Copy the encoder example files from /ai/snns/examples to your subdirectory; at a minimum you will need the network, pattern, and batch script files for the encoder (encoder.net, encoder.pat, and encoder.bat).
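
The copy step might look like the following, assuming a subdirectory named lab4 in your home directory and the encoder files sitting directly in /ai/snns/examples (both of these are assumptions; adjust the paths to match what you find there):

mkdir ~/lab4
cd ~/lab4
cp /ai/snns/examples/encoder.net /ai/snns/examples/encoder.pat /ai/snns/examples/encoder.bat .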

Run the command line version of SNNS with the following command:
/ai/snns/bin/batchman -f encoder.bat
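It can help to open encoder.bat and read what it does. A rough sketch of a typical batchman training script is shown below; the stopping threshold, the initialization parameters, and the exact saveResult arguments are assumptions based on the batchman chapter of the user manual, not a copy of the provided script:

loadNet("encoder.net")                           # network definition
loadPattern("encoder.pat")                       # training patterns
setInitFunc("Randomize_Weights", 1.0, -1.0)      # assumed initialization
initNet()
while SSE > 0.5 and CYCLES < 1000 do             # assumed stopping criterion
   trainNet()                                    # one pass over all 8 patterns
endwhile
print("cycles = ", CYCLES, "  SSE = ", SSE)
saveResult("encoder.res", 1, PAT, TRUE, TRUE, "create")
saveNet("encoder.trained.net")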
It should take about 10-20 training cycles before training is done. The results are written to a file called encoder.res. This file contains one 3-line entry per training pattern, showing the training data, the output pattern (which should match the training data), and the error for each output bit. Since we have 8 training patterns, there will be 8 such entries in the file.

The trained network is written to encoder.trained.net. The learned information is stored in the "Unit definition section" and the "Connection definition section". The weights are listed in the "Connection definition section" in edge-list format: the target column gives the destination node, and the source column contains a list of source:weight pairs. For example, to find the weight of the edge from node i to node j, first find j in the target column, then look for i:weight in the source column of j's row.
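
As an illustration, a row of the "Connection definition section" might look roughly like the sketch below (the column layout is recalled from the .net file format, and the unit numbers and weights are made up, so treat it as a guide rather than the exact contents of encoder.trained.net). Here the edge from unit 3 to unit 9 has weight 0.75, and the edge from unit 4 to unit 9 has weight -1.20:

target | site | source:weight
-------|------|----------------------------
     9 |      | 3: 0.75000, 4:-1.20000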

Lab Writeup

Write a paragraph describing what you did to run SNNS on the encoder neural network and what you saw in the results. Include the summed squared error (SSE) and the number of cycles (CYCLES) that batchman reported when you ran it.