Lab 3: Getting Started with Simulations
Fire up the BasicProp simulator again.
Part 1: Learning AND in a 2-1 Network Architecture
- Use "Network→Configure Network" to bring up the configuration screen
- Configure the network to have only 2 layers: 2 input units (bottom) and 1 output unit (top).
- Use "Patterns→Load Patterns" to load in "AND.pat"
- Examine the patterns using the check box in the control panel. Make sure you know what function is being learned.
- Train the network. This should be easy.
- Note the colors of the weights as they are drawn. Use "Weights→Show Weights" to see the weights as a Hinton diagram.
- Use the "Test one" button in the control panel to verify that the network has learned the function properly. Test all 4 patterns by first selecting each pattern in the check box and then hitting "Test one".
- Reset the network and repeat a few times. Use the "Batch update" mode too: how does that differ?
- Use "Patterns→Show Patterns and Outputs" to examine the patterns and the solution you obtained. Note inputs may range from -1 to 1, but hidden units and outputs are restricted to the range (0,1). Why?
Part 2: Learning OR and Attempting XOR
- Load OR.pat. Familiarize yourself with the patterns, using either the check box in the control panel or "Patterns→Show Patterns".
- Verify that this is no harder to learn than AND.
- Select "Utilities→3D Plotting", leave the choices of X, Y and Z as they are, and examine the function in yet another way.
- Load XOR.pat. Examine this function too.
- Verify that the 2-1 network cannot learn XOR. Try hard!
- Add a hidden layer with two hidden units. Try again.
- Experiment by training with and without bias nodes, with and without batch learning, etc.
- Try learning with more, or fewer, hidden units. Add an extra hidden layer. What difference does that make?
- Examine your weights once the network has learned XOR. Repeat several times. How many qualitatively different solutions can you find? (See the sketch after this list.)
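
As above, the following is a hedged Python sketch rather than BasicProp code: a 2-2-1 network (two logistic hidden units, one logistic output) trained on XOR by backpropagation, with the same assumed -1/1 inputs and 0/1 targets. Retraining from different random seeds plays the role of "Reset" in the simulator; inspecting W1 and W2 across runs is one way to see that qualitatively different weight solutions exist (and that not every random start is guaranteed to succeed).

```python
# Hypothetical 2-2-1 sketch: XOR needs a hidden layer because it is not linearly separable.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)  # assumed -1/1 coding
t = np.array([0, 1, 1, 0], dtype=float)                          # XOR targets

def train_xor(seed, epochs=10000, lr=0.5):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=1.0, size=(2, 2))   # input -> hidden weights
    b1 = np.zeros(2)                          # hidden biases
    W2 = rng.normal(scale=1.0, size=2)        # hidden -> output weights
    b2 = 0.0                                  # output bias
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)              # hidden activations, in (0, 1)
        y = sigmoid(h @ W2 + b2)              # output activations, in (0, 1)
        dy = (y - t) * y * (1 - y)            # backprop through the output sigmoid
        dh = np.outer(dy, W2) * h * (1 - h)   # backprop through the hidden sigmoids
        W2 -= lr * (h.T @ dy)
        b2 -= lr * dy.sum()
        W1 -= lr * (X.T @ dh)
        b1 -= lr * dh.sum(axis=0)
    return W1, W2, np.round(y, 2)

# Different seeds can land in qualitatively different solutions
# (e.g. hidden units resembling OR and NAND detectors, or other arrangements).
for seed in range(3):
    W1, W2, outputs = train_xor(seed)
    print(seed, outputs)
    print(W1, W2)
```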