- Construct a decision tree with good prediction accuracy for the iris dataset. (This is a motivational example and a bit vague on purpose, to make you think. The dataset is small and easy; construct the decision tree by hand.) [Due: Thu 7 March 2019]
- Find a good classifier for the "dataVeryHard.arff" dataset. The best classification is the XOR of 5 attributes and achieves a classification error around 5%. This is a computationally hard problem and may require several hours of computing time. Send your code that finds the 5 relevant attributes to auer@unileoben.ac.at, using "ML Assignment 2, MatrNr " as subject line. [Due: Wed 27 March 2019]
- Answer the questions in the Predictive Maintenance Exercise. Send the answers and calculations for questions 3 and 4 to auer@unileoben.ac.at, using "ML Assignment 3, MatrNr " as subject line. [Due: Wed 27 March 2019]
- Find a good SVM classifier for the handwritten digit data. Describe how you found the classifier and give an estimate for its accuracy. Argue why your estimate is likely to be correct. Send your classifier and the estimated accuracy together with the corresponding argument to auer@unileoben.ac.at, using "ML Assignment 4, MatrNr " as subject line. [Due: Tue 21 May 2019]
- Find a good MLP classifier for the handwritten digit data. Describe how you found the classifier and give an estimate for its accuracy. Argue why your estimate is likely to be correct. Send your classifier and the estimated accuracy together with the corresponding argument to auer@unileoben.ac.at, using "ML Assignment 5, MatrNr " as subject line. [Due: Tue 21 May 2019]
- Find a good CNN classifier for the handwritten digit data. Describe how you found the classifier and give an estimate for its accuracy. Argue why your estimate is likely to be correct. Send your classifier and the estimated accuracy together with the corresponding argument to auer@unileoben.ac.at, using "ML Assignment 6, MatrNr " as subject line. [Due: Tue 4 June 2019]
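For the parity task in the second assignment, one straightforward (if slow) way to find the relevant attributes is a brute-force search over all 5-attribute subsets. A sketch, assuming the .arff data has already been loaded into a 0/1 matrix `X` (samples by attributes) and a 0/1 label vector `y` (both names are placeholders, not part of the assignment material):

```matlab
% Brute-force search for the 5 attributes whose XOR best matches the label.
% Assumes X is an n-by-d 0/1 matrix and y an n-by-1 0/1 label vector.
% With many attributes, nchoosek(d,5) subsets explain the long running time.
d = size(X, 2);
bestErr = inf;
bestSubset = [];
for S = nchoosek(1:d, 5)'               % each column S is one 5-attribute subset
    p = mod(sum(X(:, S), 2), 2);        % XOR = parity of the selected attributes
    err = min(mean(p ~= y), mean(p == y));  % parity or its complement may fit
    if err < bestErr
        bestErr = err;
        bestSubset = S';
    end
end
fprintf('best subset: %s, error: %.3f\n', mat2str(bestSubset), bestErr);
```

A subset whose error is close to 5% is a candidate for the 5 relevant attributes; the remaining error is the label noise mentioned above.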

Data for learning disjunctions and parity with decision trees:

Handwritten digit data:

- alldigits.mat
- showImage.m: MATLAB function to visualize the handwritten digits.
- Digit data in MATLAB image format: DLdata.mat

`patternnet` generates a network with suitable default parameters.
The number of hidden units and the training function (for example `traingd`, `traingda`, `traingdx`, `trainscg`) can be chosen.
The network can be trained using the function `train`, providing the inputs and the target outputs.
The split into train, validation, and test data can be controlled by the parameter `net.divideParam`.
Often it is also useful to modify some other parameters of `net.trainParam`.
After training, the output of the network for a given input can be calculated using the function `sim`.
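The steps above can be sketched as follows; the hidden-layer size, training function, split ratios, and epoch count are illustrative choices, and `X`/`T` stand for the digit inputs and one-hot targets you load yourself:

```matlab
% Sketch of the patternnet workflow, assuming X (features-by-samples) and
% T (one-hot targets-by-samples) are already loaded. Parameters are examples.
net = patternnet(20, 'trainscg');    % 20 hidden units, scaled conjugate gradient

% Control the train/validation/test split via net.divideParam.
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

% Other training parameters live in net.trainParam.
net.trainParam.epochs = 500;

% Train on inputs X and targets T, then compute outputs with sim.
net = train(net, X, T);
Y = sim(net, X);                     % network outputs for the inputs X
[~, predicted] = max(Y, [], 1);      % predicted class index per column
```

The validation split is what `train` uses for early stopping, so the ratios in `net.divideParam` directly affect the accuracy estimate you report.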
A network is trained with `trainNetwork(X,Y,layers,options)`.
In `layers` you define the list of layers you want to use in your network.
See the corresponding documentation about how to specify layers.
The `options` variable is best defined using the `trainingOptions` function.
You need to specify the `solverName` (the variant of gradient descent you want to use), and you can define several other options; see the documentation.
An important one is `ValidationData` (the cell array variant is the easiest one to use).
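A minimal sketch of this workflow for 28x28 grayscale digit images; the layer sizes and options are illustrative, and `Xtrain`, `Ytrain`, `Xval`, `Yval`, `Xtest` are placeholder names for data you load yourself:

```matlab
% Sketch of the trainNetwork workflow; layer sizes and options are examples,
% not tuned values. Xtrain is assumed 28x28x1xN, Ytrain a categorical vector.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(5, 16)         % 16 filters of size 5x5
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(10)           % one output per digit class
    softmaxLayer
    classificationLayer];

% First argument of trainingOptions is the solverName, here 'sgdm';
% ValidationData as a cell array {inputs, labels} is the easiest variant.
options = trainingOptions('sgdm', ...
    'MaxEpochs', 10, ...
    'ValidationData', {Xval, Yval});

net = trainNetwork(Xtrain, Ytrain, layers, options);
Ypred = classify(net, Xtest);         % predicted labels for the test images
```

Keeping `Xtest` out of both training and validation is what makes the resulting accuracy estimate credible, which is exactly the argument the assignment asks for.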