CS 760 Machine Learning Final Exam

This is the final exam for the CS 760 Machine Learning course at the University of Wisconsin-Madison, Spring 2017. The exam consists of four problems covering decision trees and instance-based learning, neural networks, short conceptual descriptions, and support vector machines. Students have one hour and 15 minutes to complete the exam and may use one set of handwritten notes and a calculator. The instructions specify that students must write their answers in the space provided, show calculations legibly, and write all final answers below the questions; the backs of the sheets are for scratch work only. The exam is worth a total of 100 points.


University of Wisconsin Madison

Department of Computer Sciences

CS 760 Machine Learning

Spring 2017

Final Examination

Duration: 1 hour 15 minutes

One set of handwritten notes and calculator allowed.

Instructions:
· Write your answers in the space provided.
· Show your calculations LEGIBLY.
· If you feel that a question is not fully specified, state any assumptions you need to make in order to solve the problem.
· Use the backs of the sheets for scratch work ONLY.
· Write all the final answers BELOW the questions. Answers written on the scratch sheets will NOT be considered.

Name:

UW ID:

Problem   Score   Max Score
1                 20
2                 30
3                 20
4                 30
Total             100

Problem 1: Decision trees and instance-based learning (20 points)

  1. Which of the following statements are true for BOTH decision trees and Naive Bayes classifiers (you may choose more than one statement)? Explain. (4 points)

     a) In both classifiers, a pair of features is assumed to be independent.
     b) In both classifiers, a pair of features is assumed to be dependent.
     c) In both classifiers, a pair of features is assumed to be independent given the class label.
     d) In both classifiers, a pair of features is assumed to be dependent given the class label.
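For reference, the Naive Bayes classifier assumes that the features are conditionally independent given the class label: for features $x_1, \dots, x_n$ and class label $y$,

$$P(x_1, \ldots, x_n \mid y) = \prod_{i=1}^{n} P(x_i \mid y).$$

A decision tree, by contrast, makes no explicit independence assumption about pairs of features.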
  2. Consider the following training set in 2-dimensional Euclidean space: (6 points)

 x    y   Class
-1    1   -
 0    1   +
 0    2   -
 1   -1   -
 1    0   +
 1    2   +
 2    2   -
 2    3   +

a) What is the prediction of a 3-nearest-neighbor classifier at the point (1, 1)?

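To sanity-check part a), here is a minimal Python sketch of a k-nearest-neighbor prediction under the usual assumptions (Euclidean distance, unweighted majority vote); the names and structure are illustrative, not part of the exam.

```python
import math
from collections import Counter

# Training set from question 2: ((x, y), class label).
train = [((-1, 1), '-'), ((0, 1), '+'), ((0, 2), '-'), ((1, -1), '-'),
         ((1, 0), '+'), ((1, 2), '+'), ((2, 2), '-'), ((2, 3), '+')]

def knn_predict(query, data, k=3):
    # Sort the training points by Euclidean distance to the query point.
    by_dist = sorted(data, key=lambda p: math.dist(query, p[0]))
    # Unweighted majority vote over the labels of the k nearest points.
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# The three nearest neighbors of (1, 1) are (0, 1), (1, 0), and (1, 2),
# all at distance 1, so the vote is over those three labels.
print(knn_predict((1, 1), train))
```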

Problem 2: Neural Networks (30 points)

a) State whether the following statements are true or false and explain why. (12 points)

i) A Perceptron can learn to correctly classify the following data, where each example consists of three binary input values and a binary classification value: (111, 1), (110, 1), (011, 1), (010, 0), (000, 0).

ii) The Perceptron Learning Rule is a sound and complete method for a Perceptron to learn to correctly classify any two-class problem.

iii) Training neural networks has the potential problem of overfitting the training data.

Assume the initial weights to be 0 and the learning rate to be 1.0. (6 points)
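Below is a minimal Python sketch of the Perceptron Learning Rule under the stated settings (all-zero initial weights, learning rate 1.0), applied to the data from part a)(i). The 0/1 step activation with threshold at 0 and the bias folded in as a constant +1 input are assumed conventions, since the preview does not specify them.

```python
# Data from part a)(i): three binary inputs and a binary target.
data = [((1, 1, 1), 1), ((1, 1, 0), 1), ((0, 1, 1), 1),
        ((0, 1, 0), 0), ((0, 0, 0), 0)]

w = [0.0, 0.0, 0.0, 0.0]  # three input weights plus a bias weight, all 0
lr = 1.0                  # learning rate given in the question

for epoch in range(20):                     # cap epochs in case the data were not separable
    errors = 0
    for x, target in data:
        xb = list(x) + [1]                  # append the constant bias input
        out = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else 0
        if out != target:                   # Perceptron rule: update only on mistakes
            w = [wi + lr * (target - out) * xi for wi, xi in zip(w, xb)]
            errors += 1
    if errors == 0:                         # a full pass with no mistakes: converged
        break

print(w, "converged" if errors == 0 else "not converged")
```

If the data is linearly separable, the perceptron convergence theorem guarantees the rule stops making mistakes after finitely many updates; the epoch cap is only a guard against the non-separable case, which is exactly what statement i) turns on.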

Problem 3 (20 points)

Briefly describe the following:

i) Pruning a decision tree

ii) Autoencoders

iii) Bagging (see the sketch after this list)

iv) Regularization
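For item iii), here is a self-contained Python sketch of bagging (bootstrap aggregating). The toy 1-D dataset and the decision-stump base learner are hypothetical choices made purely for illustration.

```python
import random
from collections import Counter

# Hypothetical toy training set: (feature, label) pairs.
data = [(1, '-'), (2, '-'), (3, '+'), (4, '+'), (5, '+')]

def train_stump(sample):
    # Base learner: a one-threshold decision stump chosen to minimize
    # training error on the given (bootstrap) sample.
    best = None
    for t in [x for x, _ in sample]:
        for lo, hi in (('-', '+'), ('+', '-')):
            errs = sum(lab != (lo if x <= t else hi) for x, lab in sample)
            if best is None or errs < best[0]:
                best = (errs, t, lo, hi)
    _, t, lo, hi = best
    return lambda x: lo if x <= t else hi

def bagging(data, n_models=25, seed=0):
    rng = random.Random(seed)
    # Bagging step 1: train each model on a bootstrap sample, i.e. n points
    # drawn with replacement, so the models see different data and decorrelate.
    models = [train_stump(rng.choices(data, k=len(data))) for _ in range(n_models)]
    # Bagging step 2: predict by majority vote over the ensemble.
    return lambda x: Counter(m(x) for m in models).most_common(1)[0][0]

predict = bagging(data)
print(predict(2.5))
```

Averaging many models trained on resampled data reduces variance relative to a single model, which is the usual one-line justification for bagging.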

Problem 4: Support Vector Machines (30 points)

  1. What are the advantages/disadvantages of a non-linear SVM? Give examples to justify your reasoning.
  2. What is a kernel function? Why do we need it?
  3. Given the following data samples (squares and triangles denote the two classes; the figure is not included in this preview), which one(s) of the following kernels can we use in SVM to separate the two classes?

a) Linear kernel
b) Polynomial kernel
c) Gaussian RBF (radial basis function) kernel
d) None of the above
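As a reference for questions 2 and 3, here is a short Python sketch of the three kernels named above, evaluated directly on raw feature vectors; the parameter values (degree, c, gamma) are illustrative assumptions.

```python
import math

def linear(u, v):
    # Linear kernel: the ordinary dot product.
    return sum(ui * vi for ui, vi in zip(u, v))

def polynomial(u, v, degree=2, c=1.0):
    # Polynomial kernel: dot product shifted by c, raised to a fixed degree.
    return (linear(u, v) + c) ** degree

def rbf(u, v, gamma=0.5):
    # Gaussian RBF kernel: depends only on the distance between u and v;
    # its implicit feature space is infinite-dimensional.
    sq_dist = sum((ui - vi) ** 2 for ui, vi in zip(u, v))
    return math.exp(-gamma * sq_dist)

u, v = (1.0, 2.0), (2.0, 0.0)
print(linear(u, v), polynomial(u, v), rbf(u, v))
```

Each of these computes an inner product in some feature space without constructing that space explicitly, which is the point of question 2.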

  4. How does the margin ρ relate to the weight vector w? Express the relation using a formula.
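For reference, in the standard hard-margin formulation, where the canonical constraints are $y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1$ for all training points, the two margin hyperplanes are $\mathbf{w}^\top \mathbf{x} + b = \pm 1$, and the distance between them is

$$\rho = \frac{2}{\lVert \mathbf{w} \rVert}.$$

This is why maximizing the margin is equivalent to minimizing $\lVert \mathbf{w} \rVert^2$ subject to the constraints.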