2017-10-30

Practice Midterm 2 Solutions.

Post the solutions here.

-- Practice Midterm 2 Solutions
Avinash, Vasudha ((resource:WhatsApp Image 2017-10-30 at 16.54.51.jpeg|Resource Description for WhatsApp Image 2017-10-30 at 16.54.51.jpeg))

-- Practice Midterm 2 Solutions
((resource:mid-term-Q7.jpg|Resource Description for mid-term-Q7.jpg))

-- Practice Midterm 2 Solutions
Q7. by Kushal, Nishant and Abhinav

-- Practice Midterm 2 Solutions
Q.9.
def leakyReluLayer(weights, inputs, biases, alpha):
    # Affine transform, then leaky ReLU: z when z >= 0, alpha * z when z < 0
    z = tf.matmul(weights, inputs) + biases
    zeros = tf.constant(0, tf.float32)
    result = tf.maximum(zeros, z) + alpha * tf.minimum(zeros, z)
    return result
- Anish & Thinh
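The max/min identity used above can be checked in plain numpy (a minimal sketch; the array values are just illustrative):

```python
import numpy as np

def leaky_relu(z, alpha):
    # max(0, z) + alpha * min(0, z): passes z through unchanged when
    # z >= 0, and scales it by alpha when z < 0
    return np.maximum(0.0, z) + alpha * np.minimum(0.0, z)

z = np.array([-2.0, 0.0, 3.0])
print(leaky_relu(z, 0.1))  # [-0.2  0.   3. ]
```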

-- Practice Midterm 2 Solutions
Q6 by Jason Chee and Yeshwanth Balachander ((resource:IMG_7099.JPG|Resource Description for IMG_7099.JPG))

-- Practice Midterm 2 Solutions
2) Charles MacKay & Nick Mauthes
Cross-validation: a technique used to estimate the error of a trained model when the dataset is not large. A model might perform badly on one test set but well on another, so we divide the dataset into subsets and aggregate the results from each, e.g. by taking an average. This gives a better estimate of the model's overall accuracy: if the average accuracy across subsets is still high, we can claim the model generalizes well.
Exhaustive methods consider all possible ways to divide the dataset into partitions of size p. Example: leave-p-out, which cycles over every size-p subset, trains on the data excluding that subset, and tests on it. There are n choose p such subsets.
Inexhaustive methods do not consider all possible subsets. Example: k-fold, which divides the dataset into k equal subsets S1, ..., Sk; for i = 1 to k, train on all but Si, test on Si, and aggregate the results.
Repeated random sub-sampling: repeat m times, each time randomly choosing a subset of size p as the test set, training on all other data, and testing on that subset.
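The k-fold procedure described above can be sketched in plain numpy (the function name and the evaluate callback are illustrative, not from the original post):

```python
import numpy as np

def k_fold_cv(n_samples, k, evaluate, seed=0):
    # Shuffle the indices once, then split them into k roughly equal folds.
    rng = np.random.RandomState(seed)
    folds = np.array_split(rng.permutation(n_samples), k)
    scores = []
    for i in range(k):
        test_idx = folds[i]                                    # test on S_i
        train_idx = np.concatenate(folds[:i] + folds[i + 1:])  # train on all but S_i
        scores.append(evaluate(train_idx, test_idx))
    return np.mean(scores)  # aggregate by averaging the k results
```

Here `evaluate(train_idx, test_idx)` would train the model on the training indices and return its accuracy on the held-out fold, e.g. `k_fold_cv(len(data), 5, my_evaluate)`.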

-- Practice Midterm 2 Solutions
Question 1 Amer Rez, Shashank Iyer, Venkat Raja Iyer
from PIL import Image
import numpy as np
img = Image.open('bob.png')
img = img.convert('L')  # convert to 8-bit grayscale
nparray = np.frombuffer(img.tobytes(), dtype=np.uint8)  # np.fromstring is deprecated
nparray = nparray.reshape(img.size[1], img.size[0])  # img.size is (width, height)
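A quick numpy-only check of the reshape step (the raw bytes here stand in for `img.tobytes()`; note that PIL reports `img.size` as (width, height), so the reshape order is reversed):

```python
import numpy as np

width, height = 4, 3                # PIL: img.size == (width, height)
raw = bytes(range(width * height))  # stand-in for img.tobytes()
arr = np.frombuffer(raw, dtype=np.uint8).reshape(height, width)
print(arr.shape)  # (3, 4): rows of the array are rows of the image
```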

-- Practice Midterm 2 Solutions
Question 10:
Team Members: Ujjwal Garg, Gaurav Gupta
((resource:Question 10 -Part1.jpeg|Resource Description for Question 10 -Part1.jpeg)) ((resource:Question 10 - Part 2.jpeg|Resource Description for Question 10 - Part 2.jpeg))

-- Practice Midterm 2 Solutions
Team Members: Ishan, Vyas, Aarish
import tensorflow as tf
def perceptron(weights, inputs, biases, activation):
    nodes = tf.matmul(weights, inputs) + biases
    return activation(nodes)
x = tf.placeholder(tf.float32, shape=(1,3))
W = tf.Variable([[0],[1],[0]], dtype=tf.float32)
b = tf.Variable(0.3, dtype=tf.float32)
activation = tf.sigmoid
my_layer = perceptron(W, x, b, activation)
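For comparison, the same layer in plain numpy, assuming the conventional inputs-times-weights orientation (x @ W), so a (1, 3) input row and a (3, 1) weight column give a single output; this orientation is an assumption, not taken from the post above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(weights, inputs, biases):
    # (1, 3) inputs @ (3, 1) weights -> (1, 1) pre-activation, plus bias
    return sigmoid(inputs @ weights + biases)

x = np.array([[0.5, -1.0, 2.0]])
W = np.array([[0.0], [1.0], [0.0]])  # selects the second input component
b = 0.3
out = perceptron(W, x, b)  # sigmoid(-1.0 + 0.3)
```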