2017-09-07

HW01: Winnow Update Rule.

In class, the perceptron update rule was defined as:
  • On a correct prediction, do nothing.
  • On a false positive prediction, set `vec{w} := vec{w} - vec{x}` and set `theta := theta + 1`.
  • On a false negative prediction, set `vec{w} := vec{w} + vec{x}` and set `theta := theta - 1`.
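
For reference, here is a minimal Python sketch of that rule (the prediction convention `y_hat = 1` iff `vec{w} . vec{x} >= theta` is my assumption; the names are illustrative, not from the assignment):

```python
import numpy as np

def perceptron_update(w, theta, x, y):
    """One perceptron step; y is the true label in {0, 1}."""
    y_hat = 1 if np.dot(w, x) >= theta else 0
    if y_hat == 1 and y == 0:    # false positive
        w, theta = w - x, theta + 1
    elif y_hat == 0 and y == 1:  # false negative
        w, theta = w + x, theta - 1
    return w, theta              # unchanged on a correct prediction
```
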
In contrast, for the Winnow update rule, it says:
  • On a correct prediction, do nothing.
  • On a false positive prediction, for all `i`, set `w_i := alpha^{-x_i}w_i`.
  • On a false negative prediction, for all `i`, set `w_i := alpha^{x_i}w_i`.
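
Continuing the previous sketch, the Winnow step exactly as written above (same conventions) would be:

```python
def winnow_update(w, theta, x, y, alpha):
    """One Winnow step, per the rule as stated."""
    y_hat = 1 if np.dot(w, x) >= theta else 0
    if y_hat == 1 and y == 0:    # false positive: w_i := alpha^{-x_i} w_i
        w = w * alpha ** (-x)
    elif y_hat == 0 and y == 1:  # false negative: w_i := alpha^{x_i} w_i
        w = w * alpha ** x
    return w                     # theta left as-is, per the rule as stated
```
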
My assumption would have been that `theta` gets updated the same way for both the Winnow and perceptron update rules. Is that assumption incorrect?
(Edited: 2017-09-07)

-- HW01: Winnow Update Rule
For this update rule, both `alpha` and `theta` are hyperparameters; i.e., you set and forget them.
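
In other words, a Winnow training loop fixes both values up front and only ever rescales the weights. A minimal sketch (the all-ones initial weights, `alpha = 2.0`, and `theta = n/2` are common illustrative defaults, not necessarily HW01's choices):

```python
import numpy as np

def train_winnow(X, y, alpha=2.0, theta=None, epochs=10):
    """Winnow over 0/1 feature rows X with labels y in {0, 1}.
    alpha and theta are hyperparameters: set once, never updated."""
    n = X.shape[1]
    theta = n / 2 if theta is None else theta  # illustrative default
    w = np.ones(n)                             # Winnow's usual all-ones start
    for _ in range(epochs):
        for x, label in zip(X, y):
            y_hat = 1 if np.dot(w, x) >= theta else 0
            if y_hat == 1 and label == 0:      # false positive: demote
                w = w * alpha ** (-x)
            elif y_hat == 0 and label == 1:    # false negative: promote
                w = w * alpha ** x
    return w
```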