Quiz 6

Milestones

Quiz Deadline 2023-08-01T23:59:00-04:00
Late Deadline not applicable
Grading Deadline 2023-08-08T23:59:00-04:00

Rubric

All questions on the quizzes in this course are out of 3 points. When assessing these quizzes, score answers as follows:

  • 3 is reserved for excellent responses that fully explain the answer to the question, show a true understanding of the question being asked, and don’t contain any errors. As a general guideline, we expect no more than 25-30% of answers are likely to receive a 3 on any particular question.
  • 2 is awarded for a good response that gets at the right idea but might not explain it fully or might have a few minor errors. Indeed, a 2 is the score we expect most students to earn. As a general guideline, we expect approximately 50-60% of answers are likely to receive a 2 on any particular question.
  • 1 is awarded for an inadequate response that misses the mark, is wholly incorrect, or otherwise fails to answer what the question is asking.
  • 0 is given to answers that are exceedingly brief, or blank, and as such are effectively non-responses.

The above rubric should be applied for all questions on the quiz.

If you give a score lower than full credit, please include a brief comment in the “Provide comments specific to this submission” section of the form. You can also re-use comments via the “Apply Previously Used Comments” section to save time, so that you don’t find yourself typing the same thing over and over!

Grading Responsibilities

Question 1 Doug
Question 2 Tom
Question 3 Ben
Question 4 Taha

Answer Key

Question 1

  1. “Cats run.” can be derived. S → NP V → N V → Cats run.
  2. “Cats climb trees.” can’t be derived. “climb trees” is a verb phrase that contains a noun, but the grammar here only allows for a single word as the verb.
  3. “Small cats run.” can be derived. S → NP V → A NP V → A N V → Small cats run.
  4. “Small white cats climb.” can be derived. S → NP V → A NP V → A A NP V → A A N V → Small white cats climb. (A parser sketch checking all four sentences follows this list.)
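
If it helps while grading, here is a minimal parser sketch in Python using nltk, assuming a grammar reconstructed from the derivations above (S → NP V, NP → N | A NP, plus the relevant terminals); the quiz’s actual grammar may differ in its exact rules and vocabulary.

```python
# A sketch only: the grammar below is reconstructed from the answer key's
# derivations, not copied from the quiz itself.
import nltk

grammar = nltk.CFG.fromstring("""
    S -> NP V
    NP -> N | A NP
    A -> 'small' | 'white'
    N -> 'cats' | 'trees'
    V -> 'run' | 'climb'
""")
parser = nltk.ChartParser(grammar)

def derivable(sentence):
    """Return True if the grammar can derive the sentence."""
    tokens = sentence.lower().rstrip(".").split()
    return any(True for _ in parser.parse(tokens))

for s in ["Cats run.", "Cats climb trees.", "Small cats run.", "Small white cats climb."]:
    print(s, derivable(s))
```

Under this reconstructed grammar, the script prints True, False, True, True, matching the four answers above.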

Question 2

  1. Recurrent neural networks can process sequences of inputs by accepting each input one at a time and maintaining state to keep track of information. They can also produce sequences of outputs by generating one output value at a time.
  2. Attention mechanisms allow the network to focus on the parts of the input sequence that are most relevant when generating an output token. Different output tokens might depend more heavily on different input tokens, so attention allows the network to learn what to focus on in order to most accurately predict the correct output. (See the attention sketch below.)
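
Here is a minimal numpy sketch of the attention idea from answer 2, assuming simple dot-product attention over made-up encoder states; the shapes and values are illustrative only and not taken from the quiz.

```python
# A toy illustration of dot-product attention; the vectors here are random
# placeholders, not real network states.
import numpy as np

def attention(query, keys, values):
    """Weight each value by how relevant its key is to the query."""
    scores = keys @ query                              # one relevance score per input token
    weights = np.exp(scores) / np.exp(scores).sum()    # softmax over input positions
    return weights @ values, weights                   # weighted sum of the input states

encoder_states = np.random.rand(5, 4)   # 5 input tokens, each a 4-dimensional state
decoder_state = np.random.rand(4)       # the query for the current output token
context, weights = attention(decoder_state, encoder_states, encoder_states)
print(weights)   # higher weight = this output token "focuses" more on that input token
```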

Question 3

  1. Bayes’ rule relates the conditional probability of A given B to the conditional probability of B given A. A Naive Bayes classifier uses this relationship: we give it data about the probability of evidence given a classification, and it uses that to predict the probability of a classification given evidence.
  2. Without smoothing, Naive Bayes runs into problems when it finds evidence that has never appeared for a particular category, and therefore appears to have a probability of 0 of belonging to the category. In those cases, Naive Bayes would assume the category is impossible, ignoring all other evidence, since probabilities are multiplied together.
  3. Naive Bayes classifiers are “naive” in the sense that they assume that all of the evidence is conditionally independent given a particular classification. In practice, this probably isn’t true, but it makes for a useful simplifying assumption. (A small classifier sketch illustrating these points follows this list.)
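
Here is a minimal sketch of the ideas in answers 1-3, assuming made-up word counts and Laplace (add-1) smoothing. The classify function scores each category by P(category) times the product of P(word | category), which by Bayes’ rule is proportional to P(category | words).

```python
# A sketch with invented counts; the categories and vocabulary are illustrative only.
from collections import Counter

categories = {
    "positive": Counter({"great": 8, "fun": 5, "boring": 0}),
    "negative": Counter({"great": 1, "fun": 0, "boring": 7}),
}
priors = {"positive": 0.5, "negative": 0.5}
vocabulary = {"great", "fun", "boring"}

def likelihood(word, category, alpha=1):
    """P(word | category) with add-alpha smoothing, so no estimate is ever exactly 0."""
    counts = categories[category]
    total = sum(counts.values())
    return (counts[word] + alpha) / (total + alpha * len(vocabulary))

def classify(words):
    """Pick the category maximizing P(category) * product of P(word | category)."""
    scores = {}
    for category, prior in priors.items():
        score = prior
        for word in words:
            score *= likelihood(word, category)   # the "naive" independence assumption
        scores[category] = score
    return max(scores, key=scores.get)

print(classify(["great", "fun"]))
```

With alpha set to 0 (no smoothing), the word “fun” would give the negative category a probability of exactly 0, wiping out all other evidence once the probabilities are multiplied together, which is the failure described in answer 2.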

Question 4

  1. One-hot representations end up being much longer vectors: you need as many dimensions as you have words. Distributed representations can allow for shorter vectors, which are easier to work with mathematically.
  2. Distributed representations allow for similar words to have similar representations, which isn’t possible with a one-hot representation. (See the sketch below.)
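
Here is a minimal sketch contrasting the two representations, assuming a made-up three-word vocabulary and hand-picked embedding values.

```python
# Illustrative only: the embedding values are invented, not learned.
import numpy as np

vocabulary = ["cats", "dogs", "trees"]

# One-hot: one dimension per vocabulary word, so vectors grow with the vocabulary,
# and every pair of distinct words looks equally unrelated.
one_hot = {word: np.eye(len(vocabulary))[i] for i, word in enumerate(vocabulary)}

# Distributed: short, dense vectors in which similar words can sit close together.
embedding = {
    "cats":  np.array([0.9, 0.1, 0.3]),
    "dogs":  np.array([0.8, 0.2, 0.3]),
    "trees": np.array([0.1, 0.9, 0.7]),
}

def cosine(u, v):
    """Cosine similarity: 1 when vectors point the same way, 0 when orthogonal."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(one_hot["cats"], one_hot["dogs"]))       # 0.0 -- one-hot captures no similarity
print(cosine(embedding["cats"], embedding["dogs"]))   # close to 1 -- similar words
print(cosine(embedding["cats"], embedding["trees"]))  # smaller -- less similar words
```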