Machine Learning Basics
An introduction to the basics of machine learning theory, covering the common concepts and techniques involved.
Average time per question: 30 sec
Eager Cobra judge
Not an original test. A major part of the text is copied word-for-word from this whitepaper: No sources are credited.
Dark Dodo author
Objection: these are false claims. Your link leads me to a site where I need to download a .pdf file, but only after signing in via a social network. Is it some kind of quiz book or what? Looks incredibly fishy.
I insist that you provide evidence of the "word-for-word" copying. I admitted to unfortunate mistakes, but not this.
Eager Cobra judge
Dark Dodo Just google this query: "For example, machine learning is good at repeatable patterns but fails when something is new" and open the first link. It will show you (without the need to sign in) the source of "This information technology evolution comes on the back of explosion of data in a world gone digital" and other strange phrases from the questions and explanations.
Dark Dodo author
Eager Cobra Thank you for the link, but first of all, "strange phrases" is your personal opinion. Secondly, this phrase looks similar, so what? Do you think every question and explanation could be unique and not match any existing phrase on the Internet?
English is not my native language, so of course I checked some sentences to make them sound more correct. But your .pdf is neither a finished quiz nor even a textbook.

I think nit-picking over "similar sentences" is not applicable here.
Tall Panda judge
Dark Dodo This Quiz was fully re-assessed by an independent Judge.

The original verdict was confirmed to be correct.

In addition to the above issues:
Not an original test. Questions and answers were copied from an existing test. E.g.:
#q2: "Although machine learning is an interesting concept, there are limited business applications in which it is useful. Is it true or false?" <= (see q.1),
#q7: "Which of the following is a technique frequently used in machine learning?" <= (see q.7),
#q21: "In ensemble learning, you aggregate the predictions for weak learners, so that an ensemble of these models will give a better prediction than prediction of individual models. They don’t usually overfit and have high bias. Is it true?" <= (q.21 with little modifications),
#q22: "Suppose we have a dataset which can be trained with 100% accuracy with help of a decision tree of depth 8. Depth 5 will have high bias and low variance. Is it correct?" <= (q.34 with little modifications),
#q28: "What learning rate presented at the picture is the best for a neural network?" <= (q2.a, p.4 with little modifications).
Cuddly Gorilla
Question 19: a previous question says that overfitting happens when the model is too complex. So why should I reduce the model complexity in case of underfitting? Isn't it the opposite?
Dark Dodo author
Thank you, that was an unfortunate mistake; of course it's overfitting. In case of underfitting we need to increase model complexity (or change the model type). I need to fix it.
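The fix described above can be sketched with a toy example (the data and polynomial degrees are illustrative, not from the quiz): increasing model complexity reduces the training error of an underfit model, i.e. lowers its bias.

```python
import numpy as np

# Illustrative data with a clearly nonlinear (cubic) shape.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = x**3 - 2 * x + rng.normal(scale=0.5, size=x.size)

# A degree-1 model underfits (high bias); raising the degree
# increases model complexity and reduces the training error.
errors = {}
for degree in (1, 3):
    coeffs = np.polyfit(x, y, degree)
    errors[degree] = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(f"degree {degree}: training MSE = {errors[degree]:.2f}")
```

The degree-3 fit should show a much lower training error than the degree-1 fit, which is exactly the "increase model complexity" remedy for underfitting.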
Cuddly Gorilla
Dark Dodo hope they will allow edits in phase 2
Dark Dodo author
Cuddly Gorilla hope so
Cuddly Gorilla
Question 2: whether there are "limited" business applications is a trivial question. They are pretty much unlimited; I mean, there are really a lot. But of course there are some limits, as with anything.
Dark Dodo author
It's just the start of the quiz; that's why the question is simple. They are limited, because you should not use ML everywhere and for any problem without proper examination.
Cuddly Gorilla
Dark Dodo ok, I'll try to rewrite it as "is ML suitable for every kind of problem" or something like that, maybe
Dark Dodo author
Cuddly Gorilla yep, will think about it
Cuddly Gorilla
I found it less educational and more of a knowledge challenge for people who study ML.
Able Crow
Good quiz overall! I have a few comments on some questions:
- Q2: I think it is a subjective question
- Q6: I think it is a subjective question; most of the answers are limitations of ML
- Q9: I don't understand why the correct answer is Convolutional; internal memory is an attribute of recurrent NNs. Maybe I misunderstood the meaning of memory in the question?
- Q16: I disagree with the answer: the consequence of underfitting is high bias; the consequence of overfitting is high variance
Able Crow
Sorry, I forgot to provide references for my points:
- Q9:
- Q16: and a quote on this page: "This is because an underfit model has low variance and high bias."
Dark Dodo author
Able Crow First of all, thanks for your review.

Q9: The key point here is not just "learnt memory" but "learnt memory of historical results".

A convolutional neural network (CNN) uses historical results to process new data. A CNN has no notion of order in time, and the only input it considers is the current example it has been exposed to.

Recurrent neural networks, in contrast, consider at each step the decisions they have made previously in time (internal memory/state), not just historical results, i.e. short-term memory.

"Recurrent networks are distinguished from feedforward networks by that feedback loop connected to their past decisions, ingesting their own outputs moment after moment as input."
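The feedback loop quoted above can be sketched with toy numbers (the weights below are hypothetical, chosen only to show the mechanics): a recurrent step feeds its own previous output back in as part of its input, so identical inputs produce different outputs over time.

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8):
    # One recurrent step: the new hidden state depends on the
    # current input x AND the previous hidden state h.
    # Weights w_x, w_h are hypothetical, not learnt here.
    return math.tanh(w_x * x + w_h * h)

h = 0.0
outputs = []
for x in [1.0, 1.0, 1.0]:   # three identical inputs...
    h = rnn_step(x, h)
    outputs.append(h)
print(outputs)               # ...three different outputs, because of the feedback loop
```

A feedforward layer given the same input three times would produce the same output three times; the changing hidden state is what the "memory" in the question refers to.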

Q16+Q19: Yeah, I completely mixed up underfitting/overfitting. Of course it's high bias. I should have double-checked before submitting. Even in the explanation I write about "when model is too simple with regards to the data it's trying to model", i.e. a simple model by definition cannot suffer from high variance.
Able Crow
Dark Dodo thank you for your explanation. I am still puzzled. I agree that CNNs have no notion of order in time and only consider the current example. What they basically do is apply a sliding window over the sample and compute a convolution of the sliding window with the sample.
Still, I don't understand how this is "learnt memory of historical results": if you consider two sliding windows on the same sample, just separated by a few points or pixels, the second sliding window does not learn from the first one.
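The independence described above can be seen in a minimal 1-D convolution sketch (the sample and kernel values are toy numbers, not from the quiz): each window's output depends only on that window and the shared kernel, never on the result of an earlier window.

```python
# Minimal 1-D convolution: slide a kernel over the sample.
# Each output depends only on its own window and the shared
# (learnt) kernel weights; windows do not learn from each other.
sample = [1, 2, 3, 4, 5]
kernel = [0.5, 0.5]  # hypothetical learnt weights

def conv1d(xs, k):
    n = len(k)
    return [sum(x * w for x, w in zip(xs[i:i + n], k))
            for i in range(len(xs) - n + 1)]

print(conv1d(sample, kernel))  # [1.5, 2.5, 3.5, 4.5]
```

Reordering or removing one window leaves every other window's result unchanged, which is the contrast with the recurrent step, where each output feeds into the next.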