Notes on "Generating Beatles’ Lyrics with Machine Learning"
Apparatus
| | URL | Notes |
|---|---|---|
| code | https://github.com/EugenHotaj/beatles | |
| dataset | http://toti.eu.com/beatles/index.asp | github.com/EugenHotaj/beatles/blob/master/scraper.py |
| embeddings | | |
| algorithm | | |
Hypothesis
Aim
Questions
- How might one generate sentences from a language model?
Method
Factoids
Unigram model
Ignores any conditioning and simply samples each next word at random from the training data, so every word is drawn independently.
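A minimal sketch of that idea: sample each word independently from the training text. The toy corpus here is a made-up stand-in; the article trains on scraped Beatles lyrics.

```python
import random

# Hypothetical toy corpus; the article uses scraped Beatles lyrics instead.
corpus = "love love me do you know i love you".split()

def unigram_generate(corpus, length=5, seed=0):
    """Sample each word independently (no conditioning on context)."""
    rng = random.Random(seed)
    return " ".join(rng.choice(corpus) for _ in range(length))

print(unigram_generate(corpus))
```

Because nothing conditions on the previous word, the output reflects word frequencies but not word order.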
Bigram model
A step above the unigram model.
As the name suggests, the bigram model learns a distribution over the next word conditioned only on the previous word.
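A sketch of the bigram idea, under the same toy-corpus assumption as above: record which words follow each word in the training data, then sample the next word from the followers of the current word.

```python
import random
from collections import defaultdict

# Hypothetical toy corpus; the article uses scraped Beatles lyrics instead.
corpus = "love love me do you know i love you".split()

def train_bigrams(words):
    """Map each word to the list of words observed immediately after it."""
    followers = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        followers[prev].append(nxt)
    return followers

def bigram_generate(followers, start, length=5, seed=0):
    """Sample each next word conditioned only on the previous word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = followers.get(out[-1])
        if not options:  # dead end: no observed follower
            break
        out.append(rng.choice(options))
    return " ".join(out)

followers = train_bigrams(corpus)
print(bigram_generate(followers, "love"))
```

Every adjacent pair in the output is a word pair that actually occurs in the training data, which is exactly the step up from the unigram model.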