12 Days of Christmas Lyrics Generated by an AI (Sort of)

While many recent models perform quite well on the tasks they are built for, I am more interested in the relatively small region of the graph where the model goofs, i.e. messes up.
Depending on the model and what it is supposed to do, these goofs can go very wrong, giving you results that range from weird, absurd and uncanny to hilarious and outright comical.
From time to time I will post some of these goofy mistakes I experience while playing around with these demos and pre-trained models.
Today, I present "The 12 Days of Christmas" as generated by a GPT-2 model fine-tuned by Hugging Face on papers from arXiv.org.
The 12 Days of Christmas is a popular Christmas song. Hugging Face is a company that specializes in AI and machine learning solutions, most notably known for its work on the "Transformer", a neural network architecture for NLP that has grown in popularity in recent months.
According to the description for this particular pre-trained model (christened Arxiv-NLP), it is "built on the OpenAI GPT-2 model [and] the Hugging Face team has fine-tuned the small version [of GPT-2] on a tiny dataset (60MB of text) of Arxiv papers. The targeted subject is Natural Language Processing, resulting in a very Linguistics/Deep Learning oriented generation".
In other words, the model was trained on samples of text from academic papers hosted on arXiv.org (pronounced "archive dot org"), a repository of scientific papers popular in the machine learning community. Since the papers used to train this model relate specifically to Natural Language Processing, the model should perform very well on topics related to deep learning.
Instead, I chose to use it to generate a Christmas carol by supplying the first six lines of the real song (12 down to 7) and using the model to generate the remaining six (6 down to 1).
And here is what I got:
12 Days of Christmas (featuring GPT-2, Arxiv-NLP and Me)
12 drummers drumming
11 pipers piping
10 lords a-leaping
9 ladies dancing
8 maids a-milking
7 swans a-swimming
6 sphinx a-thonging
5 dungans a-singing
4 maids a-snorting
3 heirs a-shoe-stitching
2 moles a-spankery
1 puss a-banging
(I couldn't stop laughing at some of the generations, like "3 heirs a-shoe-stitching". Huh?)
Here's what the original song says from line 6 to line 1:
6 geese a-laying
5 gold rings
4 calling birds
3 french hens
2 turtle doves
And a partridge in a pear tree
In case you'd like to try this for yourself, I will provide the link, but first you should know that I did not generate all of this in one go, and your own results might differ a little or a lot.
You see, the way the demo works is that you provide some starting words or phrases and then click a button to generate suggestions. The model suggests three words or phrases at a time, and you have the option to select any one of the three or generate a new set of three suggestions. The suggestions you select then influence the subsequent suggestions.
Having said that, here is the link to the demo: https://transformer.huggingface.co/doc/arxiv-nlp
To generate a suggestion, write some words or a sentence and then hit the Tab key or click on "Trigger Autocomplete".
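The suggest-and-select loop the demo runs can be sketched in plain Python. Note this is only a toy illustration: `suggest()` and `generate_carol()` are hypothetical names I made up, and the toy `suggest()` just samples phrases at random, whereas the real demo queries a GPT-2 model server-side.

```python
import random

# Hypothetical candidate phrases; the real demo generates these with GPT-2.
CANDIDATES = [
    "geese a-laying", "sphinx a-thonging", "moles a-spankery",
    "dungans a-singing", "maids a-snorting", "heirs a-shoe-stitching",
    "puss a-banging",
]

def suggest(context, k=3, rng=random.Random(42)):
    """Return k candidate continuations (a toy stand-in for the model)."""
    return rng.sample(CANDIDATES, k)

def generate_carol(seed_line="7 swans a-swimming", pick=0):
    """Count down from 6 to 1, taking one of three suggestions each round."""
    context = seed_line
    lines = []
    for n in range(6, 0, -1):
        options = suggest(context)     # three suggestions at a time
        line = f"{n} {options[pick]}"  # the user picks one of the three
        lines.append(line)
        # In the real demo, the growing context shapes later suggestions;
        # the toy suggest() above ignores it.
        context += "\n" + line
    return lines

print("\n".join(generate_carol()))
```

The key design point is that each accepted suggestion is appended to the context before the next request, which is why your choices steer what the model proposes next.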
See other demos from Hugging Face here: https://transformer.huggingface.co/
For more on using the model demos, see this Medium article.