In yesterday's live stream (Live 66), I shared the automation I have in place to prepare each live stream and walked through a p5.js template I created to export sketches as GIF animations. I also did a hands-on coding exercise using pre-trained text embeddings from TensorFlow Hub to train a sentiment analysis model that rates IMDb reviews from zero to one: numbers close to zero represent a bad review, and numbers close to one mean the review says good things about the movie being rated.
TensorFlow Hub and TensorFlow Datasets make it extremely easy to get a well-performing machine learning model in no time. The challenge is preparing a custom dataset with data specific to the problem you're trying to solve.
More specifically, we used nnlm-en-dim50, a token-based text embedding trained on the English Google News 7B corpus dataset. Borrowing from Aurélien Géron's Hands-On Machine Learning book, "an embedding is a trainable dense vector that represents a category or token."
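To make the idea of "a trainable dense vector that represents a token" concrete, here is a toy NumPy sketch. The vocabulary, the random initialization, and the averaging scheme are all illustrative assumptions, not the actual nnlm-en-dim50 weights or architecture; they only show how token vectors can be combined into a single fixed-size sentence embedding.

```python
import numpy as np

# Toy token-based embedding: each token in a small, hypothetical vocabulary
# maps to a dense 50-dimensional vector. In a real model these vectors are
# trained; here they are just random numbers for illustration.
rng = np.random.default_rng(42)
EMBED_DIM = 50
vocab = {"the": 0, "movie": 1, "was": 2, "great": 3, "terrible": 4}

# One 50-dimensional row per token in the vocabulary.
embeddings = rng.normal(size=(len(vocab), EMBED_DIM))

def embed_sentence(sentence: str) -> np.ndarray:
    """Average the embeddings of the known tokens in the sentence."""
    ids = [vocab[t] for t in sentence.lower().split() if t in vocab]
    return embeddings[ids].mean(axis=0)

vec = embed_sentence("the movie was great")
print(vec.shape)  # (50,)
```

Whatever the sentence length, the output is a fixed 50-dimensional vector, which is what lets the embedding act as the first layer of a downstream classifier.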
In this model, the trainable dense vector is a 50-dimensional tensor of floats containing a set of features that we repurposed to train a review-rating model. The embedding becomes the first layer of a Keras sequential model thanks to the hub.KerasLayer() method, after which we add two Dense layers: one of 128 units using the ReLU activation function, and another of 1 unit using the sigmoid activation function.
The model is compiled using the binary_crossentropy loss, the Adam optimizer, and the accuracy metric. We then fit the model for fifty epochs to an IMDb review dataset downloaded using TensorFlow Datasets.
The sentiment analysis notebook shows how the model rates a good and a bad review after zero, five, and fifty epochs of training. As the model learns, it goes from not being able to discern whether a review is good or bad to predicting with great accuracy. I then grabbed a 10/10 and a 1/10 review from The Batman's page on IMDb and obtained great results with the model trained for fifty epochs.
You can spread the word by liking and sharing this tweet.
Thanks for watching.
See you next week!