

Recurrent Neural Networks (RNNs) for Language Modeling with Keras

via DataCamp

Overview

Learn how to use RNNs to classify text sentiment, generate sentences, and translate text between languages.

Machine learning models make predictions and classifications from numerical values, so how can computers deal with text data? With the huge increase in available text data, applications such as automatic document classification, text generation, and neural machine translation have become possible. In this course, you will learn how to use recurrent neural networks to classify text (binary and multi-class), generate phrases that mimic the character Sheldon from the TV show The Big Bang Theory, and translate Portuguese sentences into English. Are you ready to start your journey into language models with Keras and Python? Dive in!

Syllabus

  • Recurrent Neural Networks and Keras
    • In this chapter, you will learn the foundations of recurrent neural networks (RNNs), starting with some prerequisites, continuing with how information flows through the network, and finishing with how to implement such models with Keras for the sentiment classification task (a minimal sketch follows this syllabus).
  • RNN Architecture
    • You will learn about the vanishing and exploding gradient problems that often occur in RNNs, and how to address them with GRU and LSTM cells.
      Furthermore, you'll create embedding layers for language models and revisit the sentiment classification task (see the second sketch after this syllabus).
  • Multi-Class Classification
    • In this chapter, you will learn how to prepare data for the multi-class classification task, as well as the differences between multi-class classification and binary classification (sentiment analysis). Finally, you will learn how to create models and measure their performance with Keras (see the third sketch after this syllabus).
  • Sequence to Sequence Models
    • This chapter introduces you to two applications of RNN models: text generation and neural machine translation. You will learn how to prepare text data in the format needed by the models.
      The text generation model replicates a character's way of speaking, and you will have some fun mimicking Sheldon from The Big Bang Theory (a character-level sketch follows this syllabus).
      Neural machine translation is used, for example, by Google Translate with far more complex models. In this chapter, you will create a model that translates short Portuguese phrases into English.
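
The sketches below are not the course's code; they are minimal illustrations of the techniques each chapter covers, written against TensorFlow's Keras API with assumed vocabulary sizes, sequence lengths, and layer widths. First, a binary sentiment classifier of the kind built in the opening chapter: integer-encoded, padded sequences pass through an embedding layer, a plain recurrent layer, and a sigmoid output.

```python
# A minimal sketch (assumed sizes, dummy data) of a binary sentiment classifier.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, SimpleRNN, Dense

vocab_size = 10_000   # assumed vocabulary size
max_len = 100         # assumed (padded) sequence length

model = Sequential([
    Input(shape=(max_len,)),
    Embedding(vocab_size, 32),        # token ids -> dense vectors
    SimpleRNN(64),                    # plain recurrent layer
    Dense(1, activation="sigmoid"),   # positive/negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy data just to show the expected shapes: integer-encoded, padded sequences.
x = np.random.randint(0, vocab_size, size=(8, max_len))
y = np.random.randint(0, 2, size=(8,))
model.fit(x, y, epochs=1, verbose=0)
```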
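
Second, swapping the plain recurrent layer for a gated cell, as discussed in the RNN Architecture chapter; LSTM and GRU cells help mitigate vanishing and exploding gradients (again, the sizes are placeholders).

```python
# A minimal sketch: the same classifier with a gated cell instead of SimpleRNN.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense

vocab_size, max_len = 10_000, 100  # assumed sizes

model = Sequential([
    Input(shape=(max_len,)),
    Embedding(vocab_size, 64),        # embedding layer for the language model
    LSTM(64),                         # swap for GRU(64) to try the GRU cell
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```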
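
Third, the multi-class setup: the main changes from binary classification are one-hot labels, a softmax output with one unit per class, and categorical cross-entropy (the class count here is an assumed placeholder).

```python
# A minimal sketch of the multi-class variant (assumed class count and sizes).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense
from tensorflow.keras.utils import to_categorical

vocab_size, max_len, num_classes = 10_000, 100, 5  # assumed

model = Sequential([
    Input(shape=(max_len,)),
    Embedding(vocab_size, 64),
    LSTM(64),
    Dense(num_classes, activation="softmax"),       # one probability per class
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Labels become one-hot vectors instead of single 0/1 values.
x = np.random.randint(0, vocab_size, size=(8, max_len))
y = to_categorical(np.random.randint(0, num_classes, size=(8,)), num_classes)
model.fit(x, y, epochs=1, verbose=0)
```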
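
Finally, a character-level text generation sketch of the kind used to mimic a character's speech: slide a window over a corpus and train the network to predict the next character. The toy corpus, window size, and layer sizes are assumptions, not the course's data.

```python
# A minimal sketch of character-level text generation on a toy corpus.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

text = "bazinga! " * 50                       # toy corpus (assumption)
chars = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(chars)}
window = 10                                   # characters of context per sample

# Build (window of characters) -> (next character) pairs as one-hot tensors.
x = np.zeros((len(text) - window, window, len(chars)), dtype="float32")
y = np.zeros((len(text) - window, len(chars)), dtype="float32")
for i in range(len(text) - window):
    for t, c in enumerate(text[i:i + window]):
        x[i, t, char_to_id[c]] = 1.0
    y[i, char_to_id[text[i + window]]] = 1.0

model = Sequential([
    Input(shape=(window, len(chars))),
    LSTM(128),
    Dense(len(chars), activation="softmax"),  # distribution over the next character
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit(x, y, epochs=1, verbose=0)
```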

Taught by

David Cecchini

