
Introduction to Transformer for NLP with Python

Last updated on April 12, 2024 10:03 pm

Description

What you’ll learn

  • Chunking
  • Bag of Words
  • Hugging Face transformer
  • POS tagging
  • TF-IDF
  • GPT-2
  • Token Classification
  • BERT
  • Stemming
  • Lemmatization
  • NER
  • Preprocessing data
  • Attention
  • Fine-tuning

Interested in the field of Natural Language Processing (NLP)? Then this course is for you!

Ever since Transformers arrived on the scene, deep learning hasn’t been the same.

  • Machine learning models can now generate text essentially indistinguishable from text written by humans

  • We’ve reached new state-of-the-art performance in many NLP tasks, such as machine translation, question-answering, entailment, named entity recognition, and more

In this course, you will learn very practical skills for applying transformers, and if you want, the detailed theory behind how transformers and attention work.

This course differs from other courses in several ways. First, it covers the basic natural language processing techniques, so you will understand what natural language processing is before moving on to transformers. Second, it covers GPT-2, NER, and BERT, which are very popular in natural language processing. Finally, it includes many practice projects with detailed, step-by-step notebook explanations that you can revisit whenever you have free time.

The course is split into 4 major parts:

  1. Basic natural language processing

  2. Fundamental Transformers

  3. Text generation with GPT-2

  4. Token classification

PART 1: Basic natural language processing

In this section, you will learn the fundamentals of natural language processing. It is really important to understand basic natural language processing before learning transformers. In this section we will cover the topics below, followed by a short code sketch:

  1. What is natural language processing (NLP)?

  2. What are stemming and lemmatization?

  3. What is chunking?

  4. What is a bag of words?
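
To give a quick flavour of these basics, here is a minimal sketch (not a course notebook) of stemming, lemmatization, and a bag-of-words representation, assuming NLTK and scikit-learn are installed:

    # Minimal sketch of the Part 1 basics, assuming NLTK and scikit-learn.
    import nltk
    from nltk.stem import PorterStemmer, WordNetLemmatizer
    from sklearn.feature_extraction.text import CountVectorizer

    nltk.download("wordnet", quiet=True)   # lookup data for the lemmatizer
    nltk.download("omw-1.4", quiet=True)

    stemmer = PorterStemmer()
    lemmatizer = WordNetLemmatizer()
    print(stemmer.stem("running"))         # "run"   (crude suffix stripping)
    print(lemmatizer.lemmatize("mice"))    # "mouse" (dictionary-based)

    # Bag of words: each document becomes a vector of word counts.
    docs = ["the movie was great", "the movie was terrible"]
    vectorizer = CountVectorizer()
    counts = vectorizer.fit_transform(docs)
    print(vectorizer.get_feature_names_out())
    print(counts.toarray())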

In this section, we will build 3 small projects. These projects are:

  1. Gender identification

  2. Sentiment analyzer

  3. Topic modelling

PART 2: Fundamental Transformers

In this section, you will learn how transformers really work. We will also introduce the Hugging Face Transformers library and GPT-2, so you can see how powerful transformers are.

In this section, we will implement two projects (a short code sketch follows the list).

  • IMDB project

  • Q&A project implementation
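
As a taste of what the Hugging Face library makes possible, here is a minimal sketch (not the course code) of its pipeline API applied to the two project tasks, assuming the transformers package is installed; default pretrained models are downloaded on first use:

    from transformers import pipeline

    # Sentiment analysis: the same task as the IMDB project (positive/negative reviews).
    classifier = pipeline("sentiment-analysis")
    print(classifier("This film was an absolute delight from start to finish."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

    # Extractive question answering, as in the Q&A project.
    qa = pipeline("question-answering")
    print(qa(question="What does GPT-2 do?",
             context="GPT-2 is a transformer model that generates text."))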

PART 3: Project: Text generation with GPT-2

In this project, we will generate text with GPT-2. This is a project for us to practice and reinforce what we have learned so far. It will also demonstrate how quickly text can be generated with a transformer.
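
Here is a minimal sketch of the idea (not the course notebook), assuming the transformers package and PyTorch are installed; the pretrained "gpt2" checkpoint is downloaded on first use:

    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    prompt = "Transformers changed natural language processing because"
    inputs = tokenizer(prompt, return_tensors="pt")

    # Generate up to 40 tokens; do_sample adds randomness so each run differs.
    outputs = model.generate(**inputs, max_length=40, do_sample=True, top_k=50,
                             pad_token_id=tokenizer.eos_token_id)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))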

PART 4: Token classification

In this section, we will learn how to classify text using a transformer. We will also learn about NER, which is a popular application of transformers. The main project in this section is a Q&A project, and it will be more advanced than the previous Q&A project.
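
For a sense of what token classification looks like in practice, here is a minimal sketch (not the course project) using a ready-made Hugging Face NER pipeline, assuming the transformers package is installed:

    from transformers import pipeline

    # Token classification: each word (or sub-word token) gets an entity label.
    ner = pipeline("ner", aggregation_strategy="simple")  # groups sub-word tokens
    text = "Hugging Face was founded in New York City."
    for entity in ner(text):
        print(entity["entity_group"], entity["word"], round(entity["score"], 3))
    # e.g. ORG "Hugging Face", LOC "New York City"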

Who this course is for:

  • Anyone interested in Deep Learning, Machine Learning and Artificial Intelligence
  • Anyone passionate about Artificial Intelligence
  • Anyone interested in Natural Language Processing
  • Data Scientists who want to take their AI Skills to the next level
