Udemy – Data Science: Transformers for Natural Language Processing 2022

Description

Data Science: Transformers for Natural Language Processing is a training course published by Udemy. Welcome to the course! Deep learning hasn’t been the same since transformers hit the scene. Machine learning models can now produce text that is essentially indistinguishable from text written by humans.

We have achieved state-of-the-art performance on many NLP tasks, such as machine translation, question answering, entailment, named entity recognition, and more. We have built multi-modal (text and image) models that can create amazing art from nothing but a text prompt. We have solved a long-standing problem in molecular biology known as “protein structure prediction”. In this course, you will learn very practical skills for using transformers and, if you wish, the detailed theory of how transformers and attention work. This differs from many other resources, which only cover the former.

  • Using transformers
  • Fine-tuning transformers
  • Transformers in depth

Section 1: Using Transformers. In this section, you will learn how to use pre-trained transformers. Training these models costs millions of dollars, so it’s not something you want to do yourself! Instead, we will see how the pre-trained models can be used for a wide range of tasks, including: text classification (e.g. spam detection, sentiment analysis, document classification), named entity recognition, summarization, machine translation, question answering, text generation (believable, human-like text), masked language modeling (article spinning), and zero-shot classification. This is extremely practical.

If you need to perform sentiment analysis, document categorization, entity detection, translation, summarization, etc. on documents at work or for your clients, you now have access to the most powerful state-of-the-art models with very few lines of code, as in the sketch below.
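
As a taste of what this looks like in practice, here is a minimal sketch using the Hugging Face pipeline API, which the course is built around. The model defaults, example inputs, and printed outputs are illustrative, not the course’s exact code.

# A quick taste of pre-trained transformers via Hugging Face pipelines.
# Requires: pip install transformers (plus a backend such as PyTorch).
from transformers import pipeline

# Sentiment analysis with the default pre-trained checkpoint.
sentiment = pipeline("sentiment-analysis")
print(sentiment("This course finally made attention click for me!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]

# Zero-shot classification: label text with categories it was never trained on.
zero_shot = pipeline("zero-shot-classification")
print(zero_shot(
    "The model translated the document almost perfectly.",
    candidate_labels=["machine translation", "cooking", "sports"],
))

Each pipeline downloads a sensible default model on first use; you can also pass a specific model name if you want a particular checkpoint.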

Part 2: Fine-Tuning Transformers. In this part, you’ll learn how to improve the performance of transformers on your own custom datasets. With transfer learning, you can leverage the millions of dollars of training that have already gone into making transformers work so well. You will find that you can fine-tune a transformer with relatively little work (and low cost). We’ll cover how to fine-tune transformers for the most practical real-world tasks, such as text classification (sentiment analysis, spam detection), entity detection, and machine translation. A rough sketch of the workflow follows.
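
For illustration, here is a sketch of that fine-tuning workflow using the Hugging Face Trainer API. The checkpoint, dataset (IMDB sentiment), and hyperparameters below are placeholder choices of my own, not the course’s exact settings.

# Sketch: fine-tune a pre-trained transformer for text classification.
# Requires: pip install transformers datasets
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

checkpoint = "distilbert-base-uncased"  # small pre-trained model (placeholder choice)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Any labeled text dataset works; IMDB movie reviews are used here as an example.
dataset = load_dataset("imdb")
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

args = TrainingArguments(output_dir="finetuned-model",
                         num_train_epochs=1,
                         per_device_train_batch_size=16)

trainer = Trainer(model=model,
                  args=args,
                  train_dataset=tokenized["train"],
                  eval_dataset=tokenized["test"],
                  tokenizer=tokenizer)  # tokenizer enables dynamic padding per batch
trainer.train()

Because the model starts from pre-trained weights, even a single epoch on a modest GPU is often enough to reach strong results.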

Part 3: Transformers in Depth. In this part, you will learn how transformers really work. The earlier parts are nice, but perhaps a little too nice: libraries are fine for people who just want to get the job done, but they won’t get you far if you want to do something new or interesting. Make no mistake, this part is very practical too. How practical, you might ask? Well, this is where the big money is.

Those who have a deep understanding of these models and can do things no one has done before are in a position to command higher salaries and more prestigious titles. Machine learning is a competitive field, and a deep understanding of how things work can be the edge you need to excel. We will also look at how to implement transformers from scratch; a taste of that is sketched below. As the great Richard Feynman said, “What I cannot create, I do not understand.”
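
To give a flavor of the from-scratch material, here is a bare-bones sketch of scaled dot-product self-attention, the core operation inside a transformer, written in PyTorch. It is a single-head illustration with no masking or trainable modules, not the course’s reference implementation.

# Scaled dot-product self-attention (single head, no masking).
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """x: (batch, seq_len, d_model); w_*: (d_model, d_k) projection matrices."""
    q = x @ w_q                                      # queries
    k = x @ w_k                                      # keys
    v = x @ w_v                                      # values
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5    # (batch, seq_len, seq_len)
    weights = F.softmax(scores, dim=-1)              # attention weights sum to 1 per query
    return weights @ v                               # each position mixes all value vectors

# Toy usage with random data.
d_model, d_k = 16, 8
x = torch.randn(2, 5, d_model)                       # batch of 2 sequences, 5 tokens each
w_q, w_k, w_v = (torch.randn(d_model, d_k) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)        # torch.Size([2, 5, 8])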

Suggested prerequisites: decent Python coding skills. Deep learning with CNNs and RNNs is helpful but not required, and deep learning with seq2seq models is helpful but not required. For the in-depth part, understanding the theory behind CNNs, RNNs, and seq2seq models is very helpful. Updates to look forward to: more detailed fine-tuning applications and deeper conceptual lectures on implementing transformers from scratch.

Who is this course suitable for?

  • Anyone who wants to master Natural Language Processing (NLP).
  • Anyone who loves deep learning and wants to learn about the most powerful neural network architecture (the transformer).
  • Anyone who wants to go beyond the typical beginner-only courses on Udemy.

What you will learn in the Data Science: Transformers for Natural Language Processing course:

  • Apply transformers to real-world tasks with just a few lines of code
  • Fine-tune transformers on your own dataset with transfer learning
  • Sentiment analysis, spam detection, named entity recognition (NER), and text classification
  • Parts-of-speech tagging
  • Create your own article spinner for SEO
  • Produce believable human-like text
  • Neural machine translation and text summarization
  • Question answering (e.g. SQuAD)
  • Zero-shot classification
  • Understand self-attention and the in-depth theory behind transformers
  • Implement transformers from scratch
  • Use transformers with TensorFlow and PyTorch
  • Know BERT, GPT, GPT-2, and GPT-3 and where to apply them
  • Understand encoder, decoder, and seq2seq architectures
  • Master the Hugging Face Python library

Course details:

  • Publisher: Udemy
  • Instructor: Lazy Programmer Team, Lazy Programmer Inc
  • Language: English
  • Training level: introductory to advanced
  • Number of lectures: 85
  • Training duration: 12 hours 13 minutes


Prerequisites of the Data Science: Transformers for Natural Language Processing course:

  • Install Python, it’s free!
  • Beginner and intermediate level content: Decent Python programming skills
  • Expert level content: Good understanding of CNNs and RNNs, and the ability to code in PyTorch or TensorFlow


Installation guide:

After extracting, watch with your favorite player.

Subtitles: English

Quality: 720p

Download links

Download part 1 – 1 GB
Download part 2 – 1 GB
Download part 3 – 703 MB
File password link