
Students: 50 | Lectures: 121 | Course Duration: Unlimited | Updated: August 28, 2022

Natural language processing (NLP) refers to the branch of computer science—and more specifically, the branch of artificial intelligence or AI—concerned with giving computers the ability to understand text and spoken words in much the same way human beings can.
Instructor: Annapurna Singh (24 courses, 490 students)

A natural language is one that has developed and evolved through use by human beings, as opposed to an invented or constructed language such as a computer programming language. Natural language is characterized by ambiguity that artificial intelligence struggles to interpret.

The main benefit of NLP is that it improves the way humans and computers communicate with each other. The most direct way to manipulate a computer is through code, the computer's language. By enabling computers to understand human language, interacting with computers becomes much more intuitive for humans.

Human language is filled with ambiguities that make it incredibly difficult to write software that accurately determines the intended meaning of text or voice data. Homonyms, homophones, sarcasm, idioms, metaphors, grammar and usage exceptions, variations in sentence structure—these are just a few of the irregularities of human language that take humans years to learn, but that programmers must teach natural language-driven applications to recognize and understand accurately from the start if those applications are going to be useful.

Several NLP tasks break down human text and voice data in ways that help the computer make sense of what it's ingesting. Some of these tasks include the following:

  • Speech recognition, also called speech-to-text, is the task of reliably converting voice data into text data. Speech recognition is required for any application that follows voice commands or answers spoken questions. What makes speech recognition especially challenging is the way people talk—quickly, slurring words together, with varying emphasis and intonation, in different accents, and often using incorrect grammar.
  • Part of speech tagging, also called grammatical tagging, is the process of determining the part of speech of a particular word or piece of text based on its use and context. Part of speech identifies ‘make’ as a verb in ‘I can make a paper plane,’ and as a noun in ‘What make of car do you own?’
  • Word sense disambiguation is the selection of the meaning of a word with multiple meanings through a process of semantic analysis that determines the meaning that makes the most sense in the given context. For example, word sense disambiguation helps distinguish the meaning of the verb 'make' in ‘make the grade’ (achieve) vs. ‘make a bet’ (place).
  • Named entity recognition, or NER, identifies words or phrases as useful entities. NER identifies ‘Kentucky’ as a location or ‘Fred’ as a person's name.
  • Co-reference resolution is the task of identifying if and when two words refer to the same entity. The most common example is determining the person or object to which a certain pronoun refers (e.g., ‘she’ = ‘Mary’), but it can also involve identifying a metaphor or an idiom in the text (e.g., an instance in which 'bear' isn't an animal but a large hairy person).
  • Sentiment analysis attempts to extract subjective qualities—attitudes, emotions, sarcasm, confusion, suspicion—from text.
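Several of these tasks can be approximated with simple heuristics. As a toy illustration of word sense disambiguation, the sketch below uses a simplified Lesk-style approach: pick the sense whose dictionary gloss shares the most words with the surrounding sentence. The glosses and names here are illustrative, not drawn from any real lexicon or library:

```python
# Toy word-sense disambiguation in the spirit of the simplified Lesk
# algorithm: choose the sense whose gloss overlaps most with the
# sentence's context words. The glosses below are made up for
# illustration, not taken from a real dictionary.
SENSES = {
    "make": {
        "create": "to build construct or produce something such as a paper plane",
        "brand": "the manufacturer or brand of a product such as a car",
    }
}

def disambiguate(word, sentence):
    """Return the sense whose gloss shares the most words with the sentence."""
    context = set(sentence.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in SENSES[word].items():
        overlap = len(context & set(gloss.split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("make", "I can make a paper plane"))    # create
print(disambiguate("make", "What make of car do you own"))  # brand
```

Real systems replace the hand-written glosses with a lexical database such as WordNet and use far richer context features, but the underlying idea—scoring candidate senses against the surrounding words—is the same.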
    • Theory Unlimited
    • Why is Natural Language Processing so difficult? Unlimited
    • NLP Components Unlimited
    • Stemming and Lemmatization Unlimited
    • Stop Words Unlimited
    • Parts-Of-Speech (POS) Tagging and Chunking Unlimited
    • Named Entity Recognition (NER) Unlimited
    • AIM Unlimited
    • Summary Unlimited
    • MODULE 1. INTRODUCTION TO NLP QUIZ 00:05:00
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • Word Vectorization (Word Embedding) Unlimited
    • Term Frequency – Inverse Document Frequency (TF – IDF) Unlimited
    • AIM Unlimited
    • Summary Unlimited
    • MODULE 2. WORD VECTORIZATION QUIZ 00:05:00
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • How Does Text Classification Work? Unlimited
    • Text Classification with Machine Learning Unlimited
    • Logistic Regression Unlimited
    • Evaluation of Classification Model Unlimited
    • Problem Statement Unlimited
    • Support Vector Machines Unlimited
    • SUMMARY Unlimited
    • Module 3. Text Classification with Machine Learning – I Quiz 00:05:00
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • Euclidean Distance Unlimited
    • Choosing the value of K Unlimited
    • Pros and Cons of KNN Algorithm Unlimited
    • The problem statement Unlimited
    • How does the Naive Bayes algorithm work on text classification? Unlimited
    • SUMMARY Unlimited
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • Why Word2Vec? Unlimited
    • Visual Representation of Word2Vec Unlimited
    • CBOW (Continuous Bag of Words) Unlimited
    • How does the algorithm calculate the distance between words? Unlimited
    • Word2Vec Practical Implementation Unlimited
    • Word2Vec with gensim Unlimited
    • Tokenization Unlimited
    • Building Word2Vec by using the Skip Gram method Unlimited
    • Case Study Unlimited
    • Treat Missing Data Unlimited
    • Data preprocessing Unlimited
    • Visualize Word2Vec Embeddings Unlimited
    • Start Recommending Products Unlimited
    • Summary Unlimited
    • Module 5. Word Embedding Quiz 00:05:00
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • What is Deep Learning? Unlimited
    • What is Artificial Neural Network? Unlimited
    • Why is Deep Learning Important? Unlimited
    • McCulloch and Pitts Neuron Unlimited
    • Feed Forward ANN Unlimited
    • Weights Initialization Techniques Unlimited
    • Why do we need non-linear activation functions? Unlimited
    • SUMMARY Unlimited
    • Module 6. Deep Learning Quiz 00:05:00
    • Theory Unlimited
    • How Back Propagation Algorithm Works Unlimited
    • Contribution of each Weight and Bias on the Error Unlimited
    • Optimizers Unlimited
    • RECURRENT NEURAL NETWORKS Unlimited
    • What are Recurrent Neural Networks (RNNs)? Unlimited
    • Different types of RNN Unlimited
    • Understanding a Recurrent Neuron in Detail Unlimited
    • Back propagation through TIME of RNN Unlimited
    • SUMMARY Unlimited
    • Module 7. Deep Learning – Part II Quiz 00:05:00
    • Theory Unlimited
    • Recurrent Neural Networks (RNNs) Unlimited
    • RNNs and Vanishing gradients Unlimited
    • Detailed Explanation of LSTM architecture Unlimited
    • LSTM Model Unlimited
    • LSTM model derivation Unlimited
    • BiLSTM Unlimited
    • Problem Statement Unlimited
    • Prediction on the test dataset Unlimited
    • SUMMARY Unlimited
    • Theory Unlimited
    • Introduction Unlimited
    • Sequence Modelling Problems Unlimited
    • The Architecture of Encoder-Decoder models Unlimited
    • The Neural Machine Translation Problem Unlimited
    • High-Level Overview Unlimited
    • What is under the hood? Unlimited
    • The Encoder Block Unlimited
    • The Decoder Block Unlimited
    • Training and Testing Phase Unlimited
    • The Embedding Layer Unlimited
    • The Final Visualization at test time Unlimited
    • Algorithm Unlimited
    • SUMMARY Unlimited
    • Module 9. Sequence-to-Sequence modeling Quiz 00:05:00
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • Introduction Unlimited
    • Global vs Local Attention Unlimited
    • Transformers – Attention is All You Need Unlimited
    • Calculating Self-Attention Unlimited
    • Understanding Transformer’s Model Architecture Unlimited
    • Transformer Limitations Unlimited
    • Understanding Transformer-XL Unlimited
    • Using Transformer for Language Modeling Unlimited
    • Problem Statement Unlimited
    • SUMMARY Unlimited
    • Module 10. Attention & Transformers Quiz 00:05:00
    • Programming Assignment 1 week, 3 days
    • Theory Unlimited
    • Introduction Unlimited
    • What is BERT? Unlimited
    • Enter ELMo and ULMFiT Unlimited
    • How Does BERT Work? Unlimited
    • SUMMARY Unlimited
    • Module 11. BERT Quiz 00:05:00
    • Programming Assignment 1 week, 3 days
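As a taste of the Module 2 material on TF-IDF, the weighting can be sketched in a few lines of plain Python. The tiny corpus and function names here are illustrative, not from the course materials; this uses the basic formulation tf = count/len(doc) and idf = log(N/df):

```python
# Minimal TF-IDF sketch: a term is weighted up when it is frequent in a
# document (tf) and weighted down when it appears in many documents (idf).
import math

docs = [
    "the cat sat on the mat".split(),
    "the dog chased the cat".split(),
    "dogs and cats make good pets".split(),
]

def tf_idf(term, doc, docs):
    tf = doc.count(term) / len(doc)          # term frequency in this document
    df = sum(1 for d in docs if term in d)   # number of documents containing the term
    idf = math.log(len(docs) / df)           # inverse document frequency
    return tf * idf

# "cat" appears in two of the three documents while "mat" appears in only
# one, so "mat" receives the higher weight in the first document.
print(tf_idf("mat", docs[0], docs))
print(tf_idf("cat", docs[0], docs))
```

Production code would use a library implementation (e.g., scikit-learn's TfidfVectorizer) with smoothing and normalization, but the core idea is exactly this product of two counts.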

Managed by Ernesto.net © 2021. All rights reserved.