Natural language processing (NLP) and natural language understanding (NLU) are closely related, like siblings; NLU is generally considered a subfield of NLP.
Example NLP problems
- Classification —> What tags should this article have? (multiple label options per sample) [Text Classification].

- Text Generation —> Training a model on all of Shakespeare's work, then using it to generate new Shakespeare-style poems / text.

- Machine Translation —> We input one sequence of words and get back another sequence of words in the target language.
Many NLP and NLU problems are referred to as sequence-to-sequence (seq2seq) problems.

- Voice Assistant —> Takes in one sequence (sound waves) and converts it into text, then derives meaning from that text.
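The classification example above allows several tags per article. A common way to encode such multi-label targets is a multi-hot vector; a toy sketch in plain Python (the tag vocabulary below is made up for illustration):

```python
# Toy sketch of multi-label classification targets: each article can
# carry several tags at once, encoded as a 0/1 multi-hot vector.
# The tag vocabulary below is a made-up illustration.
ALL_TAGS = ["python", "machine-learning", "nlp", "web-dev"]

def multi_hot(tags):
    """Turn a list of tags into a 0/1 vector over the tag vocabulary."""
    return [1 if t in tags else 0 for t in ALL_TAGS]

article_tags = ["nlp", "machine-learning"]   # one sample, multiple labels
print(multi_hot(article_tags))               # [0, 1, 1, 0]
```

A model trained on such targets would output one probability per tag rather than a single class.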


Other sequence problems


- Many to one —> A sequence of many inputs maps to one output.
- Sentiment Analysis: given a sequence of words, the model analyzes the sentiment of the sequence and outputs a single label (Positive or Negative).
- Time Series Forecasting: the input is the historical price of Bitcoin and the output is the predicted price on a chosen day.
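The many-to-one shape can be sketched without any trained model: a whole sequence of tokens goes in, one label comes out. The word-score lexicon below is a made-up stand-in for what a real model would learn:

```python
# Toy many-to-one sketch (not a trained model): a sequence of words
# is reduced to a single sentiment label by averaging per-word scores.
# The lexicon below is a made-up illustration, not learned weights.
LEXICON = {"great": 1.0, "love": 1.0, "terrible": -1.0, "hate": -1.0}

def sentiment(tokens):
    scores = [LEXICON.get(t, 0.0) for t in tokens]
    avg = sum(scores) / len(scores)
    return "Positive" if avg >= 0 else "Negative"

print(sentiment("i love this great movie".split()))  # many tokens in -> one label out
print(sentiment("what a terrible ending".split()))
```

A real sentiment model replaces the fixed lexicon with learned parameters, but the input/output shape is the same: many steps in, one prediction out.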


- Many to many —> A sequence of many inputs maps to a sequence of many outputs.
- Here the inputs and outputs are not synchronised; the best example is Machine Translation: the input is multiple words and the output is another sequence of words, and the two need not line up step for step.
- In NLP, words are also known as tokens.
- Many to many (synchronised) —> one output per input step, e.g. labelling every word in a sentence.
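The synchronised many-to-many shape can be sketched with a toy tagger: the output sequence has exactly one label per input token, aligned step for step (the tagging rules below are made up for illustration):

```python
# Toy synchronised many-to-many sketch: one output per input step,
# same length in and out. The tagging rules are a made-up illustration.
def tag(tokens):
    tags = []
    for t in tokens:
        if t.isdigit():
            tags.append("NUM")       # token is a number
        elif t[0].isupper():
            tags.append("CAP")       # token starts with a capital letter
        else:
            tags.append("WORD")      # everything else
    return tags

tokens = "London has 9 million people".split()
print(list(zip(tokens, tag(tokens))))  # output sequence aligned with input
```

Real synchronised many-to-many tasks like part-of-speech tagging work the same way shape-wise: the model emits one prediction per token.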
NLP Inputs and Outputs

- First, convert whatever text (sequence) is available into a numerical encoding. We perform tokenization and embedding here to turn the text into numbers.
- Then we feed these inputs into a machine learning algorithm so the model can learn patterns in the data.
- The output is a prediction probability over two classes, in our case Disaster and Not Disaster.
- The predicted outputs come from the model having seen many actual (labelled) outputs.
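The tokenization and embedding steps above can be sketched in plain Python: split text into tokens, map each token to an integer id, then look up a vector per id. The example texts and the 4-dimensional random vectors are placeholders; in a real model the embeddings are learned during training:

```python
import random

# Toy sketch of turning text into numerical encodings:
# 1) tokenize, 2) map tokens to integer ids, 3) look up a vector per id.
# The random 4-dimensional embeddings stand in for learned ones.
random.seed(0)

texts = ["forest fire near la ronge", "i love this song"]
vocab = {}                      # token -> integer id
for text in texts:
    for token in text.split():  # naive whitespace tokenization
        vocab.setdefault(token, len(vocab))

EMBED_DIM = 4
embeddings = {i: [random.random() for _ in range(EMBED_DIM)]
              for i in vocab.values()}

ids = [vocab[t] for t in texts[0].split()]      # tokens -> ids
vectors = [embeddings[i] for i in ids]          # ids -> dense vectors
print(ids)                                      # [0, 1, 2, 3, 4]
print(len(vectors), len(vectors[0]))            # 5 vectors of dimension 4
```

These dense vectors are what actually get fed to the learning algorithm in the next step.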