Introduction


Figure 1

NLP is an interdisciplinary field, and LLMs are just a subset of it

Figure 2

ChatGPT Just Works! Does it…?

Figure 3

GPT-4 Tokenization Example
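The figure's topic, subword tokenization, can be sketched with a greedy longest-match tokenizer. The vocabulary below is invented for illustration; GPT-4 actually uses a learned byte-pair-encoding vocabulary of roughly 100k tokens, so this is a simplification of the idea, not the real algorithm.

```python
# Toy greedy longest-match subword tokenizer (illustration only; the
# vocabulary here is hand-picked, not learned as in GPT-4's BPE).
VOCAB = {"token", "iz", "ation", "un", "break", "able"}

def tokenize(word: str) -> list[str]:
    """Split a word into the longest matching vocabulary entries, left to right."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest possible match first.
        for j in range(len(word), i, -1):
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            # Unknown character: emit it as its own single-character token.
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenize("tokenization"))  # -> ['token', 'iz', 'ation']
print(tokenize("unbreakable"))   # -> ['un', 'break', 'able']
```

Note how "tokenization" is not split at linguistic morpheme boundaries; real BPE merges are driven purely by corpus frequency.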

Figure 4

Unsupervised Learning

Figure 5

Supervised Learning

Figure 6

A taxonomy of NLP Tasks

Figure 7

Levels of Language

Figure 8

Diagram showing building blocks of language

From words to vectors


Figure 1


Figure 2


Figure 3


Figure 4


Figure 5
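The section's core idea, representing words as vectors so that similar words end up close together, can be sketched with toy co-occurrence counts and cosine similarity. The vectors below are invented for illustration; real systems learn dense embeddings (e.g. word2vec or GloVe) from large corpora.

```python
import math

# Hand-made co-occurrence vectors: counts of how often each word appears
# near the context words (pet, fur, engine). Illustration only.
vectors = {
    "cat": [4, 3, 0],
    "dog": [5, 2, 0],
    "car": [0, 0, 6],
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(round(cosine(vectors["cat"], vectors["dog"]), 3))  # high: shared contexts
print(round(cosine(vectors["cat"], vectors["car"]), 3))  # 0.0: no shared contexts
```

The geometry is the point: once words are vectors, "similar meaning" becomes a measurable angle.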


Transformers: BERT and Beyond

Transformers
BERT
BERT Architecture
BERT as a Language Model
BERT for Text Classification
BERT for Token Classification


Figure 1

Transformer Architecture

Figure 2

BERT Architecture

Figure 3

The Encoder Self-Attention Mechanism
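The mechanism in this figure, scaled dot-product self-attention, can be written out in a few lines of plain Python. This is a minimal sketch: the toy vectors stand in for token representations, and the learned query/key/value projections a real Transformer applies before attention are omitted (so Q = K = V here).

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention for one head.
    Q, K, V are lists of vectors, one per token."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Each output is a weighted average of the value vectors.
        out.append([sum(w * v[i] for w, v in zip(weights, V)) for i in range(len(V[0]))])
    return out

# Three toy 2-d token vectors; every token attends to every other token.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
for row in attention(X, X, X):
    print([round(x, 3) for x in row])
```

Each output row mixes information from all tokens, which is exactly what lets the encoder build context-dependent representations.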

Figure 4

BERT Language Modeling

Figure 5

BERT as an Emotion Classifier

Figure 6

BERT as an Emotion Classifier
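The classification head the figure depicts is just a linear layer plus softmax on top of BERT's [CLS] vector. The 4-dimensional vector, weights, and labels below are all made up for illustration; in practice both the encoder output and the head's weights come from a fine-tuned model (e.g. via the Hugging Face transformers library), and the [CLS] vector has 768 dimensions in BERT-base.

```python
import math

# Hypothetical [CLS] vector and hand-made head weights (illustration only).
LABELS = ["joy", "anger", "sadness"]
cls_vector = [0.8, -0.2, 0.5, 0.1]
W = [  # one weight row per emotion label
    [ 1.0,  0.2,  0.5, 0.0],
    [-0.5,  1.0, -0.3, 0.2],
    [-0.8, -0.4,  0.1, 1.0],
]
b = [0.1, 0.0, -0.1]

def classify(x):
    """Linear layer + softmax: the standard sequence-classification head."""
    logits = [sum(wi * xi for wi, xi in zip(row, x)) + bi for row, bi in zip(W, b)]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    probs = [e / sum(exps) for e in exps]
    return LABELS[probs.index(max(probs))], probs

label, probs = classify(cls_vector)
print(label)
```

Fine-tuning for emotion classification means learning W and b (and nudging the encoder) so the right label gets the highest probability.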

Figure 7

BERT as an NER Classifier
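For NER, the token-classification head emits one tag per token, conventionally in BIO format (B- begins an entity, I- continues it, O is outside). Turning those per-token tags back into entity spans is a small decoding step, sketched below with made-up example tokens and tags.

```python
def bio_to_spans(tokens, tags):
    """Collect (entity_type, text) spans from per-token BIO tags,
    the typical output format of a token-classification head."""
    spans, current, ctype = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append((ctype, " ".join(current)))
            current, ctype = [tok], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == ctype:
            current.append(tok)
        else:  # "O" or an inconsistent tag: close any open span
            if current:
                spans.append((ctype, " ".join(current)))
            current, ctype = [], None
    if current:
        spans.append((ctype, " ".join(current)))
    return spans

tokens = ["Ada", "Lovelace", "lived", "in", "London"]
tags   = ["B-PER", "I-PER", "O", "O", "B-LOC"]
print(bio_to_spans(tokens, tags))  # -> [('PER', 'Ada Lovelace'), ('LOC', 'London')]
```

A real pipeline also has to merge subword pieces back into whole words before this step, which is why NER post-processing is fiddlier than it first looks.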

Episode 3: Using large language models


Figure 1

Company A, Company B, Company C


Figure 2

Company D, Company E, Company F, Company C


Figure 3

Table of LLMs

Figure 4

LLM Engine Analogy

Figure 5

LLMs vs. BERT

Figure 6

HTML-to-Text Processing

Figure 7

Tokenization

Figure 8

Training Goal of LLMs
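The training goal this figure refers to is next-token prediction: given a prefix, predict the token that follows. A bigram count model is the simplest possible version of that objective; the toy corpus below is invented for illustration, and an LLM replaces the count table with a Transformer conditioned on the whole prefix.

```python
from collections import Counter, defaultdict

# Toy corpus for illustration only.
corpus = "the cat sat on the mat the cat ran".split()

# Count which token follows which: the bigram version of next-token prediction.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    """Most frequent next token given the previous one."""
    return counts[word].most_common(1)[0][0]

print(predict_next("the"))  # -> 'cat' ('cat' follows 'the' twice, 'mat' once)
```

Scaling this same objective from bigram counts to billions of parameters and tokens is, at its core, what LLM pre-training does.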

Figure 9

Tokenization

Figure 10

Tokenization for Conversation Data
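Before a conversation can be tokenized, its messages must be flattened into a single string with role markers. The markers below are illustrative only: each model family defines its own chat template (ChatML, Llama's [INST] format, and so on), so treat this as a sketch of the idea rather than any model's actual format.

```python
# Flatten a list of role/content messages into one string with
# illustrative special markers (not any real model's chat template).
def format_chat(messages):
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}\n<|end|>")
    return "\n".join(parts)

chat = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is NLP?"},
]
print(format_chat(chat))
```

The resulting string is then tokenized like any other text; the role markers are what let the model learn where one speaker's turn ends and the next begins.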