AI Terms Glossary: Your guide to ChatGPT and all things conversational AI

The Moveworks Team

Forget crypto and blockchain. ChatGPT is the new tech in town, and you need to know the lingo.

Large language models. Neural networks. Prompt engineering. NLP. NLU. NLG. The buzz around this cutting-edge technology is everywhere, and it's crucial to understand the key AI terms and phrases if you want to keep up. But, with a sea of acronyms and jargon, it can be overwhelming.

Don't worry. We've got you covered. Our AI terms glossary is designed to help you navigate the fast-paced world of AI and engage in conversations like a pro. We'll continue to update this list of definitions as more need-to-know terms come up, so bookmark this glossary and keep it in your back pocket. Soon enough, you'll be ready to impress with your newfound knowledge of all things conversational AI.


Artificial intelligence glossary (2023)

Artificial Intelligence (or “AI”)

The simulation of human intelligence in machines that are programmed to think and learn like humans. Example: A self-driving car that can navigate and make decisions on its own using AI technology.

Chatbots

A user-friendly interface that allows the user to ask questions and receive answers. Depending on the backend system that powers the chatbot, it can range from basic pre-written responses to fully conversational AI that automates issue resolution.

ChatGPT

A chat interface built on top of GPT-3.5. GPT-3.5 is a large language model developed by OpenAI that is trained on a massive amount of internet text data and fine-tuned to perform a wide range of natural language tasks. Example: GPT-3.5 has been fine-tuned for tasks such as language translation, text summarization, and question answering.

Conversational AI

A subfield of AI that focuses on developing systems that can understand and generate human-like language and conduct a back-and-forth conversation. Example: A chatbot that can understand and respond to customer inquiries in a natural and human-like manner.

Deep Learning

A subfield of ML that uses neural networks with multiple layers to learn from data. Example: A deep learning model that can recognize objects in an image by processing the image through multiple layers of neural networks.

Fine-tuning

The process of adapting a pre-trained model to a specific task by training it on a smaller dataset. For example, an image classification model trained on pictures of intersections can be fine-tuned to detect when a car runs a red light. At Moveworks, we’ve been fine-tuning LLMs for enterprise support for years.

Discriminative Models

Models that classify a data example and predict a label. For example, a model that identifies whether a picture shows a dog or a cat.

Generative Models

Models that generate new data by discovering patterns in data inputs or training data. For example, creating an original short story based on analyzing existing, published short stories. 

Generative Pre-trained Transformer (or “GPT”)

A type of deep learning model trained on a large dataset to generate human-like text. It is the underlying architecture of ChatGPT.

GPT-3

GPT-3 is the third version of the GPT-n series of models. It has 175 billion parameters: tunable weights the model uses to make predictions. ChatGPT uses GPT-3.5, a further iteration of this model.

Large Language Model (or “LLM”)

A type of deep learning model trained on a large dataset to perform natural language understanding and generation tasks. There are many famous LLMs like BERT, PaLM, GPT-2, GPT-3, and the groundbreaking GPT-3.5. All of these models vary in size (the number of parameters that can be tuned), in the breadth of tasks they handle (coding, chat, scientific, etc.), and in the data they're trained on.

Machine Learning (or “ML”)

A subfield of AI that involves the development of algorithms and statistical models that enable machines to improve their performance with experience. Example: A machine learning algorithm that can predict which customers are most likely to churn based on their past behavior.

N-Shot Learning

Zero-, single-, and few-shot learning are variations of the same concept: providing a model with little or no training data to classify new data and guide predictions. A “shot” represents a single training example. Fun fact: within a GPT prompt, you can provide “N” examples to improve the accuracy of the response.
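A few-shot prompt can be sketched in plain Python. The sentiment-labeling task and example pairs below are invented purely for illustration; each labeled pair is one "shot" that guides the model before it sees the real input.

```python
# Each (text, label) pair is one "shot" — a worked example for the model.
shots = [
    ("The movie was fantastic!", "positive"),
    ("I want a refund.", "negative"),
]

def build_prompt(shots, query):
    """Assemble the example pairs plus the new input into one prompt string."""
    lines = [f"Text: {text}\nSentiment: {label}" for text, label in shots]
    lines.append(f"Text: {query}\nSentiment:")  # the model completes this line
    return "\n\n".join(lines)

prompt = build_prompt(shots, "Best support experience ever.")
print(prompt)
```

With zero shots, the list of examples is simply empty and the model must rely on its pre-training alone.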

Natural Language Generation (or “NLG”)

A subfield of AI that produces natural written or spoken language. 

Natural Language Processing (or “NLP”)

A subfield of AI that involves programming computers to process massive volumes of language data. Focuses on transforming free-form text into a standardized structure. 

Natural Language Understanding (or “NLU”)

A subtopic of NLP that analyzes text to glean semantic meaning from written language. That means understanding context, sentiment, intent, and more.

Neural Network

A machine learning model inspired by the human brain's structure and function that's composed of layers of interconnected nodes or "neurons." Example: A neural network that can recognize handwritten digits with high accuracy.
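A single artificial "neuron" is just a weighted sum of its inputs passed through an activation function, and layers of them form a network. The inputs and weights below are made-up values for illustration only:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus a bias,
    squashed through a sigmoid activation into the range (0, 1)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Two neurons form a hidden layer; a third neuron reads their outputs.
inputs = [0.5, 0.8]
hidden = [neuron(inputs, [0.4, -0.6], 0.1),
          neuron(inputs, [0.9, 0.2], -0.3)]
output = neuron(hidden, [1.0, -1.0], 0.0)
print(output)  # a value between 0 and 1
```

Training a real network means adjusting those weights automatically rather than setting them by hand.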

OpenAI

The organization that developed ChatGPT. More broadly speaking, OpenAI is a research company that aims to develop and promote friendly AI responsibly. Example: OpenAI's GPT-3 model is one of the largest and most powerful language models available for natural language processing tasks.

Optimization

The process of adjusting the parameters of a model to minimize a loss function that measures the difference between the model's predictions and the true values. Example: Optimizing a neural network's parameters using a gradient descent algorithm to minimize the error between the model's predictions and the true values.
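Gradient descent can be sketched in a few lines. Here a single weight w is adjusted to fit toy data generated by y = 2x; the data and learning rate are chosen purely for illustration:

```python
# Toy dataset where the true relationship is y = 2x.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

w = 0.0    # start from a bad guess
lr = 0.05  # learning rate: how large each adjustment step is

for _ in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step against the gradient to reduce the loss

print(round(w, 3))  # converges to 2.0
```

Each step moves w in the direction that most reduces the loss, which is exactly what happens (at enormous scale) when a neural network is trained.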

Overfitting

A problem that occurs when a model is too complex, performing well on the training data but poorly on unseen data. Example: A model that has memorized the training data instead of learning general patterns and thus performs poorly on new data.
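The memorization failure mode can be shown directly. The toy "model" below simply looks up answers from its training set, so it is perfect on training data and useless on anything unseen (the data and rule are invented for illustration):

```python
# Training and test data both follow the rule y = 2x.
train = {1: 2, 2: 4, 3: 6}
test = {4: 8, 5: 10}

def memorizer(x):
    """Overfit 'model': looks the answer up in the training set
    and has no idea what to do with unseen inputs."""
    return train.get(x, 0)

def general_rule(x):
    """Model that learned the underlying pattern instead."""
    return 2 * x

train_hits = sum(memorizer(x) == y for x, y in train.items())
test_hits = sum(memorizer(x) == y for x, y in test.items())
print(train_hits, test_hits)  # 3 correct on train, 0 correct on test
```

The general rule scores perfectly on both sets, which is the behavior training techniques like regularization try to encourage.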

Pre-training

Training a model on a large dataset before fine-tuning it to a specific task. Example: Pre-training a language model like ChatGPT on a large corpus of text data before fine-tuning it for a specific natural language task such as language translation.

Prompt Engineering

Identifying inputs (prompts) that result in meaningful outputs. As of now, prompt engineering is essential for LLMs: because LLMs stack many layers of learned algorithms, they offer limited controllability and few opportunities to override behavior directly. An example of prompt engineering is providing a collection of templates and wizards to direct a copywriting application.
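A template-driven prompt, of the kind a copywriting wizard might assemble behind the scenes, can be sketched like this. The template wording and field names are hypothetical:

```python
# A reusable prompt template; the user only fills in the blanks.
TEMPLATE = (
    "Write a {tone} product description for {product}, "
    "highlighting {feature}, in under {word_limit} words."
)

def render_prompt(**fields):
    """Fill the template's placeholders with user-supplied values."""
    return TEMPLATE.format(**fields)

prompt = render_prompt(tone="friendly", product="a standing desk",
                       feature="its quiet motor", word_limit=50)
print(prompt)
```

The engineering work lies in iterating on the template wording until the model's outputs are reliably useful.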

Reinforcement Learning

A type of machine learning in which a model learns to make decisions by interacting with its environment and receiving feedback through rewards or penalties. ChatGPT uses reinforcement learning from human feedback (RLHF): when tuning GPT-3, human annotators provided examples of the desired model behavior and ranked outputs from the model.

Sequence Modeling

A subfield of NLP that focuses on modeling sequential data such as text, speech, or time series data. Example: A sequence model that can predict the next word in a sentence or generate coherent text.

Supervised Learning

A type of machine learning in which a model is trained on labeled data to make predictions about new, unseen data. Example: A supervised learning algorithm that can classify images of handwritten digits based on labeled training data.
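A minimal supervised learner can be written by hand. This 1-nearest-neighbor classifier and its toy "cat"/"dog" points are invented for illustration: it predicts the label of the closest labeled training example.

```python
# Labeled training data: (point, label) pairs.
train = [((1.0, 1.0), "cat"), ((1.2, 0.8), "cat"),
         ((5.0, 5.0), "dog"), ((4.8, 5.2), "dog")]

def predict(point):
    """Label a new point with the label of its nearest training example."""
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    nearest = min(train, key=lambda ex: sq_dist(ex[0], point))
    return nearest[1]

print(predict((1.1, 0.9)))  # "cat"
print(predict((4.9, 5.1)))  # "dog"
```

The essential ingredient is the labels: the model only works because each training example comes with the correct answer attached.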

Tokenization

The process of breaking text into individual words or subwords so they can be fed into a language model. Example: tokenizing the sentence "I am ChatGPT" into the tokens "I," "am," "Chat," "G," and "PT."
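A toy greedy subword tokenizer reproduces that split. Real models like ChatGPT use learned vocabularies (byte-pair encoding); the hand-picked vocabulary here is purely illustrative:

```python
# Hand-picked toy vocabulary, chosen to mirror the example above.
VOCAB = {"I", "am", "Chat", "G", "PT"}

def tokenize(word):
    """Greedily split a word into the longest vocabulary pieces, left to right."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest match first
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # fall back to a single character
            i += 1
    return tokens

print([t for w in "I am ChatGPT".split() for t in tokenize(w)])
# ['I', 'am', 'Chat', 'G', 'PT']
```

Because "ChatGPT" is not in the vocabulary as a whole, it is split into the subword pieces the vocabulary does contain.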

Transformer

A type of neural network architecture designed to process sequential data, such as text. Example: The transformer architecture is used in models like ChatGPT for natural language processing tasks.

Unsupervised Learning

A type of machine learning in which a model is trained on unlabeled data to find patterns or features in the data. Example: An unsupervised learning algorithm that can cluster similar images of handwritten digits based on their visual features.
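Clustering is a classic unsupervised method. This minimal 1-D k-means sketch (toy points and starting centers invented for illustration) discovers two groups without ever being given labels:

```python
# Unlabeled data: two obvious groups, near 1 and near 8.
points = [1.0, 1.2, 0.9, 8.0, 8.3, 7.9]
centers = [0.0, 10.0]  # arbitrary starting guesses for the cluster centers

for _ in range(10):
    clusters = [[], []]
    for p in points:  # assign each point to its nearest center
        nearest = min(range(2), key=lambda i: abs(p - centers[i]))
        clusters[nearest].append(p)
    # move each center to the mean of its assigned points
    centers = [sum(c) / len(c) for c in clusters]

print([round(c, 2) for c in centers])  # [1.03, 8.07]
```

The algorithm finds the two groups purely from the structure of the data, which is the defining trait of unsupervised learning.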

Ready to go deeper into the world of conversational AI?

We hope this AI terms glossary has provided a comprehensive overview of the key concepts in the field of conversational AI. If you're looking to delve deeper and continue exploring the possibilities of AI in IT, numerous resources are available for further study. From academic research and social posts to blogs and podcasts, the opportunities for growth and discovery in the world of AI are truly endless. 

So why not take the first step and start your journey into the fascinating world of conversational AI today? See how Moveworks can supercharge your workforce with AI — request a demo.

