Alec Radford

(9 articles)
Pretrained Model
2013

ML model that has been previously trained on a large dataset and can be fine-tuned or used as is for similar tasks or applications.

Generality: 860

LLM (Large Language Model)
2018

Advanced AI systems trained on extensive datasets to understand, generate, and interpret human language.

Generality: 827

Next Token Prediction
2018

Technique used in language modeling where the model predicts the next token in a sequence given the preceding ones.

Generality: 735
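A minimal sketch of the idea: estimate which token most often follows a given one and predict it. The tiny corpus, whitespace tokenization, and bigram counting are illustrative assumptions; real language models learn these conditional probabilities with neural networks over far longer contexts.

```python
from collections import Counter, defaultdict

# Toy next-token predictor: estimates "most likely next token" from bigram
# counts over a tiny, hand-made corpus (an illustrative assumption).
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1  # count how often `nxt` follows `prev`

def predict_next(token):
    """Return the token most frequently observed after `token`."""
    counts = bigrams[token]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat": it follows "the" twice, more than any other token
```

Repeatedly feeding each prediction back in as the new context is, in essence, how autoregressive models such as GPT generate text.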

GPT (Generative Pre-trained Transformer)
2018

Type of neural network architecture that excels in generating human-like text based on the input it receives.

Generality: 811

DLMs (Deep Language Models)
2018

Advanced ML models designed to understand, generate, and translate human language by leveraging DL techniques.

Generality: 874

CLIP (Contrastive Language–Image Pre-training)
2021

Machine learning model developed by OpenAI that learns visual concepts from natural language supervision, embedding images and text in a shared space so that an image can be matched against arbitrary textual descriptions.

Generality: 399
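The matching step can be sketched as follows: images and captions live in one embedding space, and CLIP ranks captions by their similarity to an image. The hand-made toy vectors below stand in for the outputs of real image and text encoders (an assumption for illustration); only the cosine-similarity ranking is shown.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical joint-space embeddings: one image, three candidate captions.
image_embedding = [0.9, 0.1, 0.2]
captions = {
    "a photo of a dog":    [0.88, 0.12, 0.18],
    "a photo of a car":    [0.10, 0.90, 0.30],
    "a diagram of a cell": [0.20, 0.30, 0.85],
}

# Rank captions by similarity to the image in the shared space.
best = max(captions, key=lambda c: cosine(image_embedding, captions[c]))
print(best)  # the dog caption is closest to the image in this toy space
```

This ranking trick is what enables CLIP's zero-shot classification: class labels are phrased as captions ("a photo of a dog", "a photo of a car", ...) and the highest-similarity caption wins.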

Text-to-Code Model
2021

AI model designed to translate natural language descriptions into executable code snippets, automating parts of software development and assisting developers.

Generality: 665

Foundation Model
2021

Type of large-scale pre-trained model that can be adapted to a wide range of tasks without needing to be trained from scratch each time.

Generality: 835

LVLMs (Large Vision Language Models)
2023

Advanced AI systems designed to integrate and interpret both visual and textual data, enabling more sophisticated understanding and generation based on both modalities.

Generality: 675