Chunking

A concept in cognitive psychology and AI in which information is broken down and grouped into chunks to simplify complex data and optimize memory usage.

Chunking, or unitization, is a cognitive process that human beings and AI systems use to organize and manage large amounts of data or information. It works by splitting complex data into smaller, more manageable pieces, known as 'chunks'. In AI, chunking is used to comprehend and process complex datasets, especially in Natural Language Processing (NLP) and Machine Learning (ML); for example, an NLP pipeline may split a long document into passages so that each piece fits within a model's processing limits. Organizing data this way lets models handle information and resources more efficiently, mirroring human cognitive strategies.
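To make the idea concrete, here is a minimal illustrative sketch in Python (not part of the original entry) that splits a long text into overlapping character chunks; the function name chunk_text and the chunk_size and overlap parameters are hypothetical choices for this example, not a standard API.

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into chunks of at most chunk_size characters.

    Consecutive chunks share `overlap` characters of context.
    chunk_text, chunk_size, and overlap are illustrative names
    chosen for this sketch, not a standard library API.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        # Take the next window, stopping at the end of the text.
        chunks.append(text[start:start + chunk_size])
        # Advance by less than a full window so neighboring chunks overlap.
        start += chunk_size - overlap
    return chunks

# Example: a long "document" broken into pieces a model could process one at a time.
document = "Chunking splits complex data into smaller, manageable pieces. " * 20
pieces = chunk_text(document, chunk_size=150, overlap=30)
print(f"{len(pieces)} chunks, first chunk is {len(pieces[0])} characters")
```

The overlap keeps a little shared context between neighboring chunks, a common trade-off between treating chunks independently and preserving continuity across their boundaries.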

The term 'chunking' entered use in the mid-20th century, introduced by cognitive psychologist George A. Miller in his 1956 paper "The Magical Number Seven, Plus or Minus Two." In AI, chunking gained traction in the 1970s and 1980s, particularly in NLP and expert systems, where it was used to represent and manipulate complex structured data.

George A. Miller is the key contributor to the concept of chunking in cognitive psychology. In AI, Herbert A. Simon and Allen Newell developed the concept further through their research on cognitive architectures and symbol systems. Their work has significantly shaped how AI systems process complex information by mimicking human cognitive processes.

Generality: 0.8