AI-Native Glossary
Essential terminology for the new era of AI-assisted software development.
Agentic AI
AI systems that can autonomously plan, reason, and execute multi-step tasks with minimal human intervention. Agentic AI goes beyond simple prompting to actively pursue goals, use tools, and adapt its approach based on results.
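The plan-act-observe cycle described above can be sketched as a minimal loop. This is an illustration, not a production agent: `llm` is a hypothetical callable that returns either a tool to invoke or a final answer, and the `lookup` tool is invented for the demonstration.

```python
def run_agent(goal, llm, tools, max_steps=5):
    """Minimal agentic loop sketch: the model proposes an action, a tool
    executes it, and the observation feeds back into the next step."""
    observation = goal
    for _ in range(max_steps):
        action, arg = llm(observation)   # model decides what to do next
        if action == "finish":
            return arg                   # goal reached
        observation = tools[action](arg) # adapt based on the tool's result
    return observation

# Toy demonstration: a scripted "model" that calls one tool, then finishes.
script = iter([("lookup", "pi"), ("finish", "3.14159")])
result = run_agent("what is pi?", lambda obs: next(script),
                   tools={"lookup": lambda q: "pi is about 3.14159"})
print(result)  # 3.14159
```

The loop's key property is that each step's output becomes the next step's input, which is what lets the agent adapt rather than follow a fixed script.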
AI-Native Development
A software development paradigm that integrates AI assistance throughout the entire development lifecycle, not as an add-on tool but as a fundamental part of how software is designed, written, tested, and maintained.
AI-Ready Specification
A structured, machine-readable document designed to guide AI coding assistants with the precision and clarity they need to generate correct, consistent code implementations.
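There is no single standard format for such specifications; the fragment below is purely illustrative, showing how explicit inputs, outputs, and constraints leave an AI assistant less room to guess.

```python
# Hypothetical AI-ready spec fragment (the field names and the
# parse_invoice function are invented for illustration).
spec = {
    "function": "parse_invoice",
    "input": {"pdf_path": "str"},
    "output": {"total": "Decimal", "due_date": "date"},
    "constraints": [
        "raise InvoiceError on malformed files",
        "never log customer data",
    ],
}
print(spec["constraints"][0])
```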
Context Persistence
The ability to maintain AI assistant memory and understanding across sessions, conversations, and projects. Context persistence transforms transient AI interactions into continuous collaboration.
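One simple way to implement this is to write assistant memory to durable storage at the end of a session and reload it at the start of the next. A minimal sketch, assuming JSON-on-disk storage (the file name and memory keys are hypothetical):

```python
import json
from pathlib import Path

MEMORY_FILE = Path("assistant_memory.json")  # hypothetical storage location

def save_memory(notes: dict) -> None:
    """Persist assistant memory (project facts, decisions) between sessions."""
    MEMORY_FILE.write_text(json.dumps(notes, indent=2))

def load_memory() -> dict:
    """Reload memory at the start of a new session, or start fresh."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return {}

# One session records a decision; the next session can recall it.
save_memory({"style": "use type hints", "db": "postgres"})
print(load_memory()["style"])
```

Real systems typically layer retrieval and summarization on top of this, but the core idea is the same: state survives the end of a conversation.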
Context Switching
The cognitive and computational cost of shifting between different tasks, files, or conversation topics when working with AI coding assistants. Context switching affects both human productivity and AI performance.
Context Window
The maximum amount of text an AI model can process in a single interaction, measured in tokens. It determines how much code, documentation, or conversation history an AI assistant can consider at once.
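A practical consequence is that conversation history must be trimmed to fit. A minimal sketch, assuming a rough 4-characters-per-token heuristic (real tokenizers vary by model):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def fit_to_window(messages: list[str], window_tokens: int) -> list[str]:
    """Keep the most recent messages that fit inside the context window."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):       # walk newest-first
        cost = estimate_tokens(msg)
        if used + cost > window_tokens:
            break                        # oldest messages fall out first
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

history = ["old design notes " * 50, "recent question", "latest answer"]
print(fit_to_window(history, window_tokens=20))
```

Dropping the oldest messages first is the simplest policy; real assistants often summarize evicted history instead of discarding it.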
MCP (Model Context Protocol)
An open standard that defines how AI models connect to external tools, data sources, and systems. MCP provides a universal way for AI assistants to interact with databases, APIs, and development tools.
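MCP messages use JSON-RPC 2.0 framing, and `tools/call` is the method the specification defines for invoking a server's tool. The tool name and arguments below are hypothetical:

```python
import json

# Sketch of an MCP tool-call request (JSON-RPC 2.0 framing per the MCP
# spec; the query_database tool and its arguments are invented).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_database",           # hypothetical tool on a server
        "arguments": {"sql": "SELECT 1"},
    },
}
print(json.dumps(request, indent=2))
```

Because every server speaks the same message shape, a single client integration can reach any MCP-compatible database, API, or development tool.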
Prompt Engineering
The practice of crafting instructions and context to guide AI models toward desired outputs. In software development, prompt engineering has largely evolved into context engineering: curating and managing the information AI assistants need to generate accurate code.
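One common prompt-engineering pattern is few-shot prompting: showing the model worked input/output pairs before the real task. A minimal sketch (the template format is illustrative):

```python
def make_prompt(task: str, examples: list[tuple[str, str]]) -> str:
    """Few-shot prompt: demonstrate input/output pairs, then pose the task."""
    shots = "\n".join(f"Input: {i}\nOutput: {o}" for i, o in examples)
    return f"{shots}\nInput: {task}\nOutput:"

print(make_prompt("add(2, 3)", [("add(1, 1)", "2"), ("add(4, 5)", "9")]))
```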
RAG (Retrieval-Augmented Generation)
A technique that extends AI capabilities by dynamically retrieving relevant information from external sources, allowing models to access knowledge beyond their training data and current context window.
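The retrieve-then-generate flow can be sketched in a few lines. Real systems rank documents with embedding-based vector search; the word-overlap scoring here is a deliberately naive stand-in, and the documents are invented:

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query (a stand-in
    for real embedding-based similarity search)."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Augment the prompt with retrieved context before generation."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = ["deploys run via GitHub Actions",
        "the cache layer uses Redis",
        "unit tests live in tests/"]
print(build_prompt("how do deploys run?", docs))
```

The key point is that retrieval happens at query time, so the model can draw on knowledge that was never in its training data or its context window to begin with.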
Token Limit
The maximum number of tokens an AI model can accept as input (its context window) or generate in a single response. Understanding token limits helps teams budget prompts, retrieved context, and expected output effectively.
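Budgeting in practice means reserving part of the limit for the model's reply and checking the prompt against what remains. A minimal sketch, again using the rough 4-characters-per-token estimate (real tokenizers differ by model):

```python
def token_budget(prompt: str, limit: int, reserved_for_output: int) -> int:
    """Return how many tokens remain for extra context, using a rough
    ~4-characters-per-token estimate."""
    used = len(prompt) // 4
    remaining = limit - reserved_for_output - used
    if remaining < 0:
        raise ValueError(f"prompt exceeds budget by {-remaining} tokens")
    return remaining

# e.g. a hypothetical 8,000-token model, keeping 1,000 tokens for the reply
print(token_budget("x" * 4000, limit=8000, reserved_for_output=1000))  # 6000
```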