Glossary · AI & Development

Foundation Model

A large AI model pre-trained on broad data that can be adapted to many downstream tasks.

In detail

A foundation model is a large neural network trained on a massive, broad dataset that learns general patterns of language, code, images or other data. Examples include Anthropic's Claude, OpenAI's GPT, Google's Gemini, Meta's Llama and Mistral's models. They are called foundation models because they form the base on which more specific applications are built through prompting, retrieval, fine-tuning or tool use. The choice of foundation model materially affects cost, latency, quality and data residency.
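The prompting route above can be illustrated with a minimal sketch: one base model, several task-specific system prompts. The model identifier, prompts and request shape here are hypothetical placeholders, not any provider's actual API, but the pattern is what most chat-style APIs follow.

```python
# One foundation model adapted to different downstream tasks purely through
# prompting -- the model's weights never change, only the instructions do.
# All names below are illustrative placeholders.

BASE_MODEL = "example-foundation-model-v1"  # hypothetical model identifier

TASK_PROMPTS = {
    "support": "You are a customer-support assistant for an Australian retailer.",
    "contracts": "You are a legal assistant. Summarise contract clauses in plain English.",
    "coding": "You are a senior developer. Review code for bugs and style issues.",
}

def build_request(task: str, user_input: str) -> dict:
    """Assemble a provider-agnostic chat request for the given task."""
    return {
        "model": BASE_MODEL,
        "messages": [
            {"role": "system", "content": TASK_PROMPTS[task]},
            {"role": "user", "content": user_input},
        ],
    }

request = build_request("support", "Where is my order?")
```

Swapping the `task` key changes the application without touching the underlying model, which is why the same foundation model can power support, legal and developer tooling in one business.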

Why it matters for Australian business

For Australian SMBs, the practical question is which foundation model to standardise on for production work. Considerations include cost per million tokens, context window size, structured-output support, data residency (some providers offer Australian regions), enterprise terms (in particular, whether your data is used to train future models) and benchmark performance on your specific task. We help businesses pick the right foundation model rather than the most-marketed one.

How we help with this

Related terms

← All glossary terms

Want to talk through how this applies to your business? Book a free consult.