Announcing Built On Envoy: Making Envoy Extensions Accessible to Everyone
Built On Envoy is a community-driven marketplace and CLI that makes discovering, running, and sharing Envoy extensions effortless. Try it today!
Today we’re announcing the general availability of Built On Envoy, a community-driven marketplace and CLI tool that makes it effortless to discover, run, and share Envoy Proxy extensions. At launch, there are already ready-to-use extensions for OpenAI Chat Completion Decoding, Azure Content Safety, AWS Bedrock Guardrails, OPA Policy Enforcement, Coraza WAF, and SAML Authentication, with more on the way!
Envoy’s advanced capabilities have remained accessible only to teams with deep, specialized expertise. Many organizations end up building custom extensions behind closed doors, duplicating effort and missing the chance to benefit from shared innovation. Built On Envoy eliminates the friction that has kept Envoy’s powerful extensibility out of reach for most developers: no custom builds, no hunting for extensions, no deep knowledge of C++ or Envoy internals required.
If you’re already sold, head over to the GitHub repo and start exploring. Otherwise, read on for the full story.
Why Envoy, and why now
Envoy Proxy was created at Lyft in 2015 and since then it has become the de facto standard for cloud-native networking. It powers some of the world’s most demanding networking and API gateway deployments, and major infrastructure at organizations like Google, Apple, Netflix, and Airbnb. Its performance characteristics — low latency, high throughput, efficient memory use — have made it the proxy of choice for platform teams at scale.
The most recent evidence of Envoy’s durability is the Envoy AI Gateway, which extends Envoy Gateway with native support for AI inference workloads. The addition of AI features on top of the battle-tested core makes Envoy a perfect choice for a production-ready, modern, AI-capable proxy.
Many extensions ready to use
You can benefit immediately from the extensions that are already available. The list is always growing; a few highlights:
- Static file server: Use Envoy as a simple static file server. Thank you, Rohit from Databricks, for contributing this and other parts of the Envoy ecosystem!
- Azure Content Safety: A common request from organizations running AI workloads in Azure, and a perfect complement to Envoy AI Gateway.
- Coraza WAF: Implements the entire OWASP Core Rule Set and lets you configure ModSecurity-compatible SecLang rules. We’ll dive into this one as an example later.
All these extensions are freely available, and ready to use now.
Tetrate’s role in the Envoy ecosystem
Tetrate has been a major contributor to the Envoy ecosystem from the start. Tetrate engineers are among the top contributors to Envoy itself and have played key roles in building Envoy Gateway and the Envoy AI Gateway. Tetrate has also driven the development of Envoy’s Dynamic Modules — the extension mechanism that Built On Envoy is built on — contributing the core SDK implementations in Rust and Go, and designing the ABI that makes runtime-loadable extensions possible.
This deep investment in Envoy’s extensibility story is what led us to build Built On Envoy. We saw firsthand that while the raw capabilities are there, the developer experience around extending Envoy was lagging far behind.
The gap Built On Envoy fills
Envoy is highly extensible — Lua, ext_authz, ext_proc, WASM, and now Dynamic Modules all provide mechanisms to customize its behavior.
But leveraging those mechanisms is not straightforward. Some require knowledge of Envoy internals. SDK maturity for languages other than C++ has been an issue, and the overall developer UX was simply not there.
Three problems stand out:
- Discovery: There’s no central place to find and share extensions outside the official Envoy repository, and contributing to the official tree requires meeting a high bar and is gated by maintainer availability. In addition, vendor- or product-specific extensions don’t belong in the main repository, and there has been nowhere to share them with the community.
- Distribution: Even when you find an extension, figuring out how to build it, which Envoy version it’s compatible with, and how to deploy it is a challenge.
- Complexity: Creating, testing, and building a new extension demands deep knowledge of Envoy internals, its build system, and the specific extension mechanism being used. Developers spend more time fighting infrastructure than solving their actual problems.
Before diving into the solution, it helps to understand what Envoy offers and why extensibility is complex. Envoy supports several extension mechanisms, each designed for different use cases:
| Mechanism | Maturity | Performance | Evolution Speed | Deployment |
|---|---|---|---|---|
| C++ Filter | Mature | Highest | Slow (in-tree) | Custom build |
| Dynamic Module | Production-ready (v1.37) | Excellent | Fast | Runtime |
| Lua | Mature | Good | Stable | Runtime |
| WASM | Mature | Good | Steady (proxy-wasm) | Runtime |
| ext_proc/authz | Mature | Lower | Stable | Out-of-process |
- Dynamic Modules: Introduced in Envoy v1.34, with a stable ABI since v1.37, Dynamic Modules are near-mature while continuing to evolve rapidly to provide new capabilities. They offer near-native performance without requiring custom Envoy builds. This is where Envoy extensibility is heading — and Built On Envoy makes it accessible today.
- Lua: Mature, simple, and battle-tested. Perfect for lightweight header manipulation and quick transformations. No compilation required.
- WASM: While mature and portable with sandboxed execution, the proxy-wasm specification evolves slowly compared to Dynamic Modules. This means it may lag behind emerging user needs.
The trade-off: Native C++ filters require custom Envoy builds (operational overhead), while out-of-process extensions like ext_proc add network latency. Runtime-loadable extensions — Dynamic Modules and Lua — offer the sweet spot of performance and operational simplicity.
Built On Envoy focuses on these runtime-loadable mechanisms that don’t require custom builds, with strategic emphasis on the rapidly-evolving Dynamic Modules ecosystem and a developer experience centered around Rust and Go.
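For context on what “runtime-loadable” means in practice, here is a minimal sketch of how a Dynamic Module is wired into a raw Envoy config. The module and filter names are illustrative, and exact field shapes have varied between Envoy releases, so treat this as an outline rather than copy-paste config:

```yaml
# Envoy resolves "my_module" to libmy_module.so via the
# ENVOY_DYNAMIC_MODULES_SEARCH_PATH environment variable.
http_filters:
- name: envoy.filters.http.dynamic_modules
  typed_config:
    "@type": type.googleapis.com/envoy.extensions.filters.http.dynamic_modules.v3.DynamicModuleFilter
    dynamic_module_config:
      name: my_module        # illustrative: shared library built from the SDK
    filter_name: my_filter   # illustrative: a filter the module registers
```

The point is that the extension is just a shared library loaded at startup: no Envoy rebuild is involved, which is exactly the wiring Built On Envoy generates for you.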
What is Built On Envoy
Built On Envoy consists of two components:
- The boe CLI: A command-line tool that handles all the complexity behind the scenes: discovering extensions, downloading the right Envoy binary for your platform, generating configurations, and running everything together.
- A Community Marketplace: A curated repository of ready-to-use Envoy extensions contributed by the community.
The boe CLI provides a curated developer experience to extend Envoy in Rust or Go with zero friction.
Getting started: A step-by-step guide
Install the CLI
curl -sL https://builtonenvoy.io/install.sh | sh
Run your first extension
Let’s run the Coraza WAF extension and see it in action!
boe run --extension coraza-waf --config '
{
"directives": [
"Include @recommended.conf",
"SecRuleEngine On",
"SecResponseBodyAccess Off",
"Include @crs-setup.conf",
"Include @owasp_crs/*.conf"
]
}'
The CLI downloads Envoy, fetches the extension, generates the configuration, and starts everything. You can immediately test it:
# Send a normal request that should be accepted by Envoy
curl -v http://localhost:10000/headers
# Send a malicious request with a SQL Injection in the payload and see the request blocked by the WAF
curl -v http://localhost:10000/post -X POST --data "1%27%20ORDER%20BY%203--%2B"
< HTTP/1.1 403 Forbidden
< content-length: 22
< content-type: text/plain
< date: Thu, 12 Feb 2026 10:28:16 GMT
< server: envoy
<
Request blocked by WAF
What happened under the hood
When you run boe run --extension coraza-waf, the boe CLI:
- Reads the extension manifest from the marketplace.
- Detects your OS and architecture (Linux/macOS, x86_64/arm64).
- Downloads the correct Envoy binary if not already cached.
- Fetches the extension from the OCI registry.
- Generates the Envoy configuration based on the extension manifest and any user-provided settings.
- Starts Envoy with a test upstream so you can immediately send traffic.
No YAML to write. No build steps. You go from zero to a running Envoy on your laptop with custom extensions in seconds.
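As a rough illustration of step 2, platform detection usually boils down to normalizing the output of `uname`. This is a sketch of the general technique, not the boe CLI’s actual code:

```shell
# Normalize OS and CPU architecture the way release artifacts are typically named.
OS=$(uname -s | tr '[:upper:]' '[:lower:]')      # e.g. linux or darwin
ARCH=$(uname -m)
case "$ARCH" in
  x86_64|amd64)  ARCH=x86_64 ;;
  aarch64|arm64) ARCH=arm64 ;;
esac
echo "envoy-${OS}-${ARCH}"                        # e.g. envoy-linux-x86_64
```

With the platform string in hand, the CLI can pick the matching Envoy binary and cache it locally for subsequent runs.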
Export the configuration and compiled extensions
Once you’ve validated your setup locally, you can export the configuration for production use:
boe gen-config --output /tmp/boe-export --extension coraza-waf --config '{...}'
You will get the complete Envoy configuration as well as the extensions compiled as Envoy Dynamic Modules, ready to use outside Built On Envoy.
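The exported bundle can then be run with a stock Envoy binary (v1.37 or later for the stable Dynamic Modules ABI). The paths and file names below are assumptions for illustration; check the actual contents of your export directory:

```shell
# Point Envoy's dynamic-module loader at the exported shared libraries.
# NOTE: the "modules" subdirectory and "envoy.yaml" file name are assumptions
# about the export layout, not documented boe output.
export ENVOY_DYNAMIC_MODULES_SEARCH_PATH=/tmp/boe-export/modules

# Start Envoy with the exported configuration, if both are present.
if command -v envoy >/dev/null 2>&1 && [ -f /tmp/boe-export/envoy.yaml ]; then
  envoy -c /tmp/boe-export/envoy.yaml
else
  echo "run the gen-config step first (and install Envoy v1.37+)"
fi
```

Because the export is plain Envoy configuration plus shared libraries, it drops into any existing Envoy deployment workflow without a dependency on the boe CLI at runtime.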
Creating your own extensions
Built On Envoy is designed to make writing extensions in Rust or Go a zero-friction experience.
First, scaffold a new extension project for your language of choice:
boe create --name my-extension --type rust # Scaffolds a Rust project
boe create --name my-extension --type go # Scaffolds a Go project
This generates a ready-to-build project with example code to make it easy to get started. You can immediately run the extension locally to see it in action:
boe run --local ./my-extension
The CLI detects the extension type, builds it, generates the Envoy configuration, and starts everything together. Iterate with immediate feedback!
Join the community
Built On Envoy is ready for you to explore:
- Website: builtonenvoy.io — browse extensions and documentation.
- GitHub: github.com/tetratelabs/built-on-envoy — clone, contribute, and open issues.
- Documentation: builtonenvoy.io/docs — learn how to use the CLI and write extensions.
- Slack: The #built-on-envoy channel in the Tetrate Community Slack is where the community hangs out. Ask questions, share what you’re building, report issues, or just say hello. We’re building the future of Envoy extensibility together.