Your business logic,
deterministically executable by AI.
FeatureMesh is a registry of named, typed, composable business logic building blocks
that humans write and AI agents compose. Every dashboard, every production API,
and every LLM query uses the same definitions and gets the same answers.
The problem with semantic layers + LLMs
Semantic layers let you declare metrics in YAML and expose them to LLMs through MCP.
This looks like a solution. It isn't.
YAML is configuration, not a language.
You can declare revenue = sum(amount), but you can't compose it. You can't say "take that, but remove refunds and don't count shipping for Italy" for one query; that has to be defined as a separate metric. Composition stops where the YAML ends, and the LLM fills the gap by writing raw SQL against your tables, which defeats the purpose.
English is ambiguous. SQL is expensive.
When the LLM falls back to writing SQL, you pay the cost twice. First in tokens: the LLM needs the full schema in context to generate one query, and a 100-table warehouse is 50k+ tokens per question. Second in review: a human has to read and verify a 200-line SQL query to trust the answer. Nobody does this in practice.
LLM progress will not fix this.
Even a perfect LLM cannot resolve ambiguity that exists in the request. "Active customers" means seven different things in most companies. Without a canonical definition, the LLM guesses from scattered bits of English in markdown files, and the guess can silently change across prompts, schema changes, and model updates.
Evaluation becomes the bottleneck.
So every team running LLMs on data ends up with the same homework: building eval harnesses, grading responses, debugging regressions. This perpetual game of catch-up does not scale.
What a language gives you
that configuration cannot
FeatureQL is a functional dataflow language.
Every feature is a named, typed, composable pure function.
Composition
Every feature can reference every other feature. customer_ltv can build on order_total, which builds on order_items_price. Deep composition is the unit of reuse, not flat metric declarations.
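To make the chain concrete, here is a hypothetical sketch in the FeatureQL syntax shown further down (the fm.core sources, their columns, and the feature names are invented for illustration):

```sql
CREATE FEATURES IN fm.sales AS SELECT
    -- each feature is a pure function over other features
    order_items_price := array_sum(fm.core.order_data[items][price]),
    order_total       := order_items_price + fm.core.order_data[shipping_cost],
    customer_ltv      := array_sum(fm.core.customer_orders[order_total]),
;
```

Each layer stays independently nameable, reviewable, and reusable.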
Entity types
BIGINT#CUSTOMERS is not the same type as BIGINT#ORDERS. An LLM cannot accidentally join two unrelated tables because the type system rejects it at compile time.
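A minimal sketch of the guarantee (the comparison feature is hypothetical; the input syntax matches the examples below):

```sql
CREATE FEATURES AS SELECT
    customer_id := input(bigint#customers),
    order_id    := input(bigint#orders),
    -- Compile-time error: bigint#customers and bigint#orders are
    -- distinct entity types, even though both are 64-bit integers.
    mixed_up := customer_id = order_id,
;
```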
Deterministic execution
A feature definition expresses a universal business intent and transpiles to executable SQL on your warehouse of choice (DuckDB, Trino, BigQuery). Same feature, same answer, every backend.
Small context
An LLM answering a question doesn't need the warehouse schema or outdated markdown docs. It just needs to identify the right features by reading their formulas and compose them. No bloated context window, no cost explosion.
Human verifiable
Reviewing whether customer_ltv matches the business definition is a one-time, one-place conversation. Reviewing a 200-line generated SQL query every time an agent answers a question is not.
Variants without duplication
Want to test a different discount threshold, a new churn model, or a country-specific rule? VARIANT() swaps one dependency in any feature without touching the original. Run the old and the new side by side, compare the answers, ship the winner. No branching, no copy-paste, no drift.
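As a sketch, reusing the variant() syntax from the serving examples below (the feature names here are hypothetical):

```sql
CREATE FEATURES IN fm.experiments AS SELECT
    -- Same promo logic, scored by a v4 churn model; the original
    -- fm.marketing.show_promocode_offline is untouched.
    show_promocode_v4 := variant(
        fm.marketing.show_promocode_offline
        replacing fm.marketing.churn_risk with fm.ml.churn_risk_v4
    ),
;
```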
Truly universal: analytics and serving
from the same definitions
Most "semantic layers for AI" only cover analytics. FeatureMesh covers both.
Analytics
FeatureQL transpiles to SQL and runs on your warehouse (DuckDB, Trino, BigQuery). Dashboards, BI tools, ad hoc queries, LLM analytical questions.
Serving
The same features run on DataFusion for millisecond real time inference. Connect Redis, JDBC, HTTP sources. Compile to prepared statements. Serve at production latency.
This matters for the business. A lot.
A reactive agent analyzes history and finds what to change next.
A proactive agent tests the change in production: eligibility, pricing, fraud, personalization...
That loop used to take weeks. With FeatureMesh, it's one definition update away.
How it actually works
1. Analytics / Training
Define entities and keys
That's the foundation of semantics.
CREATE FEATURES AS SELECT
    customers := entity(),
    customer_id := input(bigint#customers),
    orders := entity(),
    order_id := input(bigint#orders),
;
Map features to columns
Each column in your analytics database becomes a source feature.
CREATE FEATURES IN fm.core AS SELECT
    customer_data := external_columns(
        customer_id bigint#customers bind to customer_id,
        orders array(row(
            order_id bigint#orders,
            date_create timestamp,
            price decimal(10,2)
        ))
        from table(customer_history)
    ),
    customer_ml_batch := external_columns(
        customer_id bigint#customers bind to customer_id,
        churn_risk float
        from table(customer_ml_batch)
    ),
;
Write transformations
Express your business logic as declarative feature transformations.
CREATE FEATURES IN fm.marketing AS SELECT
    has_recent_orders := date_diff(
        array_max(fm.core.customer_data[orders][date_create]),
        date_ref::timestamp,
        'day'
    ) < 30,
    churn_risk := fm.core.customer_ml_batch[churn_risk],
    lifetime_value := array_sum(fm.core.customer_data[orders][price]),
    show_promocode_offline := not has_recent_orders
        and lifetime_value > 1000.00
        and churn_risk > 0.8e0,
;
Compute features in batch
Apply the transformation to any set of inputs, mixed freely with plain SQL.
/* SQL */
SELECT show_promocode_offline, COUNT(1) AS num_customers
FROM FEATUREQL(
    SELECT
        customer_id := bind_sql(SELECT customer_id FROM customers),
        show_promocode_offline
)
GROUP BY show_promocode_offline
2. Real-time / Serving
Define online sources
Connect your databases, APIs, or services. Each becomes a source feature.
CREATE FEATURES IN fm.online AS SELECT
    customer_source := source_jdbc(
        'postgres://user:secure@acme.eu-west-1.rds.amazonaws.com:5432/orders',
        array[customer_history]
    ),
    ml_endpoint := source_http(
        'https://ml.acme.com/churn-v3/predict' with (method='POST')
    ),
;
Re-use features
Apply features defined for analytics to your online sources.
CREATE FEATURES IN fm.online AS SELECT
    customer_data := external_columns(
        customer_id bigint#customers bind to customer_id,
        orders array(row(
            order_id bigint#orders,
            date_create timestamp,
            price decimal(10,2)
        ))
        from view(fm.online.customer_source[customer_history])
    ),
    has_recent_orders := variant(
        fm.marketing.has_recent_orders
        replacing fm.core.customer_data with customer_data
    ),
    lifetime_value := variant(
        fm.marketing.lifetime_value
        replacing fm.core.customer_data with customer_data
    ),
    ml_data := external_http(
        from fm.online.ml_endpoint with (
            query_params=row(has_recent_orders, lifetime_value)
        )
        as row(churn_risk float)
    ),
    churn_risk := ml_data[churn_risk],
;

CREATE FEATURE fm.marketing.show_promocode_online AS SELECT
    variant(
        fm.marketing.show_promocode_offline
        replacing fm.core.customer_data, fm.marketing.churn_risk
        with fm.online.customer_data, fm.online.churn_risk
    )
;
Compile as prepared statement
For low-latency serving.
CREATE FEATURE fm.marketing.show_promocode_online_ps AS
    prepared_statement(
        fm.marketing.show_promocode_online using customer_id
    )
;
Integrate anywhere
Evaluation of the prepared statement is just an API call away.
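A minimal client sketch in Python (the base URL is a placeholder; only the /api/evaluate path and request body come from the example below):

```python
import json
from urllib import request

# Hypothetical deployment URL; replace with your FeatureMesh host.
BASE_URL = "https://featuremesh.example.com"

def build_evaluate_request(feature_id, inputs):
    """Build the POST request for a prepared-statement evaluation."""
    body = json.dumps({"id": feature_id, "inputs": inputs}).encode("utf-8")
    return request.Request(
        BASE_URL + "/api/evaluate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_evaluate_request(
    "fm.marketing.show_promocode_online_ps", [["5423567855"]]
)
# request.urlopen(req) would send it; the response shape depends on your deployment.
```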
POST /api/evaluate
Content-Type: application/json
{
"id": "fm.marketing.show_promocode_online_ps",
"inputs": [
["5423567855"]
]
}

Who this is for
FeatureMesh is designed for teams where:
Business logic is high value and reused across systems.
Experimentation is a key part of the business.
AI agents are in production or about to be, and unstable behavior is unacceptable.