Adapters
Import semantic models from Cube, MetricFlow, LookML, Hex, Rill, Superset, Omni, and BSL into Sidemantic
Sidemantic can import semantic models from other popular semantic layer formats, letting you use your existing metric definitions with Sidemantic's query engine and features.
Supported Formats
| Format | Import | Notes |
|---|---|---|
| Sidemantic (native) | ✅ | Full feature support |
| Cube | ✅ | No native segments |
| MetricFlow (dbt) | ✅ | No native segments or hierarchies |
| LookML (Looker) | ✅ | Liquid templating (not Jinja) |
| Hex | ✅ | No segments or cross-model derived metrics |
| Rill | ✅ | No relationships, segments, or cross-model metrics; single-model only |
| Superset (Apache) | ✅ | No relationships in datasets |
| Omni | ✅ | Relationships in separate model file |
| BSL (Boring Semantic Layer) | ✅ | Ibis-style expressions; supports roundtrip export |
Feature Compatibility
This table shows which Sidemantic features are supported when importing from other formats:
| Feature | Sidemantic | Cube | MetricFlow | LookML | Hex | Rill | Superset | Omni | BSL | Notes |
|---|---|---|---|---|---|---|---|---|---|---|
| Models | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support models/tables |
| Dimensions | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support dimensions |
| Simple Metrics | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support sum, count, avg, min, max |
| Time Dimensions | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support time dimensions with granularity |
| Relationships | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | Rill/Superset: single-model only; Omni: in model file |
| Derived Metrics | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support calculated metrics |
| Metric Filters | ✅ | ✅ | ❌ | ✅ | ✅ | ⚠️ | ❌ | ✅ | ❌ | Rill has basic support; Superset/BSL lack filters |
| Ratio Metrics | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ | Rill/Superset/BSL don't have native ratio metric type |
| Segments | ✅ | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | Only Cube and LookML have native segment support |
| Cumulative Metrics | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Cube has rolling_window; MetricFlow has cumulative; others lack native support |
| Time Comparison | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Only MetricFlow has native time comparison metrics |
| Jinja Templates | ✅ | ✅ | ✅ | ⚠️ | ✅ | ✅ | ✅ | ✅ | ❌ | LookML uses Liquid templating; BSL uses Ibis |
| Hierarchies | ✅ | ⚠️ | ❌ | ⚠️ | ❌ | ❌ | ❌ | ⚠️ | ❌ | Cube/LookML/Omni: via drill_fields |
| Inheritance | ✅ | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | Only LookML has native extends support |
| Metadata Fields | ✅ | ⚠️ | ⚠️ | ⚠️ | ⚠️ | ⚠️ | ✅ | ✅ | ✅ | Label and description support varies by format |
| Parameters | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Sidemantic-only feature |
| Ungrouped Queries | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Sidemantic-only feature |
Legend:
- ✅ Full support - feature fully supported on import
- ⚠️ Partial support - feature works with limitations
- ❌ Not supported - feature not available in source format
Importing into Sidemantic
Quick Start: Auto-Discovery
The easiest way to load semantic models from any format:
from sidemantic import SemanticLayer, load_from_directory
# Point at a directory with mixed formats
layer = SemanticLayer(connection="duckdb:///data.db")
load_from_directory(layer, "semantic_models/")
# That's it! Automatically:
# - Discovers all .lkml, .yml, .yaml files
# - Detects format (Cube, Hex, LookML, MetricFlow, etc.)
# - Parses with the right adapter
# - Infers relationships from foreign key naming
# - Builds the join graph
How Relationship Inference Works
load_from_directory() automatically infers relationships based on foreign key naming conventions:
- orders.customer_id → customers.id (many-to-one)
- line_items.order_id → orders.id (many-to-one)
- products.category_id → categories.id (many-to-one)
It tries both singular and plural forms, so customer_id will match both customer and customers tables.
Reverse relationships (one-to-many) are automatically added to the target model.
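The rule is simple enough to sketch. The following is an illustrative reimplementation of the convention described above, not Sidemantic's actual code; the function name is hypothetical:

def infer_fk_target(column, table_names):
    """Illustrative only: map a column like 'customer_id' to a target table."""
    if not column.endswith("_id"):
        return None
    base = column[: -len("_id")]            # customer_id -> customer
    for candidate in (base, base + "s"):    # naive singular/plural check;
        if candidate in table_names:        # the real inference also handles
            return candidate                # plural forms like "categories"
    return None

print(infer_fk_target("customer_id", {"customers", "orders"}))  # customers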
Manual Adapter Usage
For more control over the import process, you can use adapters directly:
From Cube
Read Cube.js semantic models into Sidemantic:
from sidemantic.adapters.cube import CubeAdapter
# Import from Cube YAML
adapter = CubeAdapter()
graph = adapter.parse("cube/schema/Orders.yml")
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT revenue FROM orders")
From MetricFlow
Read dbt MetricFlow models into Sidemantic:
from sidemantic.adapters.metricflow import MetricFlowAdapter
# Import from MetricFlow YAML
adapter = MetricFlowAdapter()
graph = adapter.parse("models/metrics/") # Directory of YAML files
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT revenue FROM orders")
From LookML
Read Looker LookML views into Sidemantic:
from sidemantic.adapters.lookml import LookMLAdapter
# Import from LookML
adapter = LookMLAdapter()
graph = adapter.parse("views/orders.lkml") # Single file or directory
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT revenue FROM orders")
From Hex
Read Hex semantic models into Sidemantic:
from sidemantic.adapters.hex import HexAdapter
# Import from Hex YAML
adapter = HexAdapter()
graph = adapter.parse("hex/models/") # Directory of YAML files
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT revenue FROM orders")
From Rill
Read Rill metrics views into Sidemantic:
from sidemantic.adapters.rill import RillAdapter
# Import from Rill YAML
adapter = RillAdapter()
graph = adapter.parse("rill/metrics/") # Directory of YAML files
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
sql = layer.compile(metrics=["orders.revenue"])
From Superset
Read Apache Superset datasets into Sidemantic:
from sidemantic.adapters.superset import SupersetAdapter
# Import from Superset YAML
adapter = SupersetAdapter()
graph = adapter.parse("superset/datasets/") # Directory of YAML files
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT total_revenue FROM orders")
From Omni
Read Omni Analytics views into Sidemantic:
from sidemantic.adapters.omni import OmniAdapter
# Import from Omni YAML views
adapter = OmniAdapter()
graph = adapter.parse("omni/") # Directory with views/ subdirectory and model.yaml
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT total_revenue FROM orders")
From BSL
Read Boring Semantic Layer models into Sidemantic:
from sidemantic.adapters.bsl import BSLAdapter
# Import from BSL YAML
adapter = BSLAdapter()
graph = adapter.parse("bsl/models/") # Directory of YAML files
# Query with Sidemantic
from sidemantic import SemanticLayer
layer = SemanticLayer(graph=graph)
result = layer.sql("SELECT revenue FROM orders")
# Export back to BSL (roundtrip supported)
adapter.export(graph, "output/bsl_models.yml")
Import Mapping
These sections describe how each format's concepts map to Sidemantic when importing.
Cube
- cubes → models
- dimensions → dimensions
- measures → metrics
- joins → relationships (inferred from join definitions)
- ${CUBE} placeholder → {model} placeholder
- segments → segments (native support)
- Calculated measures (type=number) → derived metrics
- rolling_window → cumulative metrics
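For example, a minimal cube like the one below parses into a single Sidemantic model with one dimension and one metric. This is a sketch: the file name is arbitrary, and the attribute access mirrors the validation example later on this page.

from pathlib import Path
from sidemantic.adapters.cube import CubeAdapter

# A small Cube YAML definition written to disk for the adapter to read
Path("orders_cube.yml").write_text("""
cubes:
  - name: orders
    sql_table: public.orders
    dimensions:
      - name: status
        sql: status
        type: string
    measures:
      - name: revenue
        sql: amount
        type: sum
""")

graph = CubeAdapter().parse("orders_cube.yml")
orders = graph.models["orders"]
print(len(orders.dimensions), len(orders.metrics))  # 1 dimension, 1 metric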
MetricFlow
- semantic_models → models
- entities → inferred relationships
- dimensions → dimensions
- measures → model-level metrics
- metrics (graph-level) → graph-level metrics
- Segments/hierarchies from meta field → preserved
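For example, a dbt semantic model plus metric along these lines maps to one Sidemantic model and one graph-level metric. This is a sketch following dbt's semantic model spec; adjust paths and fields to your project.

from pathlib import Path
from sidemantic.adapters.metricflow import MetricFlowAdapter

# Write a small dbt/MetricFlow YAML file into a directory for the adapter
Path("mf_models").mkdir(exist_ok=True)
Path("mf_models/orders.yml").write_text("""
semantic_models:
  - name: orders
    model: ref('orders')
    defaults:
      agg_time_dimension: order_date
    entities:
      - name: order
        type: primary
        expr: order_id
      - name: customer
        type: foreign
        expr: customer_id
    dimensions:
      - name: order_date
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: revenue
        agg: sum
        expr: amount

metrics:
  - name: revenue
    label: Revenue
    type: simple
    type_params:
      measure: revenue
""")

graph = MetricFlowAdapter().parse("mf_models/")
print(graph.models["orders"])   # model with dimensions and metrics
print(graph.metrics)            # graph-level metrics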
LookML
- views → models
- explores → relationships (parsed from join definitions)
- dimensions → dimensions
- dimension_group → multiple time dimensions (one per timeframe)
- measures → metrics
- filters (view-level) → segments
- derived_table → model with SQL
- ${TABLE} placeholder → {model} placeholder
- Measure filters parsed from filters__all
- Foreign keys extracted from sql_on in explore joins
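For example, a view like the following (a sketch in standard LookML syntax) becomes one model; the created dimension_group expands into one time dimension per timeframe.

from pathlib import Path
from sidemantic.adapters.lookml import LookMLAdapter

# A small LookML view written to disk for the adapter to read
Path("orders.lkml").write_text("""
view: orders {
  sql_table_name: public.orders ;;

  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  dimension_group: created {
    type: time
    timeframes: [date, week, month]
    sql: ${TABLE}.created_at ;;
  }

  measure: revenue {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
""")

graph = LookMLAdapter().parse("orders.lkml")
print(graph.models["orders"].dimensions)  # status plus one time dimension per timeframe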
Hex
- Model id and base_sql_table/base_sql_query → models
- dimensions with expr_sql or expr_calc → dimensions
- measures with func/func_sql/func_calc → metrics
- relations with join_sql → relationships
- Measure filters (inline or referenced) → metric filters
- unique: true dimensions → primary key detection
- timestamp_tz/timestamp_naive/date types → time dimensions
Rill
- metrics_view (type) → models
- dimensions with column/expression → dimensions
- measures with expression → metrics
- timeseries column → time dimension
- smallest_time_grain → time dimension granularity
- Derived measures (type: derived) → derived metrics
- Simple aggregation expressions parsed with sqlglot
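For example, a metrics view roughly like the one below maps to a single model with a time dimension and a revenue metric. The exact schema varies between Rill versions, so treat this as a sketch.

from pathlib import Path
from sidemantic.adapters.rill import RillAdapter

# A small Rill metrics view written into a directory for the adapter
Path("rill_metrics").mkdir(exist_ok=True)
Path("rill_metrics/orders.yaml").write_text("""
type: metrics_view
table: orders
timeseries: created_at
smallest_time_grain: day
dimensions:
  - name: status
    column: status
measures:
  - name: revenue
    expression: SUM(amount)
""")

graph = RillAdapter().parse("rill_metrics/")
print(graph.models["orders"].metrics)  # revenue (sum of amount)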
Superset
- table_name → model name
- schema + table_name → model table
- sql → model sql (for virtual datasets)
- columns → dimensions
- metrics → model metrics
- main_dttm_col → time dimension detection
- verbose_name → label field
- is_dttm flag → time dimension type
- metric_type → aggregation mapping (count, sum, avg, etc.)
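For example, a dataset export roughly like the one below (a sketch of Superset's YAML export) becomes one model whose total_revenue metric is rebuilt from metric_type and expression.

from pathlib import Path
from sidemantic.adapters.superset import SupersetAdapter

# A small Superset dataset export written into a directory for the adapter
Path("superset_datasets").mkdir(exist_ok=True)
Path("superset_datasets/orders.yaml").write_text("""
table_name: orders
schema: public
main_dttm_col: created_at
columns:
  - column_name: status
    type: VARCHAR
  - column_name: created_at
    type: TIMESTAMP
    is_dttm: true
metrics:
  - metric_name: total_revenue
    verbose_name: Total Revenue
    metric_type: sum
    expression: SUM(amount)
""")

graph = SupersetAdapter().parse("superset_datasets/")
print(graph.models["orders"].metrics)  # total_revenue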
Omni
- name (view) → model name
- schema + table_name → model table
- sql → model sql (for SQL-based views)
- dimensions → dimensions
- measures with aggregate_type → metrics
- timeframes → time dimension granularity
- label → model description (if no description field)
- ${TABLE} placeholder → {model} placeholder
- ${view.field} references → simplified field references
- Measure filters → metric filters
- relationships (from model.yaml) → model relationships
BSL (Boring Semantic Layer)
- Model keys (top-level YAML) → models
- table → model table
- dimensions → dimensions
- measures → metrics
- joins → relationships
- _.column expression → sql: "column"
- _.column.sum() → agg: sum, sql: "column"
- _.column.mean() → agg: avg, sql: "column"
- _.column.nunique() → agg: count_distinct, sql: "column"
- _.count() → agg: count
- _.column.year() → EXTRACT(YEAR FROM column)
- is_time_dimension: true → type: "time"
- smallest_time_grain: "TIME_GRAIN_DAY" → granularity: "day"
- is_entity: true → primary key detection
- Calc measures (referencing other measures) → derived metrics
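To make the expression mapping concrete, here is a hypothetical helper (not the adapter's actual code) that performs the kind of translation listed above:

import re

# Ibis aggregation method -> Sidemantic aggregation name
AGG_MAP = {"sum": "sum", "mean": "avg", "nunique": "count_distinct"}

def translate_bsl_measure(expr):
    """Translate an Ibis-style BSL measure expression into agg/sql fields."""
    if expr == "_.count()":
        return {"agg": "count"}
    match = re.fullmatch(r"_\.(\w+)\.(\w+)\(\)", expr)
    if match and match.group(2) in AGG_MAP:
        return {"agg": AGG_MAP[match.group(2)], "sql": match.group(1)}
    return {"sql": expr}  # pass anything else through as raw SQL

print(translate_bsl_measure("_.amount.sum()"))       # {'agg': 'sum', 'sql': 'amount'}
print(translate_bsl_measure("_.user_id.nunique()"))  # {'agg': 'count_distinct', 'sql': 'user_id'}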
Validating Imports
Always validate after importing:
from sidemantic import SemanticLayer

# Import (adapter can be any of the adapters above)
graph = adapter.parse("source.yml")

# Verify models loaded
print(f"Loaded {len(graph.models)} models")
for name, model in graph.models.items():
    print(f"  {name}: {len(model.metrics)} metrics, {len(model.dimensions)} dimensions")

# Verify graph-level metrics
print(f"Loaded {len(graph.metrics)} graph-level metrics")

# Test query generation
layer = SemanticLayer(graph=graph)
sql = layer.compile(metrics=["orders.revenue"])
print("Generated SQL:", sql)
Getting Help
If you encounter issues with format conversion:
- Check the compatibility table for known limitations
- Validate your source format is correctly structured
- Test with a simple model first before converting complex definitions
- File an issue at github.com/sidequery/sidemantic with:
  - Source format and file
  - Expected vs actual behavior
  - Generated SQL or error messages