sqlode/query_analyzer/param_inferencer
Values
pub fn extract_int_context_params(
tokens: List(lexer.Token),
engine: model.Engine,
) -> dict.Dict(Int, model.ScalarType)
Extract placeholder type hints from SQL contexts that the spec
requires to be integer: the expressions following LIMIT and
OFFSET. Each placeholder reachable from one of those keywords
(until a terminating keyword or the end of the statement) is
pinned to IntType. Complements extract_type_casts so that
callers can write LIMIT sqlode.arg(lim) / OFFSET sqlode.arg(off)
without an explicit cast on engines such as SQLite, which cannot
infer the type of a bare ? (Issue #491).
Implementation note: this works at the token level rather than the
IR, because the IR’s Param.index is 0 for SQLite anonymous ?
placeholders (expr_parser.decode_placeholder deliberately
returns 0 for them). The token walker assigns each placeholder
an occurrence index that matches what placeholder.build
computes, so the resulting dict keys line up with
PlaceholderOccurrence.index consumed by build_params.
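The token walk described above can be sketched roughly as follows. This is a minimal Python illustration of the occurrence-index technique, not sqlode's actual Gleam implementation: the token shape (`(kind, text)` tuples), the keyword sets, and the `"Int"` marker are all assumptions made for the sketch.

```python
# Illustrative sketch of the occurrence-index token walk: placeholders
# that appear after LIMIT/OFFSET (and before a terminating keyword) are
# pinned to an integer type, keyed by their occurrence order in the SQL.

INT_CONTEXT_KEYWORDS = {"LIMIT", "OFFSET"}          # open an integer context
TERMINATING_KEYWORDS = {"WHERE", "GROUP", "HAVING",  # close it again
                        "ORDER", "UNION", ";"}

def extract_int_context_params(tokens):
    """Map each placeholder's occurrence index to "Int" when the
    placeholder sits inside a LIMIT/OFFSET integer context."""
    hints = {}
    occurrence = 0        # counts every placeholder, in source order
    in_int_context = False
    for kind, text in tokens:
        if kind == "keyword":
            upper = text.upper()
            if upper in INT_CONTEXT_KEYWORDS:
                in_int_context = True
            elif upper in TERMINATING_KEYWORDS:
                in_int_context = False
        elif kind == "placeholder":  # a bare ? token, say
            if in_int_context:
                hints[occurrence] = "Int"
            occurrence += 1
    return hints

tokens = [
    ("keyword", "SELECT"), ("ident", "x"),
    ("keyword", "FROM"), ("ident", "t"),
    ("keyword", "WHERE"), ("ident", "y"), ("op", "="), ("placeholder", "?"),
    ("keyword", "LIMIT"), ("placeholder", "?"),
    ("keyword", "OFFSET"), ("placeholder", "?"),
]
# The WHERE placeholder (occurrence 0) is skipped; the LIMIT and OFFSET
# placeholders (occurrences 1 and 2) are pinned.
print(extract_int_context_params(tokens))  # → {1: 'Int', 2: 'Int'}
```

Because indices are assigned by walking the raw token stream, the sketch keys line up with occurrence order even for anonymous placeholders that carry no index of their own, which is the point of doing this at the token level rather than on the IR.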
pub fn extract_type_casts(
ctx: context.AnalyzerContext,
engine: model.Engine,
tokens: List(lexer.Token),
) -> Result(dict.Dict(Int, model.ScalarType), #(Int, String))
pub fn infer_equality_params(
ctx: context.AnalyzerContext,
engine: model.Engine,
query_name: String,
tokens: List(lexer.Token),
catalog: model.Catalog,
) -> Result(List(#(Int, model.Column)), context.AnalysisError)
pub fn infer_in_params(
ctx: context.AnalyzerContext,
engine: model.Engine,
query_name: String,
tokens: List(lexer.Token),
catalog: model.Catalog,
) -> Result(List(#(Int, model.Column)), context.AnalysisError)
pub fn infer_insert_params(
ctx: context.AnalyzerContext,
engine: model.Engine,
tokens: List(lexer.Token),
catalog: model.Catalog,
) -> List(#(Int, model.Column))
pub fn infer_insert_params_from_ir(
engine: model.Engine,
statement: query_ir.SqlStatement,
catalog: model.Catalog,
) -> List(#(Int, model.Column))
Structured IR variant of infer_insert_params. Consumes the
pre-parsed InsertStatement directly instead of re-scanning
the token list.
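To illustrate how the IR-driven variant differs from re-scanning tokens, here is a rough Python sketch. The simplified `InsertStatement` shape, the dict-based catalog, and the `"?"` placeholder marker are assumptions for the sketch, not sqlode's actual `query_ir` or `model` types.

```python
from dataclasses import dataclass

# Assumed, simplified stand-ins for query_ir.InsertStatement and
# model.Catalog; the real types carry considerably more detail.
@dataclass
class InsertStatement:
    table: str
    columns: list        # column names, in INSERT ... (col, ...) order
    value_exprs: list    # one entry per VALUES expression; "?" = placeholder

def infer_insert_params_from_ir(statement, catalog):
    """Pair each placeholder in VALUES with the catalog column it
    targets, walking the pre-parsed statement instead of raw tokens."""
    table_columns = catalog[statement.table]   # column name -> column type
    pairs = []
    index = 0  # placeholder occurrence index within the statement
    for col_name, expr in zip(statement.columns, statement.value_exprs):
        if expr == "?":
            pairs.append((index, table_columns[col_name]))
            index += 1
    return pairs

catalog = {"users": {"id": "Int", "name": "Text"}}
stmt = InsertStatement("users", ["id", "name"], ["?", "?"])
print(infer_insert_params_from_ir(stmt, catalog))
# → [(0, 'Int'), (1, 'Text')]
```

Since the column list and VALUES expressions are already aligned in the parsed statement, the pairing is a single zip rather than a stateful keyword scan, which is why the IR variant can skip the token walk entirely.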