
BigQuery ML: Predictions & Deployment
How to get answers. Using ML.PREDICT, ML.EXPLAIN_PREDICT, and exporting BQML models to Vertex AI for online serving.
Batch Inference with SQL
Once a model is trained, you need to use it. In BigQuery, this is called Batch Prediction.
1. ML.PREDICT
The ML.PREDICT function applies the model to rows.
SELECT
  predicted_label,
  predicted_label_probs,
  * EXCEPT (predicted_label, predicted_label_probs)  -- input feature columns pass through
FROM
  ML.PREDICT(MODEL `project.dataset.churn_model`,
    (SELECT * FROM `project.dataset.customers_today`))
Key Output Columns:
- predicted_label: The predicted class (e.g., "True").
- predicted_label_probs: An array of per-class probabilities, e.g. [{label: "True", prob: 0.8}, {label: "False", prob: 0.2}].
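Often you want to act on a specific class probability rather than the argmax label. A minimal sketch, unnesting the probs array (the `customer_id` column and the 0.7 threshold are illustrative, not from the original):

```sql
-- Flag customers whose churn probability exceeds 0.7,
-- regardless of which label won the argmax.
SELECT
  customer_id,
  p.prob AS churn_prob
FROM
  ML.PREDICT(MODEL `project.dataset.churn_model`,
    (SELECT * FROM `project.dataset.customers_today`)),
  UNNEST(predicted_label_probs) AS p
WHERE
  p.label = 'True'
  AND p.prob > 0.7
```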
2. ML.EXPLAIN_PREDICT
The exam asks about Explainability. "Why did BQML predict this user will churn?"
Use ML.EXPLAIN_PREDICT.
It returns the top k features that contributed to each prediction (using Shapley values for tree models, or integrated gradients for DNNs).
SELECT
  *
FROM
  ML.EXPLAIN_PREDICT(MODEL `project.dataset.churn_model`,
    (SELECT * FROM `project.dataset.new_data`),
    STRUCT(3 AS top_k_features))  -- show the top 3 reasons
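The attributions come back as an array per row. A sketch that flattens them so each driver is its own row (column names follow BQML's explain output; the model and table names are illustrative):

```sql
-- One row per (prediction, contributing feature) pair,
-- keeping only the single strongest driver per prediction.
SELECT
  predicted_label,
  f.feature,
  f.attribution
FROM
  ML.EXPLAIN_PREDICT(MODEL `project.dataset.churn_model`,
    (SELECT * FROM `project.dataset.new_data`),
    STRUCT(1 AS top_k_features)),
  UNNEST(top_feature_attributions) AS f
```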
3. Exporting for Online Serving
BigQuery inference is Batch (seconds/minutes). What if you need Online inference (milliseconds) for a mobile app?
The Workflow:
- Train in BQML.
- Use bq extract to export the model to Google Cloud Storage (GCS).
- Import the model into the Vertex AI Model Registry.
- Deploy it to an Endpoint.
# Export BQML model to TF SavedModel format
bq extract -m dataset.model gs://my-bucket/model_dir
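The import-and-deploy half of the workflow can be sketched with the gcloud CLI. This is a rough outline, not a definitive recipe: the region, display names, IDs, machine type, and the prebuilt TF serving container tag are all placeholders you would substitute for your project.

```
# Register the exported SavedModel in the Vertex AI Model Registry
# (container image tag and region are illustrative).
gcloud ai models upload \
  --region=us-central1 \
  --display-name=churn-model \
  --artifact-uri=gs://my-bucket/model_dir \
  --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-11:latest

# Create an endpoint, then deploy the registered model to it
# (MODEL_ID and ENDPOINT_ID come from the previous commands' output).
gcloud ai endpoints create --region=us-central1 --display-name=churn-endpoint
gcloud ai endpoints deploy-model ENDPOINT_ID \
  --region=us-central1 \
  --model=MODEL_ID \
  --display-name=churn-deployment \
  --machine-type=n1-standard-2
```

Once deployed, the endpoint answers online predictions in milliseconds over REST/gRPC, which is what the mobile-app scenario needs.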
Compatible Types:
- Only specific BQML models can be exported (Linear, Logistic, K-Means, XGBoost, DNN).
- Limitation: ARIMA (Time Series) usually cannot be exported for online prediction in the same way (it's inherently a batch forecasting tool).
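For time series, the idiomatic path is to keep forecasting inside BigQuery with ML.FORECAST instead of exporting. A sketch, assuming a hypothetical ARIMA_PLUS model named `sales_arima`:

```sql
-- Forecast the next 30 periods with a 90% prediction interval.
SELECT
  forecast_timestamp,
  forecast_value,
  prediction_interval_lower_bound,
  prediction_interval_upper_bound
FROM
  ML.FORECAST(MODEL `project.dataset.sales_arima`,
    STRUCT(30 AS horizon, 0.9 AS confidence_level))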
4. Summary
- ML.PREDICT is for batch scoring inside BigQuery.
- ML.EXPLAIN_PREDICT gives per-row feature attributions.
- bq extract lets you move the "Brain" from the database to the API layer (Vertex AI).