What is a BigQuery Scheduled Query?
BigQuery Scheduled Queries are a Google BigQuery feature that runs SQL queries automatically on a schedule you define. They are used for ETL pipelines, data aggregation, report generation, and data transformation, with no external orchestration tools required.
For AR/VR development, the data produced by AR/VR applications is enormous: user interaction data, spatial tracking data, performance metrics, session analytics, and device telemetry. BigQuery Scheduled Queries automate the processing of this raw data into insights that drive improvements to AR/VR experiences.
The main use cases are: User Behavior Analytics (how users interact with AR/VR content), Performance Monitoring (frame rate, latency, crash rate), Spatial Analytics (heatmaps of user gaze and movement), Content Analytics (engagement with each piece of AR/VR content), and A/B Testing (analyzing experiments on new features).
Installing and Configuring Scheduled Queries
How to create a Scheduled Query in BigQuery
# === BigQuery Scheduled Query Setup ===
# 1. Install Google Cloud SDK
# curl https://sdk.cloud.google.com | bash
# gcloud init
# gcloud auth application-default login
# 2. Create Dataset for AR/VR Analytics
bq mk --dataset \
--description "AR/VR Analytics Data" \
--default_table_expiration 7776000 \
my_project:arvr_analytics
# 3. Create Raw Events Table
# (the inline schema string must contain no spaces)
bq mk --table \
  my_project:arvr_analytics.raw_events \
  event_id:STRING,user_id:STRING,session_id:STRING,event_type:STRING,event_data:STRING,device_type:STRING,platform:STRING,timestamp:TIMESTAMP,position_x:FLOAT,position_y:FLOAT,position_z:FLOAT,rotation_x:FLOAT,rotation_y:FLOAT,rotation_z:FLOAT
# 4. Create Scheduled Query via bq CLI
bq query --use_legacy_sql=false \
--schedule="every 1 hours" \
--display_name="AR/VR Hourly Session Aggregation" \
--destination_table="my_project:arvr_analytics.hourly_sessions" \
--replace=true \
'
WITH sessions AS (
  -- raw_events has no session_start/session_end columns,
  -- so derive them from the event timestamps per session
  SELECT
    session_id,
    device_type,
    platform,
    ANY_VALUE(user_id) AS user_id,
    MIN(timestamp) AS session_start,
    MAX(timestamp) AS session_end,
    COUNT(*) AS events,
    COUNTIF(event_type = "crash") AS crashes
  FROM `my_project.arvr_analytics.raw_events`
  WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 2 HOUR)
  GROUP BY session_id, device_type, platform
)
SELECT
  DATE(session_start) AS event_date,
  EXTRACT(HOUR FROM session_start) AS event_hour,
  device_type,
  platform,
  COUNT(DISTINCT session_id) AS total_sessions,
  COUNT(DISTINCT user_id) AS unique_users,
  SUM(events) AS total_events,
  AVG(TIMESTAMP_DIFF(session_end, session_start, SECOND)) AS avg_session_duration_sec,
  SUM(crashes) AS crash_count
FROM sessions
GROUP BY 1, 2, 3, 4
'
# 5. Create Scheduled Query via Terraform
cat > bigquery_scheduled.tf << 'EOF'
# Assumes a google_bigquery_dataset resource named "arvr" is defined elsewhere
resource "google_bigquery_data_transfer_config" "hourly_aggregation" {
  display_name           = "AR/VR Hourly Aggregation"
  data_source_id         = "scheduled_query"
  schedule               = "every 1 hours"
  destination_dataset_id = google_bigquery_dataset.arvr.dataset_id
  location               = "asia-southeast1"

  params = {
    destination_table_name_template = "hourly_sessions_{run_date}"
    write_disposition               = "WRITE_TRUNCATE"
    query                           = file("queries/hourly_aggregation.sql")
  }
}
EOF
echo "BigQuery scheduled queries configured"
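The same scheduled query can also be created programmatically. A minimal sketch of assembling the transfer-config payload in Python; the project, dataset, and table names are the placeholders used above, and the actual API call (commented out) assumes the google-cloud-bigquery-datatransfer client library:

```python
# Sketch: build the scheduled-query config from Python instead of bq/Terraform.
# Names are the placeholders used elsewhere in this article.

def build_transfer_config(dataset: str, query: str) -> dict:
    """Assemble the TransferConfig payload for a scheduled query."""
    return {
        "display_name": "AR/VR Hourly Session Aggregation",
        "data_source_id": "scheduled_query",
        "destination_dataset_id": dataset,
        "schedule": "every 1 hours",
        "params": {
            "query": query,
            "destination_table_name_template": "hourly_sessions",
            "write_disposition": "WRITE_TRUNCATE",
        },
    }

config = build_transfer_config(
    "arvr_analytics",
    "SELECT 1  -- real aggregation SQL goes here",
)
print(config["schedule"])

# Actual creation (requires credentials), roughly:
# from google.cloud import bigquery_datatransfer
# client = bigquery_datatransfer.DataTransferServiceClient()
# client.create_transfer_config(
#     parent="projects/my_project/locations/asia-southeast1",
#     transfer_config=config,
# )
```

Keeping the config as plain data makes it easy to diff against the Terraform version or to template per environment.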
Building a Data Pipeline for AR/VR Analytics
A pipeline for analyzing AR/VR data
-- === AR/VR Analytics Scheduled Queries ===
-- 1. Daily User Engagement Summary
-- Schedule: every day 02:00 UTC
-- ===================================
CREATE OR REPLACE TABLE `arvr_analytics.daily_engagement` AS
WITH yesterday AS (
  -- Users seen yesterday, for D1 retention
  SELECT DISTINCT user_id
  FROM `arvr_analytics.raw_events`
  WHERE DATE(timestamp) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
)
SELECT
  DATE(e.timestamp) AS event_date,
  e.platform,
  e.device_type,
  -- User Metrics
  COUNT(DISTINCT e.user_id) AS daily_active_users,
  COUNT(DISTINCT e.session_id) AS total_sessions,
  COUNT(*) AS total_events,
  -- Session Metrics (raw_events has no session_duration_sec column;
  -- it is assumed to ride along in the event payload)
  AVG(CAST(JSON_EXTRACT_SCALAR(e.event_data, '$.session_duration_sec') AS FLOAT64)) AS avg_session_duration,
  APPROX_QUANTILES(CAST(JSON_EXTRACT_SCALAR(e.event_data, '$.session_duration_sec') AS FLOAT64), 100)[OFFSET(50)] AS median_session_duration,
  APPROX_QUANTILES(CAST(JSON_EXTRACT_SCALAR(e.event_data, '$.session_duration_sec') AS FLOAT64), 100)[OFFSET(95)] AS p95_session_duration,
  -- Interaction Metrics
  COUNTIF(e.event_type = 'object_interact') AS object_interactions,
  COUNTIF(e.event_type = 'menu_open') AS menu_opens,
  COUNTIF(e.event_type = 'teleport') AS teleport_count,
  COUNTIF(e.event_type = 'grab') AS grab_count,
  -- Performance Metrics
  AVG(CAST(JSON_EXTRACT_SCALAR(e.event_data, '$.fps') AS FLOAT64)) AS avg_fps,
  AVG(CAST(JSON_EXTRACT_SCALAR(e.event_data, '$.latency_ms') AS FLOAT64)) AS avg_latency_ms,
  COUNTIF(e.event_type = 'crash') AS crash_count,
  -- Retention: users active today who were also active yesterday
  COUNT(DISTINCT IF(y.user_id IS NOT NULL, e.user_id, NULL)) AS returning_users
FROM `arvr_analytics.raw_events` e
LEFT JOIN yesterday y ON e.user_id = y.user_id
WHERE DATE(e.timestamp) = CURRENT_DATE()
GROUP BY 1, 2, 3;
-- 2. Spatial Heatmap Data
-- Schedule: every 6 hours
-- ===================================
CREATE OR REPLACE TABLE `arvr_analytics.spatial_heatmap` AS
-- Window functions cannot be nested inside aggregates,
-- so compute per-event dwell time in a subquery first
WITH events AS (
  SELECT
    *,
    TIMESTAMP_DIFF(
      LEAD(timestamp) OVER (PARTITION BY session_id ORDER BY timestamp),
      timestamp, SECOND
    ) AS dwell_sec
  FROM `arvr_analytics.raw_events`
  WHERE DATE(timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    AND position_x IS NOT NULL
)
SELECT
  DATE(timestamp) AS event_date,
  ROUND(position_x, 1) AS grid_x,
  ROUND(position_y, 1) AS grid_y,
  ROUND(position_z, 1) AS grid_z,
  COUNT(*) AS visit_count,
  COUNT(DISTINCT user_id) AS unique_visitors,
  AVG(dwell_sec) AS avg_dwell_time_sec,
  -- Gaze direction analysis
  AVG(rotation_x) AS avg_gaze_pitch,
  AVG(rotation_y) AS avg_gaze_yaw,
  COUNTIF(event_type = 'object_interact') AS interactions_in_area
FROM events
GROUP BY 1, 2, 3, 4
HAVING visit_count >= 5;
-- 3. Content Performance Analysis
-- Schedule: every day 03:00 UTC
-- ===================================
CREATE OR REPLACE TABLE `arvr_analytics.content_performance` AS
SELECT
JSON_EXTRACT_SCALAR(event_data, '$.content_id') AS content_id,
JSON_EXTRACT_SCALAR(event_data, '$.content_type') AS content_type,
DATE(timestamp) AS event_date,
COUNT(DISTINCT user_id) AS viewers,
COUNT(DISTINCT session_id) AS view_sessions,
AVG(CAST(JSON_EXTRACT_SCALAR(event_data, '$.view_duration_sec') AS FLOAT64)) AS avg_view_duration,
COUNTIF(event_type = 'content_complete') AS completions,
SAFE_DIVIDE(
COUNTIF(event_type = 'content_complete'),
COUNT(DISTINCT session_id)
) AS completion_rate,
-- Interaction depth
AVG(CAST(JSON_EXTRACT_SCALAR(event_data, '$.interaction_count') AS INT64)) AS avg_interactions,
-- Rating
AVG(CAST(JSON_EXTRACT_SCALAR(event_data, '$.rating') AS FLOAT64)) AS avg_rating
FROM `arvr_analytics.raw_events`
WHERE event_type IN ('content_view', 'content_interact', 'content_complete')
AND DATE(timestamp) = CURRENT_DATE()
GROUP BY 1, 2, 3;
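The completion-rate logic in the content_performance query is easy to sanity-check offline. A small sketch with made-up events; BigQuery's SAFE_DIVIDE returns NULL on a zero denominator, mirrored here as None:

```python
# Mirror the completion_rate computation from the content_performance query
# on a handful of made-up events, to sanity-check the SQL logic offline.

def safe_divide(num: float, den: float):
    """BigQuery SAFE_DIVIDE: None (NULL) instead of a division-by-zero error."""
    return num / den if den else None

events = [
    {"session_id": "s1", "event_type": "content_view"},
    {"session_id": "s1", "event_type": "content_complete"},
    {"session_id": "s2", "event_type": "content_view"},
    {"session_id": "s3", "event_type": "content_view"},
    {"session_id": "s3", "event_type": "content_complete"},
]

completions = sum(1 for e in events if e["event_type"] == "content_complete")
view_sessions = len({e["session_id"] for e in events})
rate = safe_divide(completions, view_sessions)
print(f"completion_rate = {rate:.2f}")  # 2 completions over 3 sessions
```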
Query Optimization for AR/VR Data
Optimizing queries over large AR/VR datasets
-- === BigQuery Optimization for AR/VR ===
-- 1. Partitioning and Clustering
-- ===================================
CREATE TABLE `arvr_analytics.events_optimized`
PARTITION BY DATE(timestamp)
CLUSTER BY device_type, event_type, user_id
AS
SELECT * FROM `arvr_analytics.raw_events`;
-- Benefits:
--   Partition pruning: scan only the dates you need (often 90%+ less data)
--   Clustering: filters on device_type / event_type run faster
--   Cost: fewer bytes scanned = a smaller bill
-- 2. Materialized Views
-- ===================================
CREATE MATERIALIZED VIEW `arvr_analytics.mv_daily_stats`
PARTITION BY event_date
CLUSTER BY platform
AS
SELECT
DATE(timestamp) AS event_date,
platform,
device_type,
  -- COUNT(DISTINCT) is not supported in materialized views;
  -- use APPROX_COUNT_DISTINCT instead
  APPROX_COUNT_DISTINCT(user_id) AS dau,
  APPROX_COUNT_DISTINCT(session_id) AS sessions,
COUNT(*) AS events,
AVG(CAST(JSON_EXTRACT_SCALAR(event_data, '$.fps') AS FLOAT64)) AS avg_fps
FROM `arvr_analytics.events_optimized`
GROUP BY 1, 2, 3;
-- Materialized views refresh automatically (within a staleness window)
-- Queries on mv_daily_stats read pre-computed results, so they are much faster and cheaper
-- 3. Approximate Aggregation Functions
-- ===================================
-- Use APPROX_COUNT_DISTINCT instead of COUNT(DISTINCT)
-- for large datasets (1-2% error, 10x faster)
SELECT
DATE(timestamp) AS event_date,
APPROX_COUNT_DISTINCT(user_id) AS approx_dau,
APPROX_COUNT_DISTINCT(session_id) AS approx_sessions,
APPROX_QUANTILES(
CAST(JSON_EXTRACT_SCALAR(event_data, '$.fps') AS FLOAT64),
100
)[OFFSET(50)] AS median_fps,
HLL_COUNT.INIT(user_id) AS user_hll_sketch
FROM `arvr_analytics.events_optimized`
WHERE DATE(timestamp) = CURRENT_DATE()
GROUP BY 1;
-- 4. Nested and Repeated Fields
-- ===================================
-- Store AR/VR events with nested structures
-- instead of JSON strings for better performance
CREATE TABLE `arvr_analytics.events_nested` (
event_id STRING,
user_id STRING,
session_id STRING,
timestamp TIMESTAMP,
event_type STRING,
device STRUCT<
type STRING,
platform STRING,
model STRING,
os_version STRING
>,
position STRUCT<
x FLOAT64,
y FLOAT64,
z FLOAT64
>,
rotation STRUCT<
pitch FLOAT64,
yaw FLOAT64,
roll FLOAT64
>,
performance STRUCT<
fps FLOAT64,
latency_ms FLOAT64,
memory_mb FLOAT64
>,
  -- Element fields are illustrative; the ARRAY<STRUCT<...>> type was mangled in the original
  interactions ARRAY<STRUCT<object_id STRING, action STRING, duration_sec FLOAT64>>
)
PARTITION BY DATE(timestamp)
CLUSTER BY event_type, user_id;
-- Query nested fields directly (faster than JSON_EXTRACT)
SELECT
device.type,
AVG(performance.fps) AS avg_fps,
AVG(performance.latency_ms) AS avg_latency
FROM `arvr_analytics.events_nested`
WHERE DATE(timestamp) = CURRENT_DATE()
GROUP BY 1;
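The savings from partition pruning can be estimated with simple arithmetic. A back-of-envelope sketch, assuming uniform daily volume and the $6.25/TB on-demand rate used in the cost-monitoring section:

```python
# Back-of-envelope estimate of what partition pruning saves on a
# date-filtered query. Assumes uniform daily data volume.

PRICE_PER_TB = 6.25  # on-demand USD per TB scanned

def scan_cost_usd(table_tb: float, days_retained: int, days_queried: int,
                  partitioned: bool) -> float:
    """Estimated cost of one query filtering to `days_queried` days of data."""
    if partitioned:
        scanned_tb = table_tb * days_queried / days_retained
    else:
        scanned_tb = table_tb  # full scan: every partition is read
    return scanned_tb * PRICE_PER_TB

# A 30 TB table holding 30 days of events; the query touches 1 day:
full = scan_cost_usd(30, 30, 1, partitioned=False)
pruned = scan_cost_usd(30, 30, 1, partitioned=True)
print(f"full scan ${full:.2f} vs pruned ${pruned:.2f}")
```

The same ratio applies to query latency, since both are dominated by bytes read.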
Dashboards and Reporting
Building a dashboard for AR/VR analytics
#!/usr/bin/env python3
# arvr_dashboard.py — AR/VR Analytics Dashboard Data
import json


class ARVRDashboard:
    """Builds dashboard payloads for AR/VR analytics.

    The figures below are static sample data; in production they would be
    read from the scheduled-query output tables in arvr_analytics.
    """

    def generate_overview(self) -> dict:
        return {
            "period": "last_7_days",
            "kpis": {
                "daily_active_users": 12500,
                "avg_session_duration_min": 18.5,
                "total_sessions": 87500,
                "avg_fps": 72.3,
                "crash_rate_pct": 0.8,
                "retention_d1_pct": 45.2,
                "retention_d7_pct": 22.1,
            },
            "trends": {
                "dau_change_pct": 12.5,
                "session_duration_change_pct": 8.3,
                "crash_rate_change_pct": -15.2,
                "fps_change_pct": 5.1,
            },
            "top_content": [
                {"id": "scene_01", "viewers": 8500, "avg_duration_min": 12.3, "rating": 4.5},
                {"id": "scene_02", "viewers": 6200, "avg_duration_min": 15.7, "rating": 4.2},
                {"id": "scene_03", "viewers": 5100, "avg_duration_min": 8.9, "rating": 4.7},
            ],
            "device_breakdown": {
                "meta_quest_3": {"users": 5500, "pct": 44},
                "apple_vision_pro": {"users": 2500, "pct": 20},
                "pico_4": {"users": 1875, "pct": 15},
                "mobile_ar": {"users": 2625, "pct": 21},
            },
        }

    def performance_report(self) -> dict:
        return {
            "fps_distribution": {
                "0-30": 2.5,
                "30-60": 15.3,
                "60-72": 35.2,
                "72-90": 42.1,
                "90+": 4.9,
            },
            "latency_percentiles": {
                "p50": 12,
                "p75": 18,
                "p90": 28,
                "p95": 45,
                "p99": 120,
            },
            "crash_by_device": {
                "meta_quest_3": 0.5,
                "apple_vision_pro": 0.3,
                "pico_4": 1.2,
                "mobile_ar": 1.5,
            },
            "recommendations": [
                "Optimize scene_03 for Pico 4 (high crash rate)",
                "Reduce texture quality on mobile AR devices",
                "Implement LOD system for scenes with >50 objects",
            ],
        }


dashboard = ARVRDashboard()
overview = dashboard.generate_overview()
print("Overview:", json.dumps(overview, indent=2))
perf = dashboard.performance_report()
print("\nPerformance:", json.dumps(perf, indent=2))
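The recommendations in the performance report could be derived rather than hard-coded. A sketch that flags devices whose crash rate exceeds a threshold, using the sample crash_by_device numbers; the 1.0% threshold is an arbitrary choice for illustration:

```python
# Derive crash-rate recommendations from the per-device numbers instead of
# hard-coding them. The 1.0% threshold is an illustrative choice.

def flag_crashy_devices(crash_by_device: dict, threshold_pct: float = 1.0) -> list:
    """Return devices whose crash rate exceeds the threshold, worst first."""
    return sorted(
        (d for d, rate in crash_by_device.items() if rate > threshold_pct),
        key=lambda d: crash_by_device[d],
        reverse=True,
    )

crash_by_device = {
    "meta_quest_3": 0.5,
    "apple_vision_pro": 0.3,
    "pico_4": 1.2,
    "mobile_ar": 1.5,
}
print(flag_crashy_devices(crash_by_device))  # ['mobile_ar', 'pico_4']
```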
Monitoring and Cost Management
Monitoring scheduled queries and managing costs
# === Monitoring & Cost Management ===
# 1. Monitor Scheduled Query Status
# ===================================
# bq ls --transfer_config --transfer_location=asia-southeast1
# Check run history
# bq ls --transfer_run --transfer_location=asia-southeast1 \
# projects/my-project/locations/asia-southeast1/transferConfigs/xxxxx
# 2. Cost Monitoring Query
# ===================================
# Run this weekly to track BigQuery costs:
#
# SELECT
# DATE(creation_time) AS query_date,
# user_email,
# COUNT(*) AS query_count,
# ROUND(SUM(total_bytes_processed) / POW(10,12), 2) AS total_tb_processed,
# ROUND(SUM(total_bytes_processed) / POW(10,12) * 6.25, 2) AS estimated_cost_usd
# FROM `region-asia-southeast1`.INFORMATION_SCHEMA.JOBS
# WHERE DATE(creation_time) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
# GROUP BY 1, 2
# ORDER BY estimated_cost_usd DESC;
# 3. Cost Optimization Tips
# ===================================
# - Use partitioned tables (reduce scan volume)
# - Use clustered tables (further reduce scan)
# - Set table expiration for temporary data
# - Use materialized views for repeated queries
# - Use APPROX functions for large datasets
# - Preview queries before running (check bytes scanned)
# - Use BI Engine for dashboard queries (free 1 GB tier via Looker Studio)
# 4. Alert on Query Failures
# ===================================
# Cloud Monitoring alert policy:
# gcloud alpha monitoring policies create \
# --display-name="BigQuery Scheduled Query Failed" \
# --condition-display-name="Transfer Run Failed" \
# --condition-filter='resource.type="bigquery_dts_config" AND metric.type="bigquery.googleapis.com/transfer/run_count" AND metric.labels.state="FAILED"' \
# --notification-channels="projects/my-project/notificationChannels/xxx"
# 5. Slot Reservation for Predictable Costs
# ===================================
# Flat-rate pricing (instead of on-demand):
# - Legacy flat-rate: 100 slots ≈ $2,000/month (predictable cost)
# - Breaks even vs on-demand around 320 TB scanned/month
# - AR/VR workloads can easily reach 10-100+ TB/month
#
# bq mk --reservation \
# --project_id=my-project \
# --location=asia-southeast1 \
# --slots=100 \
# arvr_reservation
echo "Monitoring and cost management configured"
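The cost query above can be mirrored offline to check its arithmetic. A sketch over made-up job records, using the same conversion as the SQL: bytes divided by 10^12 to get TB, at $6.25 per TB:

```python
# Mirror the INFORMATION_SCHEMA.JOBS cost query on made-up job records,
# using the same conversion: bytes / 1e12 = TB, at $6.25 per TB.
from collections import defaultdict

PRICE_PER_TB = 6.25

def cost_by_user(jobs: list) -> dict:
    """Sum estimated on-demand cost per user_email, as in the SQL above."""
    totals = defaultdict(float)
    for job in jobs:
        totals[job["user_email"]] += (
            job["total_bytes_processed"] / 1e12 * PRICE_PER_TB
        )
    return dict(totals)

jobs = [
    {"user_email": "etl@my-project.iam", "total_bytes_processed": 2_000_000_000_000},
    {"user_email": "etl@my-project.iam", "total_bytes_processed": 1_000_000_000_000},
    {"user_email": "analyst@example.com", "total_bytes_processed": 500_000_000_000},
]
print(cost_by_user(jobs))
```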
FAQ: Frequently Asked Questions
Q: How does BigQuery Scheduled Query differ from Airflow?
A: BigQuery Scheduled Query is a built-in feature that needs no extra infrastructure, which makes it a good fit for simple SQL-based ETL that runs on a schedule. Its limitation is that it can only run SQL queries: there is no branching, no sophisticated error handling, and no dependencies between queries. Airflow is a full orchestration platform that supports complex DAGs, dependencies, branching, retries, and multiple data sources, which suits complex ML pipelines. Start with Scheduled Queries and move to Airflow once you outgrow them.
Q: How large is AR/VR data?
A: It depends on tracking frequency and data richness. Basic events (clicks, sessions) run roughly 1-5 KB/event; spatial tracking (position and rotation at 30 Hz) roughly 50-100 KB/second/user; eye-tracking data roughly 200-500 KB/second/user. For an app with 10,000 DAU and an average 15-minute session, basic events come to about 5-10 GB/day and spatial tracking to about 500 GB-1 TB/day. Partitioning and compression in BigQuery cut the cost substantially.
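The spatial-tracking estimate works out as simple arithmetic. A worked version using the figures from the answer above (10,000 DAU, 15-minute sessions, 50-100 KB/second/user):

```python
# Worked version of the data-volume estimate: DAU x session length x
# tracking rate. Figures are those quoted in the answer above.

def gb_per_day(dau: int, session_min: float, kb_per_sec: float) -> float:
    """Daily tracking volume in GB (decimal units: 1 GB = 1e6 KB)."""
    return dau * session_min * 60 * kb_per_sec / 1e6

low = gb_per_day(10_000, 15, 50)
high = gb_per_day(10_000, 15, 100)
print(f"{low:.0f}-{high:.0f} GB/day")  # in line with the 500 GB-1 TB/day quoted
```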
Q: How much detail should you keep for AR/VR data?
A: Split it into tiers. Raw data: every event at full resolution (30-90 Hz tracking), kept for 30 days. Aggregated data: per-second or per-minute summaries, kept for 90 days. Summary data: daily/hourly rollups, kept for 1+ years. Use BigQuery table expiration to manage this automatically; raw data can additionally be archived to Cloud Storage (Coldline) for compliance.
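The tiered retention scheme can be applied with partition expirations. A sketch of the conversion, plus the client calls (commented out) that would apply it; the table names match the examples in this article, and the real calls assume the google-cloud-bigquery library and credentials:

```python
# Sketch: per-tier retention as BigQuery partition expirations.
# BigQuery expresses partition expiration in milliseconds.

TIER_RETENTION_DAYS = {"raw": 30, "aggregated": 90, "summary": 365}

def partition_expiration_ms(days: int) -> int:
    """Convert a retention period in days to BigQuery's milliseconds."""
    return days * 24 * 60 * 60 * 1000

for tier, days in TIER_RETENTION_DAYS.items():
    print(tier, partition_expiration_ms(days))

# Applying it to the raw table (sketch, requires credentials):
# from google.cloud import bigquery
# client = bigquery.Client()
# table = client.get_table("my_project.arvr_analytics.raw_events")
# table.time_partitioning.expiration_ms = partition_expiration_ms(30)
# client.update_table(table, ["time_partitioning"])
```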
Q: What does BigQuery cost for AR/VR analytics?
A: On-demand pricing is $6.25/TB scanned. For a 10,000-DAU app with 1 TB/day of raw data, if partitioned queries scan only about 1 TB/day (a fraction of total storage), that is roughly $6.25/day, or about $190/month. Storage at $0.02/GB/month (active) on 30 TB of active data adds about $600/month. Flat-rate at 100 slots is $2,000/month and pays off once you scan more than about 320 TB/month. Expect roughly $500-3,000/month overall, depending on data volume and query frequency.
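The monthly estimate above breaks down into two terms. A worked version, using the same unit prices:

```python
# Worked version of the monthly-cost estimate: on-demand scan at $6.25/TB
# plus active storage at $0.02/GB/month (prices quoted in the answer above).

SCAN_USD_PER_TB = 6.25
STORAGE_USD_PER_GB = 0.02

def monthly_cost_usd(scan_tb_per_day: float, active_storage_gb: float) -> dict:
    """Split the monthly bill into scan and storage components."""
    scan = scan_tb_per_day * SCAN_USD_PER_TB * 30
    storage = active_storage_gb * STORAGE_USD_PER_GB
    return {"scan": scan, "storage": storage, "total": scan + storage}

# 1 TB scanned/day, 30 TB (30,000 GB) of active storage:
print(monthly_cost_usd(1, 30_000))
```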
