
GCP BYOC Setup Guide

Deploy OptimaFlo's BYOC (Bring Your Own Cloud) agent to your Google Cloud Platform project. Your data never leaves your infrastructure.

Quick Start Overview

1. Enable APIs: enable the required GCP APIs in your project.
2. Grant Permissions: configure IAM roles for the platform service account.
3. Deploy via Dashboard: use the OptimaFlo BYOC dashboard to deploy the agent.

Step 1: Enable Required GCP APIs

The BYOC agent requires several GCP APIs to be enabled in your project. These APIs provide the foundation for Cloud Run deployment, networking, data services, and monitoring.

| API | Purpose | Status |
| --- | --- | --- |
| compute.googleapis.com (Compute Engine API) | VPC networks, firewall rules, networking | Required |
| run.googleapis.com (Cloud Run Admin API) | Deploy and manage Cloud Run services | Required |
| vpcaccess.googleapis.com (Serverless VPC Access API) | VPC connectors for private networking | Required |
| iam.googleapis.com (IAM API) | Service accounts and IAM bindings | Required |
| secretmanager.googleapis.com (Secret Manager API) | Store agent tokens securely | Required |
| cloudresourcemanager.googleapis.com (Cloud Resource Manager API) | Project-level IAM policies | Required |
| storage.googleapis.com (Cloud Storage API) | GCS bucket operations | Required |
| bigquery.googleapis.com (BigQuery API) | BigQuery dataset and table operations | Required |
| composer.googleapis.com (Cloud Composer API) | Managed Apache Airflow for pipeline orchestration | Required |
| container.googleapis.com (Kubernetes Engine API) | GKE clusters (Composer dependency) | Required |
| sqladmin.googleapis.com (Cloud SQL Admin API) | Composer metadata database | Required |
| pubsub.googleapis.com (Pub/Sub API) | Composer internal messaging | Required |
| logging.googleapis.com (Cloud Logging API) | Application logging | Required |
| monitoring.googleapis.com (Cloud Monitoring API) | Metrics and alerting | Required |
| cloudtrace.googleapis.com (Cloud Trace API) | Distributed tracing | Required |
| iamcredentials.googleapis.com (IAM Service Account Credentials API) | Service account impersonation tokens | Required |
| artifactregistry.googleapis.com (Artifact Registry API) | Container image storage | Required |
| cloudbuild.googleapis.com (Cloud Build API) | Infrastructure provisioning builds | Required |
| servicenetworking.googleapis.com (Service Networking API) | VPC peering for Cloud SQL | Required |
| sql-component.googleapis.com (Cloud SQL Component API) | Cloud SQL internal components | Required |
| bigqueryconnection.googleapis.com (BigQuery Connection API) | BigLake connections for Iceberg | Required |
| bigquerystorage.googleapis.com (BigQuery Storage API) | High-throughput BigQuery reads/writes | Required |
| dataproc.googleapis.com (Dataproc API) | Spark clusters for large-scale processing (>10TB) | Required |

Enable All APIs at Once from the Command Line

bash
gcloud services enable \
  compute.googleapis.com \
  run.googleapis.com \
  vpcaccess.googleapis.com \
  servicenetworking.googleapis.com \
  iam.googleapis.com \
  iamcredentials.googleapis.com \
  secretmanager.googleapis.com \
  cloudresourcemanager.googleapis.com \
  storage.googleapis.com \
  bigquery.googleapis.com \
  bigqueryconnection.googleapis.com \
  bigquerystorage.googleapis.com \
  composer.googleapis.com \
  container.googleapis.com \
  sqladmin.googleapis.com \
  sql-component.googleapis.com \
  pubsub.googleapis.com \
  artifactregistry.googleapis.com \
  cloudbuild.googleapis.com \
  dataproc.googleapis.com \
  logging.googleapis.com \
  monitoring.googleapis.com \
  cloudtrace.googleapis.com \
  --project=YOUR_PROJECT_ID

Note: APIs take 30-60 seconds to propagate after enabling. Wait before starting the BYOC deployment.
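As a quick sanity check after the propagation window, you can confirm that a given API is active. This is a minimal sketch, assuming gcloud is already authenticated against your project; swap in any of the required API names:

```shell
# Lists the service only if it is enabled; empty output means it is not
# active yet (replace YOUR_PROJECT_ID with your project ID).
gcloud services list --enabled \
  --project=YOUR_PROJECT_ID \
  --filter="config.name=run.googleapis.com"
```

Re-run the command for any API the deployment later reports as missing.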

Step 2: Configure IAM Permissions

The OptimaFlo platform service account needs access to browse, preview, and write processed data (Bronze/Silver/Gold layers) to your BigQuery datasets and GCS buckets. Permissions follow the principle of least privilege.

| IAM Role | Scope | Purpose | Key Permissions |
| --- | --- | --- | --- |
| roles/bigquery.dataEditor | Project | Read and write datasets, tables, and table data (Bronze/Silver/Gold layers) | bigquery.datasets.get, bigquery.tables.get, bigquery.tables.list, bigquery.tables.getData, bigquery.tables.create, bigquery.tables.updateData |
| roles/bigquery.jobUser | Project | Execute queries for data preview and transformation | bigquery.jobs.create |
| roles/storage.objectViewer | Bucket | Read files and list objects in your data buckets | storage.objects.get, storage.objects.list |
| roles/run.invoker | Service | Invoke BYOC Cloud Run services (Polaris, Agent API) | run.routes.invoke |
| roles/iam.serviceAccountTokenCreator | Service Account | Generate access tokens for impersonation (on Agent SA) | iam.serviceAccounts.getAccessToken, iam.serviceAccounts.signJwt |
| roles/composer.worker | Project | Execute Airflow commands via Cloud Composer API (Agent SA) | composer.environments.executeAirflowCommand, composer.environments.get |

Grant Permissions via gcloud CLI
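The roles in the table above can be granted with gcloud. This is a minimal sketch: PLATFORM_SA_EMAIL and AGENT_SA_EMAIL are placeholders for the service account emails shown in the OptimaFlo dashboard, and YOUR_DATA_BUCKET is your own bucket. The roles/run.invoker binding applies to the Cloud Run services that exist only after deployment; the dashboard's automatic IAM configuration covers that step.

```shell
# Placeholder values: replace with your project ID and the service
# account emails provided by the OptimaFlo BYOC dashboard.
PROJECT_ID="YOUR_PROJECT_ID"
PLATFORM_SA="PLATFORM_SA_EMAIL"

# Project-scoped roles for the platform service account
for ROLE in roles/bigquery.dataEditor roles/bigquery.jobUser roles/composer.worker; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${PLATFORM_SA}" \
    --role="$ROLE"
done

# Bucket-scoped: read-only access to your data bucket
gcloud storage buckets add-iam-policy-binding "gs://YOUR_DATA_BUCKET" \
  --member="serviceAccount:${PLATFORM_SA}" \
  --role="roles/storage.objectViewer"

# Service-account-scoped: allow token creation on the Agent SA
gcloud iam service-accounts add-iam-policy-binding "AGENT_SA_EMAIL" \
  --member="serviceAccount:${PLATFORM_SA}" \
  --role="roles/iam.serviceAccountTokenCreator" \
  --project="$PROJECT_ID"
```

Scoping storage.objectViewer to the bucket rather than the project keeps access limited to the data the platform actually needs.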

Optional: Dataproc for Spark Workloads

If you process datasets larger than 10TB, enable Dataproc for distributed Spark processing. The engine is selected automatically by size: datasets under 100GB run on DuckDB, and datasets between 100GB and 10TB run on BigQuery.

| API | Purpose | Status |
| --- | --- | --- |
| dataproc.googleapis.com (Cloud Dataproc API) | Create clusters, submit Spark/PySpark jobs | Required |
| dataproc-control.googleapis.com (Cloud Dataproc Control API) | Internal control plane (auto-enabled) | Optional |
| dataprocrm.googleapis.com (Dataproc Resource Manager API) | Advanced: Dataproc on GKE | Optional |
| metastore.googleapis.com (Dataproc Metastore API) | Managed Hive Metastore for persistent catalog | Optional |
bash
# Enable Dataproc API (only if needed for >10TB workloads)
gcloud services enable dataproc.googleapis.com --project=YOUR_PROJECT_ID

# Optional: Enable Dataproc Metastore for persistent Hive catalog
gcloud services enable metastore.googleapis.com --project=YOUR_PROJECT_ID

Step 3: Deploy via Dashboard

Once APIs are enabled and permissions are granted, deploy the BYOC agent using the OptimaFlo dashboard. The guided wizard walks you through the deployment process.

BYOC Agent Deployment
The BYOC dashboard provides a guided wizard to deploy your agent in minutes
Automatic infrastructure provisioning
Secure credential management
Real-time deployment progress
Health monitoring built-in
Automatic IAM configuration
No Terraform knowledge required
How It Works
  1. Connect GCP Project — Enter your project ID and authenticate with OAuth
  2. Configure Agent — Select region, network settings, and data sources
  3. Deploy — Click deploy and watch real-time progress as your agent provisions
  4. Verify — The dashboard shows agent health and connectivity status
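Once the dashboard reports the agent as healthy, you can cross-check the Cloud Run service from the CLI. A minimal sketch, assuming the placeholder service name `byoc-agent` and region `us-central1`; use the actual name and region the dashboard assigned:

```shell
# Prints the service URL of the deployed agent (service name and
# region here are hypothetical; take the real ones from the dashboard).
gcloud run services describe byoc-agent \
  --project=YOUR_PROJECT_ID \
  --region=us-central1 \
  --format="value(status.url)"
```

If the service is missing or unreachable, review the deployment logs in the dashboard before retrying.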

Common Issues


© 2026 OptimaFlo. All rights reserved.
