Google Vertex AI
Google Vertex AI ↗ enables developers to easily build and deploy enterprise ready generative AI experiences.
Below is a quick guide on how to set up your Google Cloud account:

1. Google Cloud Platform (GCP) Account: sign up for a GCP account ↗. New users may be eligible for credits (valid for 90 days).
2. Enable the Vertex AI API: navigate to Enable Vertex AI API ↗ and activate the API for your project.
3. Apply for access to the models you plan to use.
Your AI Gateway endpoint for Google Vertex AI is:

`https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/google-vertex-ai`
When making requests to Google Vertex, you will need:
- AI Gateway account tag
- AI Gateway gateway name
- Google Vertex API key
- Google Vertex Project Name
- Google Vertex Region (for example, us-east4)
- Google Vertex model
Your new base URL will use the data above in this structure: `https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/google-vertex-ai/v1/projects/{project_name}/locations/{region}`.
Then you can append the endpoint you want to hit, for example: `/publishers/google/models/{model}:{generative_ai_rest_resource}`
So your final URL will come together as: `https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/google-vertex-ai/v1/projects/{project_name}/locations/{region}/publishers/google/models/gemini-1.0-pro-001:generateContent`
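The URL assembly above can be sketched as a small helper. All of the identifiers below are illustrative placeholders, not real account or project values:

```javascript
// Build the AI Gateway URL for Google Vertex AI from its parts.
// Every value passed in below is a placeholder for illustration.
function vertexGatewayUrl({ accountId, gatewayId, projectName, region, model, resource }) {
  const base = `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/google-vertex-ai`;
  return `${base}/v1/projects/${projectName}/locations/${region}` +
         `/publishers/google/models/${model}:${resource}`;
}

const url = vertexGatewayUrl({
  accountId: 'my-account',     // AI Gateway account tag (placeholder)
  gatewayId: 'my-gateway',     // AI Gateway gateway name (placeholder)
  projectName: 'my-project',   // Google Cloud project name (placeholder)
  region: 'us-east4',
  model: 'gemini-1.0-pro-001',
  resource: 'generateContent', // the generative AI REST resource
});
console.log(url);
```

Swapping `resource` lets the same helper target other generative AI REST resources exposed by the model.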
Authenticating with Vertex AI normally requires generating short-term credentials using the Google Cloud SDKs ↗, which involves a fairly complex setup. AI Gateway simplifies this with multiple options:
AI Gateway supports passing a Google service account JSON directly in the `Authorization` header on requests, or through AI Gateway's Bring Your Own Keys feature.
You can create a service account key ↗ in the Google Cloud Console. Ensure that the service account has the required permissions for the Vertex AI endpoints and models you plan to use.
AI Gateway uses your service account JSON to generate short-term access tokens which are cached and used for consecutive requests, and are automatically refreshed when they expire.
```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "your-private-key-id",
  "private_key": "-----BEGIN PRIVATE KEY-----\nYOUR_PRIVATE_KEY\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project.iam.gserviceaccount.com",
  "client_id": "your-client-id",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/your-service-account%40your-project.iam.gserviceaccount.com",
  "region": "us-east1"
}
```
You can pass this JSON in the `Authorization` header or configure it in Bring Your Own Keys.
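Putting the pieces together, here is a minimal sketch of building such a request, assuming the service account JSON is sent verbatim as the bearer credential; the account, gateway, and project identifiers are placeholders:

```javascript
// Sketch: construct a generateContent request that carries a service
// account JSON as the bearer credential. All IDs are placeholders.
const serviceAccount = {
  type: 'service_account',
  project_id: 'your-project-id',
  client_email: 'your-service-account@your-project.iam.gserviceaccount.com',
  // ...remaining service account fields elided for brevity...
};

const requestUrl =
  'https://gateway.ai.cloudflare.com/v1/my-account/my-gateway/google-vertex-ai' +
  '/v1/projects/your-project-id/locations/us-east4' +
  '/publishers/google/models/gemini-1.0-pro-001:generateContent';

const request = {
  method: 'POST',
  headers: {
    // The whole service account JSON goes in the Authorization header.
    Authorization: `Bearer ${JSON.stringify(serviceAccount)}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    contents: { role: 'user', parts: [{ text: 'Tell me more about Cloudflare' }] },
  }),
};

console.log(request.headers.Authorization.slice(0, 40));
// A real call would then run: await fetch(requestUrl, request);
```

AI Gateway exchanges the JSON for a short-term access token on your behalf, so the client never touches Google's token endpoint directly.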
If you are already using the Google Cloud SDKs to generate a short-term access token (for example, with `gcloud auth print-access-token`), you can pass it directly as a Bearer token in the `Authorization` header of the request.
```bash
curl "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/google-vertex-ai/v1/projects/{project_name}/locations/{region}/publishers/google/models/gemini-1.0-pro-001:generateContent" \
  -H "Authorization: Bearer ya29.c.b0Aaekm1K..." \
  -H 'Content-Type: application/json' \
  -d '{
    "contents": {
      "role": "user",
      "parts": [
        { "text": "Tell me more about Cloudflare" }
      ]
    }
  }'
```
AI Gateway provides a Unified API that works across providers. For Google Vertex AI, you can use the standard chat completions format. Note that the model field includes the provider prefix, so your model string will look like `google-vertex-ai/google/gemini-2.5-pro`.
The Unified API endpoint is:

`https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions`
```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: '{service_account_json}',
  baseURL: 'https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat',
});

const response = await client.chat.completions.create({
  model: 'google-vertex-ai/google/gemini-2.5-pro',
  messages: [{ role: 'user', content: 'What is Cloudflare?' }],
});

console.log(response.choices[0].message.content);
```
```bash
curl "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions" \
  -H "Authorization: Bearer {service_account_json}" \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "google-vertex-ai/google/gemini-2.5-pro",
    "messages": [
      { "role": "user", "content": "What is Cloudflare?" }
    ]
  }'
```
You can also use the provider-specific endpoint to access the full Vertex AI API.
```bash
curl "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/google-vertex-ai/v1/projects/{project_name}/locations/{region}/publishers/google/models/gemini-1.0-pro-001:generateContent" \
  -H "Authorization: Bearer {vertex_api_key}" \
  -H 'Content-Type: application/json' \
  -d '{
    "contents": {
      "role": "user",
      "parts": [
        { "text": "Tell me more about Cloudflare" }
      ]
    }
  }'
```