
Securing Your Looker Extensions with Cloud Run: A Complete Guide

12 min read

Looker extensions provide a powerful way to extend your Looker instance beyond what the standard API offers. However, when your extensions need to connect to external services or run custom code, security becomes essential. This comprehensive guide covers how to securely integrate Looker extensions with Google Cloud Run, keeping your data and services protected.

The Problem: Extending Looker Beyond the API

Looker's API is powerful, but there are scenarios where you need capabilities beyond simple data queries and dashboard creation. Everyday use cases include:

  • Third-party API integrations: Connecting to external services like CRM systems, marketing platforms, or payment processors
  • Database write-backs: Storing computed results or user interactions back to your data warehouse
  • LookML generation: Programmatically creating or modifying LookML models based on business logic
  • Form submissions: Processing user input and triggering complex workflows
  • Data transformations: Running custom algorithms or data processing that can't be done in SQL

When these requirements arise, you need a scalable way to run custom code while meeting the security and performance standards of enterprise environments.

The Power of the Extension Framework's serverProxy

One of the most powerful features of the Looker Extension Framework is the serverProxy function. This function allows your extension's frontend code to make secure requests to external services through Looker's backend, with access to secrets and authentication that would be unsafe to expose to the client.

How serverProxy Works

The serverProxy function acts as a secure bridge between your extension's frontend and external services. Here's how it works:

  1. Your extension's frontend code calls serverProxy
  2. Looker's backend receives the request and validates it
  3. Looker makes the actual HTTP request to the external service
  4. The response is returned to your extension's frontend

This approach keeps secret management in Looker's backend, so credentials are never exposed to client-side code.

Example: Using serverProxy with a Third-Party API

Here's a practical example of how to use serverProxy to securely call an external API:

import React, { useContext, useState } from 'react'
import { ExtensionContext } from '@looker/extension-sdk-react'
import type { ExtensionSDK } from '@looker/extension-sdk'

// Function to call an external API through serverProxy. The URL must be
// listed in the manifest's external_api_urls entitlement.
async function callExternalAPI(extensionSDK: ExtensionSDK, data: any) {
  try {
    const response = await extensionSDK.serverProxy('/api/external-service', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(data),
    })

    return response.body
  } catch (error) {
    console.error('Error calling external API:', error)
    throw error
  }
}

// Example usage in a React component
function MyExtensionComponent() {
  // The extension SDK is provided by ExtensionProvider via React context
  const { extensionSDK } = useContext(ExtensionContext)
  const [result, setResult] = useState<any>(null)
  const [loading, setLoading] = useState(false)

  const handleSubmit = async (formData: any) => {
    setLoading(true)
    try {
      const response = await callExternalAPI(extensionSDK, formData)
      setResult(response)
    } catch (error) {
      console.error('Failed to process data:', error)
    } finally {
      setLoading(false)
    }
  }

  return (
    <div>
      {/* Your UI components */}
      <button
        onClick={() => handleSubmit({ action: 'process_data' })}
        disabled={loading}
      >
        {loading ? 'Processing...' : 'Process Data'}
      </button>
      {result && <div>Result: {JSON.stringify(result)}</div>}
    </div>
  )
}

Configuring serverProxy Endpoints

To use serverProxy, you need to declare the allowed endpoints in your LookML project's manifest.lkml, under the application's entitlements:

# Example serverProxy configuration in manifest.lkml
application: my_extension {
  label: "My Extension"
  file: "bundle.js"
  entitlements: {
    core_api_methods: ["me", "lookml_model_explore", "all_lookml_models"]
    external_api_urls: ["https://my-looker-extension.com/api/lookml"]
  }
}

When a Simple API Call Isn't Enough: The Need for Custom Code

While serverProxy is excellent for simple API calls, there are scenarios where you need to run custom code that can't be handled by a simple HTTP request:

  • Complex data processing: Algorithms that require significant computational resources
  • Long-running operations: Tasks that take minutes or hours to complete
  • Stateful operations: Processes that need to maintain state across multiple requests
  • Integration with Google Cloud services: Direct access to BigQuery, Cloud Storage, or other GCP services
  • Custom authentication flows: Complex OAuth or service-to-service authentication

In these cases, you need a more robust solution that can run custom code in a secure, scalable environment.

Enter Cloud Functions and Cloud Run

Google Cloud provides two excellent services for running custom code: Cloud Functions and Cloud Run. Both services offer serverless execution environments, but they serve different use cases.

Cloud Functions: Event-Driven Serverless Computing

Cloud Functions are perfect for event-driven, stateless operations that respond to specific triggers. They're ideal for:

  • HTTP requests: REST API endpoints
  • Cloud events: Pub/Sub messages, Cloud Storage events
  • Scheduled tasks: Cron jobs via Cloud Scheduler
  • Lightweight processing: Simple data transformations

Here's an example of a Cloud Function that could be called from a Looker extension:

// cloud-function-example/index.ts
import { Request, Response } from 'express'

export const processLookerData = async (req: Request, res: Response) => {
  try {
    // Validate the request
    if (!req.body || !req.body.data) {
      return res.status(400).json({ error: 'Missing data parameter' })
    }

    // Process the data (example: complex calculation)
    const { data } = req.body
    const processedData = await performComplexCalculation(data)

    // Return the result
    res.status(200).json({
      success: true,
      result: processedData,
      timestamp: new Date().toISOString(),
    })
  } catch (error) {
    console.error('Error processing data:', error)
    res.status(500).json({ error: 'Internal server error' })
  }
}

async function performComplexCalculation(data: any) {
  // Example: complex business logic that can't be done in SQL.
  // someComplexAlgorithm is a placeholder for your own implementation.
  const result = await someComplexAlgorithm(data)
  return result
}

Cloud Run: Containerized Applications

Cloud Run is designed for containerized applications that need more control over the runtime environment. It's ideal for:

  • Complex applications: Multi-service architectures
  • Custom runtimes: Applications that need specific dependencies
  • Long-running processes: Background jobs and workers
  • High-performance computing: CPU or memory-intensive tasks
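To make this concrete, here is a minimal sketch of what a Cloud Run-style HTTP service might look like. The route names and handler are our own illustrative assumptions, not part of any Looker or Google API; the one real contract is that Cloud Run injects the listening port through the PORT environment variable.

```typescript
import { createServer } from 'http'

type RouteResult = { status: number; body: string }

// Pure routing logic, kept separate from the server so it is easy to test.
// The paths here are illustrative assumptions.
function route(method: string, path: string): RouteResult {
  if (method === 'GET' && path === '/healthz') {
    return { status: 200, body: 'ok' } // lightweight health check
  }
  if (method === 'POST' && path === '/api/process') {
    return { status: 202, body: 'accepted' } // long-running work kicked off
  }
  return { status: 404, body: 'not found' }
}

const server = createServer((req, res) => {
  const { status, body } = route(req.method ?? 'GET', req.url ?? '/')
  res.writeHead(status, { 'Content-Type': 'text/plain' })
  res.end(body)
})

// Cloud Run always sets PORT; the guard lets the module be imported
// in tests without binding a socket.
if (process.env.PORT) {
  server.listen(parseInt(process.env.PORT, 10))
}
```

Packaged in a container image, a service like this can be deployed with gcloud run deploy and scaled to zero when idle.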

Authentication Strategies: Securing Your Cloud Services

When integrating Cloud Functions or Cloud Run with Looker extensions, you have several authentication options, each with its own trade-offs.

Option 1: Unauthenticated Access

Allowing unauthenticated access is the simplest approach but provides no security:

# Deploy Cloud Function with unauthenticated access
gcloud functions deploy process-looker-data \
--runtime nodejs18 \
--trigger-http \
--allow-unauthenticated

Pros:

  • Simple to implement
  • No authentication overhead
  • Easy to test and debug

Cons:

  • Anyone can call your service
  • Security needs to be handled in the service itself
  • Risk of abuse and unexpected costs

Here is an example of how to use API keys to secure a Cloud Function with unauthenticated access:

// Cloud Function with API key validation
export const secureFunction = async (req: Request, res: Response) => {
  const apiKey = req.headers['x-api-key']
  const expectedKey = process.env.API_KEY

  if (!apiKey || apiKey !== expectedKey) {
    return res.status(401).json({ error: 'Unauthorized' })
  }

  // Process the request
  // ...
}
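One refinement worth considering: a plain `!==` comparison can leak timing information about how much of the key matched. Node's built-in `crypto.timingSafeEqual` avoids that. A small sketch (the helper name is ours, not a library API):

```typescript
import { timingSafeEqual } from 'crypto'

// Constant-time API key comparison; safeKeyCompare is an illustrative helper.
function safeKeyCompare(provided: string, expected: string): boolean {
  const a = Buffer.from(provided)
  const b = Buffer.from(expected)
  // timingSafeEqual throws if lengths differ, so compare lengths first
  // (the key length itself is not treated as secret here).
  return a.length === b.length && timingSafeEqual(a, b)
}
```

In the function above, you would replace the `apiKey !== expectedKey` check with `!safeKeyCompare(apiKey, expectedKey)` after confirming both values are present.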

Option 2: OAuth 2.0

OAuth 2.0 is a good option for securing your Cloud Function or Cloud Run service, and Looker's Extension Framework has built-in support for OAuth2.

// Cloud Function with JWT validation
import { OAuth2Client } from 'google-auth-library'

const client = new OAuth2Client(process.env.GOOGLE_CLIENT_ID)

export const secureFunction = async (req: Request, res: Response) => {
  try {
    const token = req.headers.authorization?.replace('Bearer ', '')

    if (!token) {
      return res.status(401).json({ error: 'No token provided' })
    }

    // Verify the token
    const ticket = await client.verifyIdToken({
      idToken: token,
      audience: process.env.GOOGLE_CLIENT_ID,
    })

    const payload = ticket.getPayload()
    if (!payload) {
      return res.status(401).json({ error: 'Invalid token' })
    }

    // Token is valid, process the request
    const userId = payload.sub
    const userEmail = payload.email

    // Your business logic here
    // ...

  } catch (error) {
    console.error('Token verification failed:', error)
    res.status(401).json({ error: 'Invalid token' })
  }
}

Then in your Looker extension, you can use the Extension SDK's functions to perform the OAuth2 flow. See more examples in the Extension SDK documentation.

try {
  const response = await extensionSDK.oauth2Authenticate(
    'https://accounts.google.com/o/oauth2/v2/auth',
    {
      client_id: GOOGLE_CLIENT_ID,
      scope: GOOGLE_SCOPES,
      response_type: 'token',
    }
  )
  const { access_token, expires_in } = response
  // The user is authenticated; that does not mean the user is authorized.
  // There may be further work for the extension to do.
} catch (error) {
  // The user failed to authenticate
}
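The `expires_in` value in that response is the token lifetime in seconds, so the extension needs to track when to re-authenticate. A minimal sketch of that bookkeeping (these helpers are our own, not part of the Extension SDK):

```typescript
interface StoredToken {
  accessToken: string
  expiresAt: number // epoch milliseconds
}

// expires_in from the OAuth response is in seconds
function storeToken(accessToken: string, expiresIn: number, now: number = Date.now()): StoredToken {
  return { accessToken, expiresAt: now + expiresIn * 1000 }
}

// Refresh a minute early so in-flight requests don't race the expiry
function needsRefresh(token: StoredToken, now: number = Date.now(), skewMs = 60_000): boolean {
  return now >= token.expiresAt - skewMs
}
```

When `needsRefresh` returns true, the extension would call `oauth2Authenticate` again before making its next request.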

Pros:

  • Highest security level
  • User-level authentication
  • Rich audit capabilities
  • Integrates well with Google Cloud IAM

Cons:

  • More complex to implement
  • Additional latency for token verification
  • Requires each user to authenticate with the OAuth2 provider

Option 3: Service-to-Service Authentication

Service-to-service authentication is a good option for securing your Cloud Function or Cloud Run service to Looker. When doing service-to-service communication, you need a short-lived GCP identity token to authenticate with the service.

Looker needs to store the identity token and pass it along with serverProxy requests to the service. We recommend using user attributes for these secrets so they can be locked down for Looker's use only; the URL the token is used with must also be explicitly allowed.

Looker User Attributes have three ways you can store these secrets:

  • Default Values: Anyone can use these values in the extension
  • Group Values: Only users that belong to the Looker group can use the values
  • User Values: Only the user that owns the value can use the values
Note: We recommend using this method, as it is the most flexible way to secure your service while centralizing access to it through Looker's user and group permissioning.
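Conceptually, the three scopes resolve in order of specificity: a user value overrides a group value, which overrides the default. A small illustration of that precedence (our own sketch; Looker performs this resolution server-side):

```typescript
// Models user-attribute precedence: user value > group value > default value.
interface AttributeValues {
  userValue?: string
  groupValue?: string
  defaultValue?: string
}

function resolveUserAttribute(values: AttributeValues): string | undefined {
  return values.userValue ?? values.groupValue ?? values.defaultValue
}
```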

Using the User Attribute Updater Cloud Run Service

The User Attribute Updater, part of the lkr CLI, addresses a common pattern in Looker extensions: updating user attributes based on external data or user actions. It comes up often enough that we have packaged a simple container you can launch as your own Cloud Run service.

Below is an example of how to quickly get an identity token into your Looker instance through user attributes so that you can call Cloud Functions and Cloud Run services from your Looker extension. The gcloud commands below can be run in the Cloud Shell editor. They only retrieve the identity token and update the user attribute; this assumes you already have a Cloud Run service, callable from a Looker extension, with the functionality you require.

  1. Create a service account for Looker extensions

    export PROJECT_ID=<your project id>
    gcloud iam service-accounts create looker-extension-service-account \
    --display-name="Looker Extension Service Account"
    export SERVICE_ACCOUNT_EMAIL="looker-extension-service-account@${PROJECT_ID}.iam.gserviceaccount.com"
  2. Create a Cloud Run service for updating the user attribute

    export REGION=<your region>
    export LOOKERSDK_CLIENT_ID=<your client id>
    export LOOKERSDK_CLIENT_SECRET=<your client secret>
    export LOOKERSDK_BASE_URL=<your instance url>
    export CLOUD_RUN_SERVICE_URL=<your existing cloud run service url that needs a token>

    gcloud run deploy lkr-access-token-updater \
    --image us-central1-docker.pkg.dev/lkr-dev-production/lkr-cli/cli:latest \
    --command lkr \
    --args tools,user-attribute-updater \
    --platform managed \
    --region $REGION \
    --project $PROJECT_ID \
    --cpu 1 \
    --memory 2Gi \
    --set-env-vars LOOKERSDK_CLIENT_ID=$LOOKERSDK_CLIENT_ID,LOOKERSDK_CLIENT_SECRET=$LOOKERSDK_CLIENT_SECRET,LOOKERSDK_BASE_URL=$LOOKERSDK_BASE_URL,LOOKER_WHITELISTED_BASE_URLS=$LOOKERSDK_BASE_URL
  3. Retrieve the URL of the Cloud Run service

    export LOOKER_USER_ATTRIBUTE_UPDATER_CLOUD_RUN_URL=$(gcloud run services describe lkr-access-token-updater \
    --region $REGION \
    --format 'value(status.url)')

    echo $LOOKER_USER_ATTRIBUTE_UPDATER_CLOUD_RUN_URL
  4. Create a user attribute in Looker (for example, lkr_cloud_run_identity_token) and add the URL of the Cloud Run service to the domain allow list.

    • Navigate to Admin > User Attributes
    • Create a new user attribute
    • Select the "Hide Value" option
    • In the domain allow list, add the URL of the existing Cloud Run service you put in CLOUD_RUN_SERVICE_URL. This does not need to be the same URL as the service you just deployed.
    • Click the "Save" button
  5. Create a Cloud Scheduler job to update the user attribute. This will run every hour.

    export AUDIENCE="${CLOUD_RUN_SERVICE_URL},${LOOKER_USER_ATTRIBUTE_UPDATER_CLOUD_RUN_URL}"

    gcloud scheduler jobs create http looker-identity-token-user-attribute-sync \
    --schedule="0 * * * *" \
    --uri="${LOOKER_USER_ATTRIBUTE_UPDATER_CLOUD_RUN_URL}/api/sync-attributes" \
    --http-method=POST \
    --headers="Content-Type=application/json" \
    --oidc-service-account-email="${SERVICE_ACCOUNT_EMAIL}" \
    --oidc-token-audience="${AUDIENCE}"

Looker Extension Service Account

Create your service account and grant it permission to invoke your Cloud Run services.

gcloud iam service-accounts create looker-extension-service-account \
--display-name="Looker Extension Service Account"

# Allow the service account to invoke your Cloud Run service
gcloud run services add-iam-policy-binding <your cloud run service> \
--member="serviceAccount:looker-extension-service-account@<your project id>.iam.gserviceaccount.com" \
--role="roles/run.invoker"

Create a User Attribute in Looker

  1. Navigate to Admin > User Attributes
  2. Create a new user attribute
  3. Select the "Default Value" option
  4. Enter the value for the user attribute
  5. Click the "Save" button




Scheduling with Cloud Scheduler

For operations that need to run on a schedule (like batch updates, data synchronization, or maintenance tasks), Cloud Scheduler provides a reliable way to trigger your Cloud Run services.

Service Account Setup

# Create a service account for the scheduler
gcloud iam service-accounts create user-attribute-scheduler \
--display-name="User Attribute Scheduler"

# Grant the service account permission to invoke Cloud Run
gcloud run services add-iam-policy-binding user-attribute-updater \
--member="serviceAccount:user-attribute-scheduler@your-project.iam.gserviceaccount.com" \
--role="roles/run.invoker"

Creating a Scheduled Job

# Create a Cloud Scheduler job that calls your Cloud Run service
gcloud scheduler jobs create http user-attribute-sync \
--schedule="0 2 * * *" \
--uri="https://user-attribute-updater-xxxxx-uc.a.run.app/api/sync-attributes" \
--http-method=POST \
--headers="Content-Type=application/json" \
--oidc-service-account-email="user-attribute-scheduler@your-project.iam.gserviceaccount.com" \
--oidc-token-audience="https://user-attribute-updater-xxxxx-uc.a.run.app"
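On the receiving side, if your Cloud Run service is deployed without --allow-unauthenticated, Cloud Run's IAM layer verifies the OIDC token for you. If you validate tokens in application code instead (for example with google-auth-library), you still need to check the claims against your own service after signature verification. A sketch of that claim check (the interface and function names are our own):

```typescript
// Claim checks to run after a library has verified the token signature.
interface IdTokenClaims {
  aud: string    // audience the token was minted for
  exp: number    // expiry, seconds since the Unix epoch
  email?: string // caller's service account email, if present
}

function claimsAreValid(
  claims: IdTokenClaims,
  expectedAudience: string,
  nowSeconds: number
): boolean {
  // The token must be minted for this service and not yet expired
  return claims.aud === expectedAudience && claims.exp > nowSeconds
}
```

The expected audience should be the URL of the receiving service itself, matching the --oidc-token-audience used when creating the scheduler job.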


Conclusion

Securing Looker extensions with Cloud Run provides a robust, scalable solution for extending Looker's capabilities while maintaining enterprise-grade security. By following the patterns outlined in this guide, you can:

  • Maintain security: Use proper authentication and authorization
  • Scale efficiently: Leverage Google Cloud's serverless infrastructure
  • Monitor effectively: Implement comprehensive logging and alerting
  • Comply with regulations: Follow security best practices for enterprise environments

The combination of Looker's Extension Framework, Cloud Run's containerized execution environment, and Google Cloud's security features creates a powerful platform for building secure, scalable extensions that can handle complex business logic while maintaining the security standards your organization requires.