I'm having serious trouble getting the official BigQuery client (Python 3) running in a Cloud Run service to access a table in a different project within the same organization.
The Cloud Run service is in Project Main, and I'm trying to access a table in Project Other -- both of these projects are in the same organization.
The service account is a user-created one in Project Main (not the default Compute Engine account). It has all the permissions required to run as the Cloud Run service account in Project Main, and its email was granted only the "BigQuery User" role in Project Other.
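For reference, the grant in Project Other was done roughly like this (the project IDs and service account email below are placeholders, not my real values):

```shell
# Grant the Project Main service account the BigQuery User role in Project Other.
# project-other, project-main, and my-sa are placeholders.
gcloud projects add-iam-policy-binding project-other \
  --member="serviceAccount:my-sa@project-main.iam.gserviceaccount.com" \
  --role="roles/bigquery.user"
```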
In Python, I have a code snippet like this:
from google.cloud import bigquery

# Query a table that lives in Project Other
query_str = "SELECT * FROM `project_other.prod.table`"
bqclient = bigquery.Client()  # picks up the attached service account via ADC
df = bqclient.query(query_str).result().to_dataframe()
This runs inside the container deployed to Cloud Run in Project Main, and the correct service account is attached to the deployment.
When the script is triggered, it raises an exception like the following:
raise self._exception google.api_core.exceptions.BadRequest: 400 Access Denied: Table
project_other:prod.table: User does not have permission to query table project_other:prod.table. at [4:13]
From what I understand, if the service account attached to the Cloud Run deployment has the required permissions, no extra auth setup is needed in the container -- it's all handled by the managed service through Application Default Credentials.
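In case it helps, here's how I checked which identity the container is actually running as; as far as I know this is the standard metadata endpoint available inside Cloud Run:

```shell
# Inside the Cloud Run container: ask the metadata server which service
# account identity the workload is actually using.
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email"
```

This does print the email of the user-created service account, so the right identity seems to be attached.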
Can someone please help me figure out why this is happening? The service account has iam.serviceAccounts.actAs in Project Main -- does it also need permissions in Project Other beyond the "BigQuery User" role, or should that role be enough?
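For completeness, this is how I've been verifying what Project Other has actually granted to the service account (again, the project ID and email are placeholders):

```shell
# List the roles Project Other has granted to the service account.
# Replace project-other and the service account email with real values.
gcloud projects get-iam-policy project-other \
  --flatten="bindings[].members" \
  --format="table(bindings.role)" \
  --filter="bindings.members:serviceAccount:my-sa@project-main.iam.gserviceaccount.com"
```

It shows only roles/bigquery.user, which matches what I described above.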