16

I've created a service account using the Google API Console and wish to use this service account with the Google BigQuery CLI (bq) tool.

I've been using the command line tool to successfully access the BigQuery service with my valid OAuth2 credentials in ~/.bigquery.v2.token; however, I can't find any documentation on how to modify this file (or otherwise configure the tool) to use a service account instead.

Here is my current .bigquery.v2.token file:

{
    "_module": "oauth2client.client",
    "_class": "OAuth2Credentials",
    "access_token": "--my-access-token--",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "invalid": false,
    "client_id": "--my-client-id--.apps.googleusercontent.com",
    "id_token": null,
    "client_secret": "--my-client-secret--",
    "token_expiry": "2012-11-06T15:57:12Z",
    "refresh_token": "--my-refresh-token--",
    "user_agent": "bq/2.0"
}

My other file, ~/.bigqueryrc, generally looks like this:

project_id = --my-project-id--
credential_file = ~/.bigquery.v2.token

I've tried setting the credential_file parameter to the .p12 private key file for my service account, but with no luck; it gives me back the following error:

******************************************************************
** No OAuth2 credentials found, beginning authorization process **
******************************************************************

It then asks me to go to a link in my browser to set up my OAuth2 credentials again.

The command-line tool's initial configuration option, "init":

bq help init

displays no helpful information about how to set up this tool to use a service account.

5 Answers

11

I ended up finding some documentation on how to set this up:

$ bq --help

....

--service_account: Use this service account email address for authorization. For example, [email protected].
(default: '')

--service_account_credential_file: File to be used as a credential store for service accounts. Must be set if using a service account.

--service_account_private_key_file: Filename that contains the service account private key. Required if --service_account is specified.
(default: '')

--service_account_private_key_password: Password for private key. This password must match the password you set on the key when you created it in the Google APIs Console. Defaults to the default Google APIs Console private key password.
(default: 'notasecret')

....

You can either set these explicitly on each bq (BigQuery command-line client) request, i.e.:

$ bq --service_account --my-client-id--.apps.googleusercontent.com --service_account_private_key_file ~/.bigquery.v2.p12 ... [command]

Or you can set up defaults in your ~/.bigqueryrc file like so:

project_id = --my-project-id--
service_account = [email protected]
service_account_credential_file = /home/james/.bigquery.v2.cred
service_account_private_key_file = /home/james/.bigquery.v2.p12

The service account email can be found in the Google APIs Console, and the service_account_private_key_password is the one you set when you created your service account (it defaults to "notasecret").

Note: file paths in .bigqueryrc had to be the full path; I was unable to use ~/.bigquery...

Some additional dependencies were required: you will need to install OpenSSL via yum/apt-get

--yum--
$ yum install openssl-devel libssl-devel

--or apt-get--
$ apt-get install libssl-dev

and pyOpenSSL via easy_install/pip:

--easy install--
$ easy_install pyopenssl

--or pip--
$ pip install pyopenssl
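
Once those dependencies are installed, a quick sanity check (assuming the ~/.bigqueryrc defaults above are in place) is to run any read-only command; if the service account credentials are picked up correctly you should no longer see the "No OAuth2 credentials found" prompt:

--sanity check--
$ bq ls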
3
  • Following the directions for using a service account with the Docker image is substantially easier, on OS X at least: hub.docker.com/r/google/cloud-sdk
    – mentat
    Commented Aug 18, 2016 at 19:57
  • 2
    The right way to use service account credentials with the bq CLI is to activate them using the "gcloud auth activate-service-account" command and then run bq without any auth flags. Those bq flags are for gcloud's internal use only. Please do not set them; doing so interferes with gcloud and leads to unpredictable results.
    – Daria
    Commented Feb 7, 2018 at 16:52
  • 1
    Check the other answer; this approach is now DEPRECATED.
    – Temu
    Commented Oct 19, 2022 at 16:09
8

1.) Tell gcloud to authenticate as your service account

gcloud auth activate-service-account \
[email protected] \
--key-file=/path/key.json \
--project=testproject

2.) Run a bq command as you would with your user account

# ex: bq query
bq query --use_legacy_sql=false 'SELECT CURRENT_DATE()'
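
If the service account's default project is not the one you want to query, bq also accepts an explicit --project_id flag (the project name below is a placeholder):

# ex: bq query against an explicit project
bq query --project_id=testproject --use_legacy_sql=false 'SELECT CURRENT_DATE()'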

3.) Optional: Revert gcloud authentication to your user account

gcloud config set account [email protected]

3a.) Optional: See which account gcloud uses for authentication

gcloud auth list
5

The bq authorization flags are now deprecated; see the bq documentation.

1

The bq tool requires two configuration files, controlled by the --bigqueryrc and --credential_file flags. If neither is found, bq will attempt to initialize automatically during startup.

To avoid this for the --bigqueryrc file, you can place a ".bigqueryrc" file in the default location, or override it with --bigqueryrc pointing to some writable file path.
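
For example, in a non-interactive environment you might point both flags at explicitly writable locations; this is only a sketch, and the paths below are placeholders:

# keep bq's config and credential store somewhere writable
$ bq --bigqueryrc=/tmp/.bigqueryrc --credential_file=/tmp/bigquery.v2.token ls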

0

For anyone else who comes along struggling to use bq with a service account... I had a seriously hard time getting this to work inside a CI/CD pipeline using the Google Cloud SDK Docker images on gitlab-ci. It turns out the missing bit for me was making sure to set the default project. On my laptop, gcloud was happy to infer the default project from the service account, but for some reason the version within the Docker image was defaulting to a public free project.

- gcloud auth activate-service-account --key-file=${PATH_TO_SVC_ACCT_JSON};
- gcloud config set project ${GOOGLE_BIGQUERY_PROJECT}

After this I was able to use the bq utility as the service account. I imagine setting the default project in the .bigqueryrc file does the trick too, which is why the OP didn't run into this issue.
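
For reference, a minimal sketch of the same flow as a single shell script (the variable names match the snippet above and are assumptions about your CI environment):

#!/bin/sh
# authenticate as the service account and pin the default project before calling bq
gcloud auth activate-service-account --key-file="${PATH_TO_SVC_ACCT_JSON}"
gcloud config set project "${GOOGLE_BIGQUERY_PROJECT}"

# sanity check: should run as the service account against the configured project
bq query --use_legacy_sql=false 'SELECT 1'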
