
I am working on a Bitbucket pipeline for pushing an image to Google Container Registry. I have created a service account with the Storage Admin role (bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com).


gcloud auth activate-service-account --key-file key.json
gcloud config set project mgcp-xxxx
gcloud auth configure-docker --quiet
docker push eu.gcr.io/mgcp-xxxx/image-name

Although the login is successful, I get: Token exchange failed for project 'mgcp-xxxx'. Caller does not have permission 'storage.buckets.get'. To configure permissions, follow instructions at: https://cloud.google.com/container-registry/docs/access-control

Can anyone advise on what I am missing?

Thanks!

Maxim
Maxim
Tania Petsouka

19 Answers

For anyone reading all the way here: the other suggestions did not help me, but I found that the Cloud Service Build Account role was also required. After adding it, the storage.buckets.get error disappears.

This is my minimal two-role setup to push Docker images.

The Cloud Service Build Account role, however, adds many more permissions than simply storage.buckets.get. The exact permissions can be found here.

Note: I am well aware the Cloud Service Build Account role also adds the storage.objects.get permission. However, adding roles/storage.objectViewer did not resolve my problem, regardless of the fact that it had the storage.objects.get permission.

If the above does not work, you might have the wrong account active. This can be resolved with:

gcloud auth activate-service-account --key-file key.json

If that does not work, you might need to set the Docker credential helpers with:

gcloud auth configure-docker --project <project_name>

On one final note: there seemed to be some delay between setting a role and it taking effect via the gcloud tool. This was minimal, though; think less than a minute.
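The role grant can also be done from the command line. This is a sketch; the project and service-account names are placeholders taken from the question, and roles/cloudbuild.builds.builder is the CLI name of the role:

```shell
# Placeholder project and service account -- substitute your own.
PROJECT=mgcp-xxxx
SA=bitbucket-authorization@${PROJECT}.iam.gserviceaccount.com

# Bind the Cloud Build Service Account role (CLI name:
# roles/cloudbuild.builds.builder) to the service account.
gcloud projects add-iam-policy-binding "${PROJECT}" \
    --member "serviceAccount:${SA}" \
    --role roles/cloudbuild.builds.builder
```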

Cheers

Shine
  • gcloud auth activate-service-account --key-file was the solution for me. Thanks – Jørgen Jul 08 '20 at 08:01
    Thanks for this! Not sure if they renamed it, but it's now "Cloud Build Service Account" rather than "Cloud Service Build Account". If you need to set it from the command line or API, the name of the role is `roles/cloudbuild.builds.builder` – Matt Browne Apr 02 '21 at 17:20
  • Thanks man, the Cloud Service Build Account role is definitely required; the Google doc is missing that – Thomas Ducrot Nov 16 '21 at 20:04

In the past I had another service account with the same name and different permissions. After discovering that service account names are cached, I created a new service account with a different name, and it pushed properly.
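To see which account gcloud is actually using, and which service accounts really exist in the project, something like the following can help (project ID is a placeholder):

```shell
PROJECT=mgcp-xxxx   # placeholder project ID

# Show all credentialed accounts; the active one is marked with '*'.
gcloud auth list

# List the service accounts that actually exist in the project.
gcloud iam service-accounts list --project "${PROJECT}"
```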

Tania Petsouka

You need to be logged into your account and set the project to the project you'd like. There is a good chance you're just not logged in.

gcloud auth login

gcloud config set project <PROJECT_ID_HERE>

Patrick Collins

These are the step-by-step commands that got me to push my first container to a private GCR repo:

export PROJECT=pacific-shelter-218
export KEY_NAME=key-name1
export KEY_DISPLAY_NAME='My Key Name'

sudo gcloud iam service-accounts create ${KEY_NAME} --display-name ${KEY_DISPLAY_NAME}
sudo gcloud iam service-accounts list
sudo gcloud iam service-accounts keys create --iam-account ${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com key.json
sudo gcloud projects add-iam-policy-binding ${PROJECT} --member serviceAccount:${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com --role roles/storage.admin
sudo docker login -u _json_key -p "$(cat key.json)" https://gcr.io
sudo docker push  gcr.io/pacific-shelter-218/mypcontainer:v2
vitaly goji
    thank you! That finally worked for me. I tried creating the service account via the UI, and it would simply not work. I'll never know why. – jpbochi Dec 09 '19 at 11:46
  • I ran into similar problems with the UI. I managed to get it to work via the UI by adding the role: Editor ... but admit this is a rather recklessly insecure blunderbuss approach. – andrew pate Feb 10 '20 at 18:21
  • Using service account for users is a terrible idea. Users should use user accounts. – Francisco Delmar Kurpiel Apr 17 '20 at 10:18

For anyone else coming across this: my issue was that I had not granted my service account the Storage Legacy Bucket Reader role; I'd only granted it Storage Object Viewer. Adding that legacy role fixed it.

It seems Docker still uses a legacy method to access GCR.
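GCR stores images in a GCS bucket, so the legacy role can be granted directly on that bucket. A sketch, with placeholder names (eu.gcr.io registries use an eu.artifacts.* bucket instead):

```shell
# Placeholder project and service account.
PROJECT=mgcp-xxxx
SA=bitbucket-authorization@${PROJECT}.iam.gserviceaccount.com
# gcr.io images live in this bucket; eu.gcr.io uses eu.artifacts.<project>...
BUCKET=gs://artifacts.${PROJECT}.appspot.com

# Grant the legacy reader role on the registry bucket only.
gsutil iam ch "serviceAccount:${SA}:roles/storage.legacyBucketReader" "${BUCKET}"
```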

Jonas D

Here in the future, I've discovered that I no longer have any Legacy options. In this case I was forced to grant full Storage Admin. I'll open a ticket with Google about this; that's a bit extreme just to allow me to push an image. This might help someone else from the future.

MXWest

It seems the documentation is outdated https://cloud.google.com/container-registry/docs/access-control

Note: Pushing images requires object read and write permissions as well as the storage.buckets.get permission. The Storage Object Admin role does not include the storage.buckets.get permission, but the Storage Legacy Bucket Writer role does.

But the Storage Legacy Bucket Writer role is no longer available.

To fix the permission issue, I added two roles to the service account:

  • Storage Admin
  • Storage Object Viewer (it has storage.buckets.get permission)
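The same two bindings can be added from the CLI; project and service-account names below are placeholders:

```shell
PROJECT=mgcp-xxxx                                          # placeholder
SA=bitbucket-authorization@${PROJECT}.iam.gserviceaccount.com  # placeholder

# Bind both roles in one loop.
for ROLE in roles/storage.admin roles/storage.objectViewer; do
  gcloud projects add-iam-policy-binding "${PROJECT}" \
      --member "serviceAccount:${SA}" \
      --role "${ROLE}"
done
```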
Pavel Shastov

I tried several things, but it seems you have to run gcloud auth configure-docker

Nebulastic

Adding the following roles to the service account in Google Cloud IAM fixed it for me:

  • Editor
  • Storage Object Admin
  • Storage Object Viewer

s4wet

I think the discrepancy is that https://cloud.google.com/container-registry/docs/access-control says, in the #permissions_and_roles section, that you need the Storage Admin role in order to push images. However, the next section, which explains how to configure access, says to add Storage Object Admin to enable push access for the account you wish to configure. Switching to Storage Admin should fix the issue.
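You can confirm that Storage Admin carries the missing permission before binding it; a sketch with placeholder names:

```shell
PROJECT=mgcp-xxxx   # placeholder project ID

# Confirm roles/storage.admin actually includes storage.buckets.get.
gcloud iam roles describe roles/storage.admin | grep storage.buckets.get

# Then bind it to the service account (placeholder name).
gcloud projects add-iam-policy-binding "${PROJECT}" \
    --member "serviceAccount:bitbucket-authorization@${PROJECT}.iam.gserviceaccount.com" \
    --role roles/storage.admin
```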

dcow

GCR just uses GCS to store images; check the permissions on the artifacts bucket in GCS within the same project.
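To inspect who has access to that bucket (bucket name below is the default for gcr.io in a placeholder project; eu.gcr.io uses an eu.artifacts.* bucket):

```shell
BUCKET=gs://artifacts.mgcp-xxxx.appspot.com   # placeholder bucket

# Print the bucket's IAM policy to see which accounts hold which roles.
gsutil iam get "${BUCKET}"
```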

Dan

I had a hard time figuring this out.

Although the error message was the same, my issue was that I was using the project name and not the project ID in the image URL.
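The project ID can be looked up from the display name and used to build the image URL; names here are placeholders:

```shell
# Look up the project ID from its display name (placeholder name).
gcloud projects list --filter='name:"My Project Name"' --format='value(projectId)'

# Tag and push using the project ID, not the display name.
PROJECT_ID=mgcp-xxxx   # placeholder: the ID returned above
docker tag image-name eu.gcr.io/${PROJECT_ID}/image-name
docker push eu.gcr.io/${PROJECT_ID}/image-name
```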

Natalia C

I created a separate service account to handle GCR IO, added the Artifact Registry Administrator role (I need to push and pull images), and it started pushing images to GCR again.

DK_DEV

The docker push command will return this permission error if Docker is not authenticated with gcr.io.

Follow the steps below.

  1. Create a service account (or use an existing one) and grant the following roles

    • Storage Admin
    • Storage Object Admin
  2. Generate a service account key (JSON) and download it

  3. Run docker-credential-gcr configure-docker

  4. Docker login with service account

    docker login -u _json_key -p "$(cat [SERVICE_ACCOUNT_KEY.json])" https://gcr.io

  5. Try to push your docker image to gcr

    docker push gcr.io/<project_id>/<image>:<tag>
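The steps above can be sketched end to end as follows; every name is a placeholder:

```shell
# Placeholders -- substitute your own project and account names.
PROJECT=my-project-id
SA_NAME=gcr-pusher
SA=${SA_NAME}@${PROJECT}.iam.gserviceaccount.com

# 1. Create the service account and grant the roles.
gcloud iam service-accounts create "${SA_NAME}" --project "${PROJECT}"
gcloud projects add-iam-policy-binding "${PROJECT}" \
    --member "serviceAccount:${SA}" --role roles/storage.admin

# 2. Generate and download a JSON key.
gcloud iam service-accounts keys create key.json --iam-account "${SA}"

# 3./4. Configure the Docker credential helper and log in with the key.
docker-credential-gcr configure-docker
docker login -u _json_key -p "$(cat key.json)" https://gcr.io

# 5. Push the image.
docker push gcr.io/${PROJECT}/image-name:latest
```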

Thushan

Pushing images requires object read and write permissions as well as the storage.buckets.get permission. The Storage Object Admin role does not include the storage.buckets.get permission, but the Storage Legacy Bucket Writer role does. You can find this in a note at https://cloud.google.com/container-registry/docs/access-control

So adding the Storage Legacy Bucket Writer role fixed it for me, as the Storage Object Admin role doesn't have the required storage.buckets.get permission.

  • Please provide an explanation of your answer so that the next user knows why this solution worked for you. Also, sum up your answer in case the link stops working in the future. – Elydasian Jul 21 '21 at 08:38

What worked for me was going to the Google Cloud console -> IAM & Admin -> setting Storage Admin as one of the roles for the service account.

I.Tyger

Spent too long figuring this out as I was pushing images from one project to another (intentionally) and didn't clock which bucket I needed to grant access to.

Details are as per https://cloud.google.com/container-registry/docs/access-control.

In summary, to push images to the container registry from Cloud Build, you need to grant Storage Legacy Bucket Writer to the {projectB-id}@cloudbuild.gserviceaccount.com service account on the {storage-region.}artifacts.{projectA-name}.appspot.com bucket.
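As a concrete sketch, with hypothetical project names filled in for the {…} templates above (in practice the Cloud Build service account is named after the project number):

```shell
PROJECT_A=project-a          # placeholder: project hosting the registry bucket
PROJECT_B_NUM=123456789012   # placeholder: Cloud Build project's number

# Cloud Build's default service account in project B.
SA=${PROJECT_B_NUM}@cloudbuild.gserviceaccount.com

# Grant write access on project A's (eu-region) registry bucket.
gsutil iam ch "serviceAccount:${SA}:roles/storage.legacyBucketWriter" \
    "gs://eu.artifacts.${PROJECT_A}.appspot.com"
```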

barclakj

UI method:

  1. Add rights on https://console.cloud.google.com/iam-admin/iam
  2. Role is "Storage Admin" as explained in [1]
  3. Then refresh your token with:
    gcloud auth login
    gcloud config set project <PROJECT_ID_HERE>
  4. Push again

Links that can help: [1] https://cloud.google.com/storage/docs/access-control/iam-roles [2] https://cloud.google.com/container-registry/docs/access-control#grant


In my case, this error was caused by the Storage API (used to push Google Container Registry images to) having been put inside a VPC service perimeter.

This can be confirmed and diagnosed further if required by looking through the logs accessible via the VPC Service Controls troubleshooting page.
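The perimeters can also be listed from the CLI to check whether the project's Storage API sits inside one; the organization and policy IDs below are placeholders:

```shell
ORG_ID=123456789      # placeholder organization ID
POLICY_ID=987654321   # placeholder access policy ID

# Find the access policy for the organization, then list its perimeters
# to see whether storage.googleapis.com is restricted.
gcloud access-context-manager policies list --organization "${ORG_ID}"
gcloud access-context-manager perimeters list --policy "${POLICY_ID}"
```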

u-phoria