
I'm trying to set up automatic publishing using Docker + Bitbucket Pipelines; unfortunately, I have a problem. I read the Pipelines deploy instructions on Docker Hub and created the following template:

# This is a sample build configuration for Docker.
# Check our guides at https://confluence.atlassian.com/x/O1toN for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: atlassian/default-image:2

pipelines:
  default:
    - step:
        services:
          - docker
        script: # Modify the commands below to build your repository.
          # Set $DOCKER_HUB_USERNAME and $DOCKER_HUB_PASSWORD as environment variables in repository settings
          - export IMAGE_NAME=paweltest/tester:$BITBUCKET_COMMIT

          # build the Docker image (this will use the Dockerfile in the root of the repo)
          - docker build -t paweltest/tester .
          # authenticate with the Docker Hub registry
          - docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
          # push the new Docker image to the Docker registry
          - docker push paweltest/tester:tagname

I have completed the data, but after doing the push, I get the following error when the build starts:

unable to prepare context: lstat /opt/atlassian/pipelines/agent/build/Dockerfile: no such file or directory

What do I want to achieve? After pushing changes to the repository, I'd like an image to be automatically built and sent to Docker Hub, and preferably also to the target server where the application runs.

I've looked for a solution and tried different combinations. For now, I have about 200 commits with Failed status and no further ideas.

1 Answer


Bitbucket Pipelines is a CI/CD service: you can build your applications and deploy resources to production or test server instances. You can build and deploy Docker images too; it shouldn't be a problem unless something in the configuration is wrong.

All scripts defined in the bitbucket-pipelines.yml file run in a container created from the indicated image (atlassian/default-image:2 in your case).

You need a Dockerfile in the project; the pipeline builds and publishes the Docker image from that file.

I created a simple repository without a Dockerfile and started a build, which failed with:

unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /opt/atlassian/pipelines/agent/build/Dockerfile: no such file or directory

I needed a Dockerfile in my project to build an image (at the same level as the bitbucket-pipelines.yml file):


FROM node:latest

WORKDIR /src/
EXPOSE 4000
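With that Dockerfile in place, you can check locally that the pipeline's build-and-tag steps will produce the expected image reference (a sketch; the stand-in commit hash is an assumption, since Pipelines supplies the real $BITBUCKET_COMMIT at build time):

```shell
# Sketch: compose the image tag the pipeline will push.
# COMMIT stands in for $BITBUCKET_COMMIT, which Pipelines sets automatically.
IMAGE="appngpl/stackoverflow-question-56065689"
COMMIT="abc1234"
FULL_TAG="$IMAGE:$COMMIT"
echo "$FULL_TAG"

# With Docker installed, the actual build and tag steps would be:
# docker build -t "$IMAGE" .
# docker tag "$IMAGE" "$FULL_TAG"
```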

In the next step I created a public Docker Hub repository:


I also changed your bitbucket-pipelines.yml file (you forgot to tag the new image):

image: atlassian/default-image:2

pipelines:
  default:
    - step:
        services:
          - docker
        script: 
          # build the Docker image (this will use the Dockerfile in the root of the repo)
          - docker build -t appngpl/stackoverflow-question-56065689 .
          # add new image tag
          - docker tag appngpl/stackoverflow-question-56065689 appngpl/stackoverflow-question-56065689:$BITBUCKET_COMMIT
          # authenticate with the Docker Hub registry
          - docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
          # push the new Docker image to the Docker registry
          - docker push appngpl/stackoverflow-question-56065689:$BITBUCKET_COMMIT

Result:


Everything works fine :)

Bitbucket repository: https://bitbucket.org/krzysztof-raciniewski/stackoverflow-question-56065689

Docker Hub image repository: https://hub.docker.com/r/appngpl/stackoverflow-question-56065689

  • Everything works great, but now the question is: is it possible to push such an image to the server where Docker is running? That is, after the build it would go automatically to Docker Hub and to the web server. I have Docker on a VPS: Bitbucket builds my image, and now I'd like it pushed to Docker Hub (which works), and the web server to update its copy of the image and launch the new version of the application.
    – PawelC
    Commented May 10, 2019 at 5:59
  • Yes, it is possible. You need to pull the newest image version, stop the running container, remove the old container, and start a new one. You can read more about this operation here: stackoverflow.com/questions/26734402/…. You can run all these commands from Bitbucket Pipelines over SSH with the -t parameter (more here: garron.me/en/go2linux/ssh-sudo-run-commands-remote-server.html). Remember that you need to add a public RSA key to the repository configuration to allow connections to your VPS from the Bitbucket Pipelines service. Good luck :) Commented May 10, 2019 at 7:30
