[Bitbucket] You can run and make anything on Bitbucket Pipeline

Posted on August 31, 2022 (updated November 15, 2022) by nim

Now let's explore Bitbucket Pipelines.

You can watch the video above first.

I already have a post guiding you through Terragrunt:
https://nimtechnology.com/2022/08/26/terraform-using-terragrunt-to-provision-aws-base-on-terraform-module/

Now I will combine Bitbucket Pipelines with Terragrunt to deploy an EC2 instance, using this image:

alpine/terragrunt:1.2.8-eks

Contents

  • 1) Set up and enable bitbucket-pipeline
  • 2) Declare the global environment for bitbucket-pipeline
  • 3) Demo
  • 4) Creating multiple sets of environment variables for different cases on the Bitbucket agent
    • The "occurs multiple times in the pipeline" error
  • 5) Create dropdown menu or selected options in bitbucket-pipeline
  • 6) Manually submit step in bitbucket pipeline
  • 7) Save files/folders inside any step and reuse them again at other steps with the bitbucket pipeline
  • 8) Using an individual image for each step in bitbucket pipeline
  • 9) Multi-line commands in a script
  • 10) Run the command: “docker run …” on bitbucket-pipeline
  • 11) Reuse code in bitbucket-pipeline.
    • 11.1) Reuse scripts
    • 11.2) Reuse scripts (further reading)
  • End) summarize many formulas in bitbucket pipeline

1) Set up and enable bitbucket-pipeline

To enable this feature, note that "Two-step verification is required to enable Pipelines".
Go to your email and confirm it.
Once it looks like this, you are done.
Pipelines is now enabled for your repo.
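
With Pipelines enabled, a minimal bitbucket-pipelines.yml at the repository root is enough to confirm everything works; a quick sketch (the image and the echo text are just placeholders):

image: atlassian/default-image:3

pipelines:
  default:
    - step:
        name: Hello Pipelines
        script:
          - echo "Pipelines is enabled for this repo"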

2) Declare the global environment for bitbucket-pipeline

You add the variables for the Runners.

Since I am running Terragrunt, I add the environment variables as in the picture below!

Here is my directory layout:

bitbucket-pipelines.yml

image: 
  name: alpine/terragrunt:1.2.8-eks
  # aws: 
  #   access-key: $AWS_ACCESS_KEY_ID_dev
  #   secret-key: $AWS_SECRET_ACCESS_KEY_dev

pipelines:
  custom:
    Remote-state:
      - step:
          name: Terraform apply S3
          script:
            - cd terragrunt_s3
            - terragrunt init
            - terragrunt plan
            - terragrunt apply -auto-approve
      - step:
          name: Terraform apply dynamodb
          script:
            - cd terragrunt_dynamodb
            - terragrunt init
            - terragrunt plan
            - terragrunt apply -auto-approve
    PLAN-dev:
      - variables:
          - name: ENV
          - name: AWS_REGION
      - step:
          name: Terraform Plan
          script:
            - cd terragrunt-ec2/${ENV}
            - pwd
            - env
            - export AWS_CONFIG_FILE=credentials    # export it so terragrunt (a child process) can also see it
            - echo "[default]"                              > $AWS_CONFIG_FILE
            - echo "aws_access_key_id=${AWS_ACCESS_KEY_ID}"         >> $AWS_CONFIG_FILE
            - echo "aws_secret_access_key=${AWS_SECRET_ACCESS_KEY}" >> $AWS_CONFIG_FILE
            # - |
            #   cat <<'EOF' >credentials
            #   [default]
            #   aws_access_key_id = $AWS_ACCESS_KEY_ID
            #   aws_secret_access_key = $AWS_SECRET_ACCESS_KEY
            #   EOF
            - cat credentials    # beware: this prints the keys in the build log
            - terragrunt init
            - terragrunt plan
            - terragrunt apply -auto-approve

You simply press Run and fill in a few variables if needed.
Bitbucket then runs the commands you declared.

If you do not know Terragrunt yet, refer to this post:

[Terraform] Using Terragrunt to provision AWS base on Terraform Module

Of course, you can also run this with a Terraform container!
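
For example, you could swap the pipeline image for the official Terraform one (the tag below is just an example):

image: hashicorp/terraform:1.2.8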

3) Demo

I tried provisioning an EC2 instance.
It runs init and all the rest.

It watches the code below and runs it.

4) Creating multiple sets of environment variables for different cases on the Bitbucket agent

I think this case will be common:
you control multiple AWS accounts and create many kinds of resources in them.

Naturally, you want each case to use the appropriate credentials.

In the bitbucket-pipelines.yml file, use this line:

pipelines:
  custom:
    Remote-state:
      - step:
          deployment: aws-nimtechnology

The "occurs multiple times in the pipeline" error

The deployment environment ‘xxx-nimtechnology’ in your bitbucket-pipelines.yml file occurs multiple times in the pipeline. Please refer to our documentation for valid environments and their ordering.

This error occurs when you reuse "deployment: xxx-nimtechnology" in multiple steps belonging to the same pipeline.
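
A sketch of the fix, assuming two deployment environments named aws-dev and aws-prod were created under Repository settings > Deployments, each holding its own credentials; every environment is then referenced at most once per pipeline:

pipelines:
  custom:
    deploy-dev:
      - step:
          name: Apply on the dev account
          deployment: aws-dev     # uses the variables defined for aws-dev
          script:
            - terragrunt apply -auto-approve
    deploy-prod:
      - step:
          name: Apply on the prod account
          deployment: aws-prod    # uses the variables defined for aws-prod
          script:
            - terragrunt apply -auto-approve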

5) Create dropdown menu or selected options in bitbucket-pipeline

Reference link:
https://bitbucket.org/blog/predefine-values-of-custom-pipeline-variables

I find this quite convenient, and it helps newcomers who do not know what to input:
you prepare a few options for them so they make fewer mistakes.

pipelines:
  custom:
    run-test-for-environment:
      - variables:
          - name: Environment
            default: production
            allowed-values:         # optionally restrict variable values
              - dev
              - staging
              - production
      - step: 
          script:
            - echo "environment is $Environment"

6) Manually submit step in bitbucket pipeline

In the UI, you just click Deploy on the step that is waiting.

image: python:3.6.3
 
pipelines:
  default:
    - step:
        name: Build and push to S3
        script:
          - apt-get update
          - apt-get install -y python-dev
          - curl -O https://bootstrap.pypa.io/get-pip.py
          - python get-pip.py
          - pip install awscli
          - aws deploy push --application-name $APPLICATION_NAME --s3-location s3://$S3_BUCKET/test_app_$BITBUCKET_BUILD_NUMBER --ignore-hidden-files
    - step:
        name: Deploy to test
        script:
          - python deploy_to_test.py
    - step:
        name: Deploy to staging
        trigger: manual
        script:
          - python deploy_to_staging.py
    - step:
        name: Deploy to production
        trigger: manual
        script:
          - python deploy_to_production.py

Keep in mind that this manual trigger cannot be configured on the first step:

The first step of your bitbucket-pipelines.yml cannot be manual. Please remove this trigger from your first step or consider using custom pipelines

Links
https://bitbucket.org/blog/pipelines-manual-steps-confidence-deployment-pipeline

7) Save files/folders inside any step and reuse them again at other steps with the bitbucket pipeline

You will see that this pipeline has 2 steps.

If step 1 produces a file and you want to use that file in step 2, you have to use the artifacts feature.

You can refer to the article below:
https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/
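
A minimal sketch, assuming step 1 writes a file under build/ that step 2 then reads:

pipelines:
  default:
    - step:
        name: Produce a file
        script:
          - mkdir -p build
          - echo "built at $BITBUCKET_BUILD_NUMBER" > build/info.txt
        artifacts:            # everything matched here is carried to later steps
          - build/**
    - step:
        name: Consume the file
        script:
          - cat build/info.txt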

8) Using an individual image for each step in bitbucket pipeline

You can specify an image for each step, like this:

pipelines:
  default:
    - step:
        name: Build and test
        image: node:8.6
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy
        image: python:3.5.1
        trigger: manual
        script:
          - python deploy.py

One catch: I had to use public images here.
I have not yet tried how to use a private image.
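
For reference, Bitbucket's documentation describes pulling from a private registry by adding credentials to the image block; a sketch, where the registry, image name, and variable names are all placeholders:

image:
  name: my-registry.example.com/my-team/builder:latest
  username: $REGISTRY_USERNAME
  password: $REGISTRY_PASSWORD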

9) Multi-line commands in a script

Sometimes you need a single command that spans multiple lines; YAML's literal block scalar (the "|" syntax) handles that.

https://tantingli.medium.com/quick-pro-tips-for-bitbucket-pipeline-b4dda132ee3f

image: node:10.15.0
test: &test
    name: Install and Test
    script:
      - cd my-tools
      - npm install
      - npm test
      - npm pack
    artifacts: # defining the artifacts to be passed to each future step.
      # - dist/**
      # - folder/files*.txt
      - my-tools/my-tools-*.tgz
upload: &upload
    name: Upload to S3
    image: your-own-aws-deployment-image:latest
    script:
      - cd my-tools
      - echo "$BITBUCKET_BRANCH"
      - |
        if [ -z "$AWS_ACCESS_KEY_ID" ]; then
          echo "AWS credentials not found; skipping deployment...";
        elif [ -z "$BITBUCKET_BRANCH" ]; then
          echo "Current branch not found; skipping deployment...";
        elif [[ "$BITBUCKET_BRANCH" != dev ]] && [[ "$BITBUCKET_BRANCH" != master ]]; then
          echo "Current branch does not appear to be a valid environment; skipping deployment...";
        else
          echo "all good, ready to upload to s3";
          for f in `ls my-tools-*.tgz`; do
            aws s3 cp "$f" "s3://${MY_TOOLS_BUCKET}/"
          done
        fi
pipelines:
  default:
    - step:
        <<: *test
  branches:
    master:
      - step:
          <<: *test
      - step:
          <<: *upload
          deployment: production
    dev:
      - step:
          <<: *test
      - step:
          <<: *upload
          deployment: dev

10) Run the command: “docker run …” on bitbucket-pipeline

To run the command "docker run ..." on a Bitbucket pipeline, you need to add the following:

pipelines:
  branches:
    master:
      - step:
          name: prepare Aws Credentials
          script:
            - AWS_CREDENTIAL_FILE=credentials
            - echo "[default]"                              > $AWS_CREDENTIAL_FILE
            - echo "aws_access_key_id=${AWS_ACCESS_KEY_ID}"         >> $AWS_CREDENTIAL_FILE
            - echo "aws_secret_access_key=${AWS_SECRET_ACCESS_KEY}" >> $AWS_CREDENTIAL_FILE
          artifacts:
            - credentials
      - step:
          name: Build image
          script:
            - |
              docker run \
              -v $BITBUCKET_CLONE_DIR/credentials:/root/.aws/credentials \
              -v $BITBUCKET_CLONE_DIR:/workspace \
              gcr.io/kaniko-project/executor:debug \
              --dockerfile=/workspace/coding/nim-commit/Dockerfile \
              --context=/workspace/coding/nim-commit \
              --destination=250887682577.dkr.ecr.us-east-1.amazonaws.com/nim-commit:$BITBUCKET_BUILD_NUMBER \
              --use-new-run
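          # the "docker" service below provides the Docker daemon this step needs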
          services:
              - docker
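
If the Docker step runs out of memory, Bitbucket also lets you raise the docker service's memory allocation in a definitions block; a small sketch (3072 is an arbitrary example, the default is 1024 MB):

definitions:
  services:
    docker:
      memory: 3072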

11) Reuse code in bitbucket-pipeline.

11.1) Reuse scripts

definitions:
  scripts:
    - script: &script-build-and-test |-
        yarn
        yarn test

pipelines:
  branches:
    develop:
      - step:
          name: Build and test and deploy
          script:
            - export NODE_ENV=develop
            - *script-build-and-test

11.2) Reuse scripts (further reading)

https://blog.duyet.net/2021/08/bitbucket-pipelines-notes.html
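
Beyond scripts, an entire step can be declared once under definitions and reused with a YAML anchor; a minimal sketch:

definitions:
  steps:
    - step: &build-step
        name: Build
        script:
          - yarn
          - yarn build

pipelines:
  branches:
    develop:
      - step: *build-step
    master:
      - step: *build-step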

End) summarize many formulas in bitbucket pipeline

I found these pages that compile quite a lot of declaration patterns for bitbucket pipelines:

https://balajisblog.com/cheatsheet-for-bitbucket-pipelines/

https://github.com/miso-belica/playground/blob/main/cheatsheets/bitbucket-pipelines.yml

https://blog.programster.org/bitbucket-pipeline-cheatsheet
