Prepare for conductor-based nightly builds #32807

Open
chouquette wants to merge 4 commits into main

Conversation

@chouquette (Contributor) commented on Jan 9, 2025

What does this PR do?

This PR prepares our CI for nightly builds being initiated by conductor.

Motivation

We currently have two nightly builds that run concurrently:

  • One from the regular gitlab scheduled pipeline
  • One initiated by a conductor schedule.

This PR prepares for dropping the gitlab scheduled pipeline.
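
Concretely, the rules that currently match only the GitLab scheduled pipeline are extended so they also match conductor-initiated pipelines. A condensed sketch of the rule change (the full picture is in the CI configuration diff reported further down in this thread):

```yaml
# Before: nightly-only jobs ran solely from the GitLab scheduled pipeline.
#   .on_scheduled_main:
#     - if: $CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "main"

# After: they also run when conductor triggers the pipeline, i.e. a DDR
# workflow ID is set and APPS identifies a beta build.
.on_scheduled_main:
  - if: ($CI_PIPELINE_SOURCE == "schedule" || ($DDR_WORKFLOW_ID != null && $APPS =~ /^beta-build-/)) && $CI_COMMIT_BRANCH == "main"
```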

Describe how you validated your changes

I suppose we'll have to test this live, at the risk of missing a nightly build.
In the worst case, the scheduled pipeline drop can easily be reverted since it's configured through terraform.

Possible Drawbacks / Trade-offs

Additional Notes

Relates to https://github.com/DataDog/gitlab-config/pull/270

@chouquette added the changelog/no-changelog, qa/no-code-change (No code change in Agent code requiring validation), and team/agent-delivery labels on Jan 9, 2025
@chouquette requested review from a team as code owners on January 9, 2025 10:48
@github-actions bot added the short review (PR is simple enough to be reviewed quickly) label on Jan 9, 2025

agent-platform-auto-pr bot commented Jan 9, 2025

[Fast Unit Tests Report]

On pipeline 52544041 (CI Visibility), the following jobs did not run any unit tests:

Jobs:
  • tests_deb-arm64-py3
  • tests_deb-x64-py3
  • tests_flavor_dogstatsd_deb-x64
  • tests_flavor_heroku_deb-x64
  • tests_flavor_iot_deb-x64
  • tests_rpm-arm64-py3
  • tests_rpm-x64-py3
  • tests_windows-x64

If you modified Go files and expected unit tests to run in these jobs, please double-check the job logs. If you think tests should have been executed, reach out to #agent-devx-help.


agent-platform-auto-pr bot commented Jan 9, 2025

Uncompressed package size comparison

Comparison with ancestor 3fa0ffda0c223fc10e85f1a3b59f120ee11de58c

Diff per package
| package | diff | status | size | ancestor | threshold |
|---|---|---|---|---|---|
| datadog-agent-x86_64-rpm | 0.02MB | ⚠️ | 1023.21MB | 1023.19MB | 0.50MB |
| datadog-agent-x86_64-suse | 0.02MB | ⚠️ | 1023.21MB | 1023.19MB | 0.50MB |
| datadog-agent-amd64-deb | 0.02MB | ⚠️ | 1013.89MB | 1013.87MB | 0.50MB |
| datadog-agent-aarch64-rpm | 0.01MB | ⚠️ | 952.38MB | 952.37MB | 0.50MB |
| datadog-agent-arm64-deb | 0.01MB | ⚠️ | 943.08MB | 943.07MB | 0.50MB |
| datadog-heroku-agent-amd64-deb | 0.00MB |  | 506.17MB | 506.17MB | 0.50MB |
| datadog-dogstatsd-amd64-deb | 0.00MB |  | 58.63MB | 58.63MB | 0.50MB |
| datadog-dogstatsd-x86_64-rpm | 0.00MB |  | 58.71MB | 58.71MB | 0.50MB |
| datadog-dogstatsd-x86_64-suse | 0.00MB |  | 58.71MB | 58.71MB | 0.50MB |
| datadog-dogstatsd-arm64-deb | 0.00MB |  | 56.14MB | 56.14MB | 0.50MB |
| datadog-iot-agent-amd64-deb | 0.00MB |  | 113.82MB | 113.82MB | 0.50MB |
| datadog-iot-agent-x86_64-rpm | 0.00MB |  | 113.89MB | 113.89MB | 0.50MB |
| datadog-iot-agent-x86_64-suse | 0.00MB |  | 113.89MB | 113.89MB | 0.50MB |
| datadog-iot-agent-arm64-deb | 0.00MB |  | 109.26MB | 109.26MB | 0.50MB |
| datadog-iot-agent-aarch64-rpm | 0.00MB |  | 109.33MB | 109.33MB | 0.50MB |

Decision

⚠️ Warning


cit-pr-commenter bot commented Jan 9, 2025

Regression Detector

Regression Detector Results

Metrics dashboard
Target profiles
Run ID: 10a6059f-544c-4f80-b12e-af71d17e4ca5

Baseline: 3fa0ffd
Comparison: aa27304
Diff

Optimization Goals: ✅ No significant changes detected

Fine details of change detection per experiment

| perf | experiment | goal | Δ mean % | Δ mean % CI | trials | links |
|---|---|---|---|---|---|---|
|  | quality_gate_logs | % cpu utilization | +1.57 | [-1.68, +4.82] | 1 | Logs |
|  | tcp_syslog_to_blackhole | ingress throughput | +1.38 | [+1.28, +1.48] | 1 | Logs |
|  | quality_gate_idle | memory utilization | +0.36 | [+0.32, +0.40] | 1 | Logs, bounds checks dashboard |
|  | file_to_blackhole_0ms_latency_http2 | egress throughput | +0.02 | [-0.83, +0.87] | 1 | Logs |
|  | file_to_blackhole_100ms_latency | egress throughput | +0.01 | [-0.67, +0.69] | 1 | Logs |
|  | tcp_dd_logs_filter_exclude | ingress throughput | +0.00 | [-0.01, +0.01] | 1 | Logs |
|  | file_to_blackhole_0ms_latency_http1 | egress throughput | +0.00 | [-0.86, +0.86] | 1 | Logs |
|  | uds_dogstatsd_to_api | ingress throughput | +0.00 | [-0.13, +0.13] | 1 | Logs |
|  | file_to_blackhole_300ms_latency | egress throughput | -0.00 | [-0.64, +0.64] | 1 | Logs |
|  | file_to_blackhole_500ms_latency | egress throughput | -0.05 | [-0.83, +0.74] | 1 | Logs |
|  | file_to_blackhole_0ms_latency | egress throughput | -0.05 | [-0.90, +0.79] | 1 | Logs |
|  | file_to_blackhole_1000ms_latency_linear_load | egress throughput | -0.13 | [-0.59, +0.34] | 1 | Logs |
|  | quality_gate_idle_all_features | memory utilization | -0.14 | [-0.23, -0.05] | 1 | Logs, bounds checks dashboard |
|  | file_to_blackhole_1000ms_latency | egress throughput | -0.28 | [-1.05, +0.50] | 1 | Logs |
|  | file_tree | memory utilization | -0.46 | [-0.60, -0.33] | 1 | Logs |
|  | uds_dogstatsd_to_api_cpu | % cpu utilization | -0.62 | [-1.29, +0.06] | 1 | Logs |

Bounds Checks: ✅ Passed

| perf | experiment | bounds_check_name | replicates_passed | links |
|---|---|---|---|---|
|  | file_to_blackhole_0ms_latency | lost_bytes | 10/10 |  |
|  | file_to_blackhole_0ms_latency | memory_usage | 10/10 |  |
|  | file_to_blackhole_0ms_latency_http1 | lost_bytes | 10/10 |  |
|  | file_to_blackhole_0ms_latency_http1 | memory_usage | 10/10 |  |
|  | file_to_blackhole_0ms_latency_http2 | lost_bytes | 10/10 |  |
|  | file_to_blackhole_0ms_latency_http2 | memory_usage | 10/10 |  |
|  | file_to_blackhole_1000ms_latency | memory_usage | 10/10 |  |
|  | file_to_blackhole_1000ms_latency_linear_load | memory_usage | 10/10 |  |
|  | file_to_blackhole_100ms_latency | lost_bytes | 10/10 |  |
|  | file_to_blackhole_100ms_latency | memory_usage | 10/10 |  |
|  | file_to_blackhole_300ms_latency | lost_bytes | 10/10 |  |
|  | file_to_blackhole_300ms_latency | memory_usage | 10/10 |  |
|  | file_to_blackhole_500ms_latency | lost_bytes | 10/10 |  |
|  | file_to_blackhole_500ms_latency | memory_usage | 10/10 |  |
|  | quality_gate_idle | memory_usage | 10/10 | bounds checks dashboard |
|  | quality_gate_idle_all_features | memory_usage | 10/10 | bounds checks dashboard |
|  | quality_gate_logs | lost_bytes | 10/10 |  |
|  | quality_gate_logs | memory_usage | 10/10 |  |

Explanation

Confidence level: 90.00%
Effect size tolerance: |Δ mean %| ≥ 5.00%

Performance changes are noted in the perf column of each table:

  • ✅ = significantly better comparison variant performance
  • ❌ = significantly worse comparison variant performance
  • ➖ = no significant change in performance

A regression test is an A/B test of target performance in a repeatable rig, where "performance" is measured as "comparison variant minus baseline variant" for an optimization goal (e.g., ingress throughput). Due to intrinsic variability in measuring that goal, we can only estimate its mean value for each experiment; we report uncertainty in that value as a 90.00% confidence interval denoted "Δ mean % CI".

For each experiment, we decide whether a change in performance is a "regression" -- a change worth investigating further -- if all of the following criteria are true:

  1. Its estimated |Δ mean %| ≥ 5.00%, indicating the change is big enough to merit a closer look.

  2. Its 90.00% confidence interval "Δ mean % CI" does not contain zero, indicating that if our statistical model is accurate, there is at least a 90.00% chance there is a difference in performance between baseline and comparison variants.

  3. Its configuration does not mark it "erratic".
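
Putting the three criteria together, the per-experiment decision described above can be summarized as follows (a condensed restatement of the listed rules, not the detector's exact implementation):

$$
\text{flag as regression} \iff \bigl|\Delta\text{ mean }\%\bigr| \ge 5\% \;\land\; 0 \notin \mathrm{CI}_{90\%} \;\land\; \neg\,\text{erratic}
$$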

CI Pass/Fail Decision

Passed. All Quality Gates passed.

  • quality_gate_logs, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_logs, bounds check lost_bytes: 10/10 replicas passed. Gate passed.
  • quality_gate_idle_all_features, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_idle, bounds check memory_usage: 10/10 replicas passed. Gate passed.

@@ -12,7 +12,7 @@ internal_kubernetes_deploy_experimental:
when: always
- if: $CI_COMMIT_BRANCH != "main"
when: never
- if: $DDR != "true"
Member

Does it mean DDR will not be defined anymore? If that's the case, could you update this condition as well in this PR?
Thanks in advance

Contributor Author

It is still defined; the intent is just to stop basing our logic on two different variables depending on the location.

I wasn't sure what to do regarding the check you pointed at: in the current setup, reading one, the other, or both is equivalent, but we might want to remove the reference to DDR from there as well.

Member

Yes, I think it's better to be aligned, so if you can embed the modification that checks only the value of DDR_WORKFLOW_ID in this line, that would be great!

Contributor Author

Done!

Member

Thanks a lot!
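
For reference, the change this thread converged on appears in the configuration diff below; condensed to the relevant rule, it reads:

```yaml
# Before: the job was gated on the DDR boolean.
#   - if: $DDR != "true"
#     when: never

# After: the job is gated only on whether a conductor workflow ID is present.
- if: $DDR_WORKFLOW_ID == null
  when: never
```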

@github-actions bot added the medium review (PR review might take time) label and removed the short review (PR is simple enough to be reviewed quickly) label on Jan 9, 2025
@chouquette requested a review from a team as a code owner on January 9, 2025 13:56
agent-platform-auto-pr bot (Contributor) commented

Gitlab CI Configuration Changes

Modified Jobs

.if_scheduled_main
  .if_scheduled_main:
-   if: $CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "main"
+   if: ($CI_PIPELINE_SOURCE == "schedule" || ($DDR_WORKFLOW_ID != null && $APPS =~
+     /^beta-build-/)) && $CI_COMMIT_BRANCH == "main"
.on_scheduled_main
  .on_scheduled_main:
- - if: $CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "main"
+ - if: ($CI_PIPELINE_SOURCE == "schedule" || ($DDR_WORKFLOW_ID != null && $APPS =~
+     /^beta-build-/)) && $CI_COMMIT_BRANCH == "main"
.on_scheduled_main_or_manual
  .on_scheduled_main_or_manual:
- - if: $CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "main"
+ - if: ($CI_PIPELINE_SOURCE == "schedule" || ($DDR_WORKFLOW_ID != null && $APPS =~
+     /^beta-build-/)) && $CI_COMMIT_BRANCH == "main"
    when: always
  - allow_failure: true
    when: manual
build_processed_btfhub_archive
  build_processed_btfhub_archive:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/btf-gen$DATADOG_AGENT_BTF_GEN_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BTF_GEN_BUILDIMAGES
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
-   - if: $CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "main"
+   - if: ($CI_PIPELINE_SOURCE == "schedule" || ($DDR_WORKFLOW_ID != null && $APPS =~
+       /^beta-build-/)) && $CI_COMMIT_BRANCH == "main"
      when: always
    - allow_failure: true
      when: manual
    script:
    - inv -e system-probe.process-btfhub-archive --branch $BTFHUB_ARCHIVE_BRANCH
    - $S3_CP_CMD btfs-x86_64.tar $S3_DD_AGENT_OMNIBUS_BTFS_URI/$BTFHUB_ARCHIVE_BRANCH/btfs-x86_64.tar
      --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers
    - $S3_CP_CMD btfs-arm64.tar $S3_DD_AGENT_OMNIBUS_BTFS_URI/$BTFHUB_ARCHIVE_BRANCH/btfs-arm64.tar
      --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers
    stage: deps_build
    tags:
    - arch:amd64
    variables:
      KUBERNETES_CPU_REQUEST: 32
generate_windows_gitlab_runner_bump_pr
  generate_windows_gitlab_runner_bump_pr:
    image: registry.ddbuild.io/slack-notifier:v27936653-9a2a7db-sdm-gbi-jammy@sha256:c9d1145319d1904fa72ea97904a15200d3cb684324723f9e1700bc02cc85065c
    needs:
    - trigger_auto_staging_release
    rules:
-   - if: $DDR == "true"
+   - if: $DDR_WORKFLOW_ID != null
      when: never
    - if: $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+-v[0-9]+\.[0-9]+\.[0-9]+(-rc\.[0-9]+){0,1}$/
      when: never
    - if: $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+(-rc\.[0-9]+)?$/
    script:
    - 'GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_2
      key_b64) || exit $?; export GITHUB_KEY_B64
  
      GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_2 app_id)
      || exit $?; export GITHUB_APP_ID
  
      GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_2
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
  
      echo "Using GitHub App instance 2"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-notifications.txt
    - $S3_CP_CMD $S3_ARTIFACTS_URI/agent-version.cache .
    - inv -e github.update-windows-runner-version
    stage: trigger_release
    tags:
    - arch:amd64
internal_kubernetes_deploy_experimental
  internal_kubernetes_deploy_experimental:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - artifacts: false
      job: docker_trigger_internal
    - artifacts: false
      job: docker_trigger_internal-ot
    - artifacts: false
      job: docker_trigger_cluster_agent_internal
    - artifacts: false
      job: docker_build_agent7_windows1809
    - artifacts: false
      job: docker_build_agent7_windows2022
    - artifacts: false
      job: docker_build_agent7_windows1809_jmx
    - artifacts: false
      job: docker_build_agent7_windows2022_jmx
    - artifacts: false
      job: docker_build_agent7_windows1809_core
    - artifacts: false
      job: docker_build_agent7_windows2022_core
    - artifacts: false
      job: docker_build_agent7_windows1809_core_jmx
    - artifacts: false
      job: docker_build_agent7_windows2022_core_jmx
    - artifacts: false
      job: k8s-e2e-main
      optional: true
    rules:
    - if: $FORCE_K8S_DEPLOYMENT == "true"
      when: always
    - if: $CI_COMMIT_BRANCH != "main"
      when: never
-   - if: $DDR != "true"
+   - if: $DDR_WORKFLOW_ID == null
      when: never
    - if: $APPS !~ "/^datadog-agent/"
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
    script:
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN write_api)
      || exit $?; export GITLAB_TOKEN
    - inv pipeline.trigger-child-pipeline --project-name DataDog/k8s-datadog-agent-ops
      --git-ref main --variable OPTION_AUTOMATIC_ROLLOUT --variable EXPLICIT_WORKFLOWS
      --variable OPTION_PRE_SCRIPT --variable SKIP_PLAN_CHECK --variable APPS --variable
      BAZEL_TARGET --variable DDR --variable DDR_WORKFLOW_ID --variable TARGET_ENV --variable
      DYNAMIC_BUILD_RENDER_TARGET_FORWARD_PARAMETERS --variable BUNDLE_VERSION_OVERRIDE
    stage: internal_kubernetes_deploy
    tags:
    - arch:amd64
    variables:
      BUNDLE_VERSION_OVERRIDE: v${CI_PIPELINE_ID}-${CI_COMMIT_SHORT_SHA}
      EXPLICIT_WORKFLOWS: //workflows:beta_builds.agents_nightly.staging-deploy.publish,//workflows:beta_builds.agents_nightly.staging-validate.publish,//workflows:beta_builds.agents_nightly.prod-wait-business-hours.publish,//workflows:beta_builds.agents_nightly.prod-deploy.publish,//workflows:beta_builds.agents_nightly.prod-validate.publish,//workflows:beta_builds.agents_nightly.publish-image-confirmation.publish
      OPTION_AUTOMATIC_ROLLOUT: 'true'
      OPTION_PRE_SCRIPT: patch-cluster-images-operator.sh env=ci ${CI_COMMIT_REF_SLUG}-ot-beta-jmx-${CI_COMMIT_SHORT_SHA}
        ${CI_COMMIT_REF_SLUG}-${CI_COMMIT_SHORT_SHA}
      SKIP_PLAN_CHECK: 'true'
notify-slack
  notify-slack:
    image: registry.ddbuild.io/slack-notifier:v27936653-9a2a7db-sdm-gbi-jammy@sha256:c9d1145319d1904fa72ea97904a15200d3cb684324723f9e1700bc02cc85065c
    needs:
    - internal_kubernetes_deploy_experimental
    rules:
    - if: $FORCE_K8S_DEPLOYMENT == "true"
      when: always
    - if: $CI_COMMIT_BRANCH != "main"
      when: never
-   - if: $DDR != "true"
+   - if: $DDR_WORKFLOW_ID == null
      when: never
    - if: $APPS !~ "/^datadog-agent/"
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
    script:
    - export SDM_JWT=$(vault read -field=token identity/oidc/token/sdm)
    - python3 -m pip install -r tasks/requirements.txt
    - inv pipeline.changelog ${CI_COMMIT_SHORT_SHA} || exit $?
    stage: internal_kubernetes_deploy
    tags:
    - arch:amd64
single_machine_performance-nightly-amd64-a7
  single_machine_performance-nightly-amd64-a7:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/docker_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - docker_build_agent7
    rules:
-   - if: $CI_PIPELINE_SOURCE == "schedule" && $CI_COMMIT_BRANCH == "main"
+   - if: ($CI_PIPELINE_SOURCE == "schedule" || ($DDR_WORKFLOW_ID != null && $APPS =~
+       /^beta-build-/)) && $CI_COMMIT_BRANCH == "main"
    script:
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN write_api)
      || exit $?; export GITLAB_TOKEN
    - "if [[ \"$BUCKET_BRANCH\" == \"nightly\" && ( \"$IMG_SOURCES\" =~ \"$SRC_AGENT\"\
      \ || \"$IMG_SOURCES\" =~ \"$SRC_DCA\" || \"$IMG_SOURCES\" =~ \"$SRC_CWS_INSTRUMENTATION\"\
      \ || \"$IMG_VARIABLES\" =~ \"$SRC_AGENT\" || \"$IMG_VARIABLES\" =~ \"$SRC_DCA\"\
      \ || \"$IMG_VARIABLES\" =~ \"$SRC_CWS_INSTRUMENTATION\" ) ]]; then\n  export ECR_RELEASE_SUFFIX=\"\
      -nightly\"\nelse\n  export ECR_RELEASE_SUFFIX=\"${CI_COMMIT_TAG+-release}\"\n\
      fi\n"
    - IMG_VARIABLES="$(sed -E "s#(${SRC_AGENT}|${SRC_DSD}|${SRC_DCA}|${SRC_CWS_INSTRUMENTATION})#\1${ECR_RELEASE_SUFFIX}#g"
      <<<"$IMG_VARIABLES")"
    - IMG_SOURCES="$(sed -E "s#(${SRC_AGENT}|${SRC_DSD}|${SRC_DCA}|${SRC_CWS_INSTRUMENTATION})#\1${ECR_RELEASE_SUFFIX}#g"
      <<<"$IMG_SOURCES")"
    - inv pipeline.trigger-child-pipeline --project-name DataDog/public-images --git-ref
      main --timeout 1800 --variable IMG_VARIABLES --variable IMG_REGISTRIES --variable
      IMG_SOURCES --variable IMG_DESTINATIONS --variable IMG_SIGNING --variable APPS
      --variable BAZEL_TARGET --variable DDR --variable DDR_WORKFLOW_ID --variable TARGET_ENV
      --variable DYNAMIC_BUILD_RENDER_TARGET_FORWARD_PARAMETERS
    stage: dev_container_deploy
    tags:
    - arch:amd64
    variables:
      IMG_DESTINATIONS: 08450328-agent:nightly-${CI_COMMIT_BRANCH}-${CI_COMMIT_SHA}-7-amd64
      IMG_REGISTRIES: internal-aws-smp
      IMG_SIGNING: ''
      IMG_SOURCES: ${SRC_AGENT}:v${CI_PIPELINE_ID}-${CI_COMMIT_SHORT_SHA}-7-amd64
      IMG_VARIABLES: ''
      SRC_AGENT: registry.ddbuild.io/ci/datadog-agent/agent
      SRC_CWS_INSTRUMENTATION: registry.ddbuild.io/ci/datadog-agent/cws-instrumentation
      SRC_DCA: registry.ddbuild.io/ci/datadog-agent/cluster-agent
      SRC_DSD: registry.ddbuild.io/ci/datadog-agent/dogstatsd
trigger_auto_staging_release
  trigger_auto_staging_release:
    dependencies: []
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    rules:
-   - if: $DDR == "true"
-     when: never
    - if: $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+-v[0-9]+\.[0-9]+\.[0-9]+(-rc\.[0-9]+){0,1}$/
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
    script:
    - RELEASE_VERSION="$(inv agent.version --url-safe --omnibus-format)-1" || exit $?;
      export RELEASE_VERSION
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN write_api)
      || exit $?; export GITLAB_TOKEN
    - inv pipeline.trigger-child-pipeline --project-name "DataDog/agent-release-management"
      --git-ref "main" --variable ACTION --variable AUTO_RELEASE --variable BUILD_PIPELINE_ID
      --variable RELEASE_PRODUCT --variable RELEASE_VERSION --variable TARGET_REPO --variable
      TARGET_REPO_BRANCH $NO_FOLLOW
    stage: trigger_release
    tags:
    - arch:amd64
    variables:
      ACTION: promote
      AUTO_RELEASE: 'true'
      BUILD_PIPELINE_ID: $CI_PIPELINE_ID
      RELEASE_PRODUCT: datadog-agent
      TARGET_REPO: staging
      TARGET_REPO_BRANCH: $BUCKET_BRANCH

Changes Summary

| Removed | Modified | Added | Renamed |
|---|---|---|---|
| 0 | 9 | 0 | 0 |

ℹ️ Diff available in the job log.

