[enhancement][docker] update kafka docker file for branch-2.0 (#43180)
### What problem does this PR solve?
<!--
You need to clearly describe your PR in this part:

1. What problem was fixed (ideally with the specific error message) and
how it was fixed.
2. Which behaviors were modified: what the previous behavior was, what
it is now, why it was modified, and what impacts there might be.
3. What features were added and why.
4. Which code was refactored and why.
5. Which functions were optimized and what the difference is before and
after the optimization.

The description of the PR needs to enable reviewers to quickly and
clearly understand the logic of the code modification.
-->

<!--
If there are related issues, please fill in the issue number.
- If you want the issue to be closed after the PR is merged, please use
"close #12345". Otherwise, use "ref #12345"
-->
Issue Number: close #xxx

<!--
If this PR is a follow-up to a previous PR, for example, fixes a bug
introduced by a related PR,
link the PR here.
-->
Related PR: #xxx

Problem Summary:
Fix the routine load test cases on branch-2.0 by changing how the Kafka
topic data is produced: the test cases now produce the topic data from
the regression-test script instead of from run-thirdparties-docker.sh.
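The regression-test side of the change is not shown in this diff, but the new split can be sketched in shell: the docker script only pre-creates topics, while the test harness streams the fixture files itself. A minimal hedged sketch — the `produce_csv_topic` helper, the broker address, and the `DRY_RUN` switch are illustrative, not part of this PR:

```shell
#!/usr/bin/env bash
# Hypothetical helper: stream a CSV fixture into a Kafka topic line by
# line, the way the test harness (rather than run-thirdparties-docker.sh)
# would now feed data. Broker address and DRY_RUN flag are illustrative.
produce_csv_topic() {
    local topic="$1" csv_file="$2" broker="${3:-localhost:19193}"
    while IFS= read -r line; do
        if [[ "${DRY_RUN:-0}" -eq 1 ]]; then
            # Dry-run mode: show what would be sent without needing Kafka.
            echo "would send to ${topic}: ${line}"
        else
            echo "${line}" | /opt/kafka/bin/kafka-console-producer.sh \
                --broker-list "${broker}" --topic "${topic}"
        fi
    done <"${csv_file}"
}
```

For example, `produce_csv_topic basic_data scripts/basic_data.csv` would replay the fixture one record per Kafka message, matching the one-line-per-message behavior of the old docker-side loop.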

### Check List (For Committer)

- Test <!-- At least one of them must be included. -->

    - [x] Regression test
    - [ ] Unit Test
    - [ ] Manual test (add detailed scripts or steps below)
    - [ ] No need to test or manual test. Explain why:
        - [ ] This is a refactor/code format and no logic has been changed.
        - [ ] Previous test can cover this change.
        - [ ] No code files have been changed.
        - [ ] Other reason <!-- Add your reason? -->

- Behavior changed:

    - [x] No.
    - [ ] Yes. <!-- Explain the behavior change -->

- Does this need documentation?

    - [x] No.
    - [ ] Yes. <!-- Add document PR link here. eg:
      apache/doris-website#1214 -->

- Release note

    <!-- bugfix, feat, behavior changed need a release note -->
    <!-- Add one line release note for this PR. -->
    None

### Check List (For Reviewer who merge this PR)

- [ ] Confirm the release note
- [ ] Confirm test cases
- [ ] Confirm document
- [ ] Add branch pick label <!-- Add branch pick label that this PR
should merge into -->

---------

Co-authored-by: 胥剑旭 <[email protected]>
XuJianxu and 胥剑旭 authored Nov 5, 2024
1 parent d700134 commit 463b19a
Showing 29 changed files with 1,496 additions and 381 deletions.
28 changes: 10 additions & 18 deletions docker/thirdparties/run-thirdparties-docker.sh
@@ -261,33 +261,25 @@ if [[ "${RUN_KAFKA}" -eq 1 ]]; then
sed -i "s/doris--/${CONTAINER_UID}/g" "${ROOT}"/docker-compose/kafka/kafka.yaml
sed -i "s/localhost/${IP_HOST}/g" "${ROOT}"/docker-compose/kafka/kafka.yaml
sudo docker compose -f "${ROOT}"/docker-compose/kafka/kafka.yaml --env-file "${ROOT}"/docker-compose/kafka/kafka.env down
start_kafka_producers() {
local container_id="$1"
local ip_host="$2"

declare -a topics=("basic_data" "basic_array_data" "basic_data_with_errors" "basic_array_data_with_errors" "basic_data_timezone" "basic_array_data_timezone" "multi_table_csv" "multi_table_csv1")
create_kafka_topics() {
local container_id="$1"
local ip_host="$2"
local backup_dir=/home/work/pipline/backup_center

declare -a topics=("basic_data" "basic_array_data" "basic_data_with_errors" "basic_array_data_with_errors" "basic_data_timezone" "basic_array_data_timezone")

for topic in "${topics[@]}"; do
while IFS= read -r line; do
docker exec "${container_id}" bash -c "echo '$line' | /opt/kafka/bin/kafka-console-producer.sh --broker-list '${ip_host}:19193' --topic '${topic}'"
done < "${ROOT}/docker-compose/kafka/scripts/${topic}.csv"
echo "docker exec ${container_id} bash -c \"/opt/kafka/bin/kafka-topics.sh --create --bootstrap-server ${ip_host}:19193 --partitions 10 --topic ${topic}\""
docker exec "${container_id}" bash -c "/opt/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --partitions 10 --topic '${topic}'"
done

declare -a json_topics=("basic_data_json" "basic_array_data_json" "basic_array_data_json_by_line" "basic_data_json_by_line" "multi_table_json" "multi_table_json1")

for json_topic in "${json_topics[@]}"; do
echo "${json_topic}"
while IFS= read -r json_line; do
docker exec "${container_id}" bash -c "echo '$json_line' | /opt/kafka/bin/kafka-console-producer.sh --broker-list '${ip_host}:19193' --topic '${json_topic}'"
echo "echo '$json_line' | /opt/kafka/bin/kafka-console-producer.sh --broker-list '${ip_host}:19193' --topic '${json_topic}'"
done < "${ROOT}/docker-compose/kafka/scripts/${json_topic}.json"
done
}

if [[ "${STOP}" -ne 1 ]]; then
sudo docker compose -f "${ROOT}"/docker-compose/kafka/kafka.yaml --env-file "${ROOT}"/docker-compose/kafka/kafka.env up --build --remove-orphans -d
sleep 30s
start_kafka_producers "${KAFKA_CONTAINER_ID}" "${IP_HOST}"
sleep 10s
create_kafka_topics "${KAFKA_CONTAINER_ID}" "${IP_HOST}"
fi
fi

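The `create_kafka_topics` function above only issues create commands and never verifies them. A hedged follow-up check could look like the sketch below — the container ID, broker address, and topic names passed by the caller are assumptions for illustration, not part of this commit:

```shell
#!/usr/bin/env bash
# Hypothetical post-start check: confirm that the expected Kafka topics
# exist inside the container before the regression tests start producing.
check_topics_exist() {
    local container_id="$1" broker="$2"
    shift 2
    local listed
    # kafka-topics.sh --list prints one topic name per line.
    listed=$(docker exec "${container_id}" \
        /opt/kafka/bin/kafka-topics.sh --list \
        --bootstrap-server "${broker}") || return 1
    local topic
    for topic in "$@"; do
        if ! grep -qx "${topic}" <<<"${listed}"; then
            echo "missing topic: ${topic}" >&2
            return 1
        fi
    done
}
```

A caller in the docker script might then run, for example, `check_topics_exist "${KAFKA_CONTAINER_ID}" "${IP_HOST}:19193" basic_data basic_array_data` right after the `sleep`, instead of assuming the fixed delay was long enough.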
5 changes: 5 additions & 0 deletions regression-test/framework/pom.xml
@@ -216,6 +216,11 @@ under the License.
<version>${groovy.version}</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.8.1</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>

Large diffs are not rendered by default.

@@ -0,0 +1 @@
[{"k00": "2", "k01": "[0, 0, 0, 0, 0, 0]", "k02": "[117, 117, 117, 117, 117, 117]", "k03": "[-4744, -4744, -4744, -4744, -4744, -4744]", "k04": "[-1593211961, -1593211961, -1593211961, -1593211961, -1593211961, -1593211961]", "k05": "[-3869640069299678780, -3869640069299678780, -3869640069299678780, -3869640069299678780, -3869640069299678780, -3869640069299678780]", "k06": "[8491817458398170567, 8491817458398170567, 8491817458398170567, 8491817458398170567, 8491817458398170567, 8491817458398170567]", "k07": "[-30948.857, -30948.857, -30948.857, -30948.857, -30948.857]", "k08": "[804341131.229905, 804341131.229905, 804341131.229905, 804341131.229905, 804341131.229905, 804341131.229905]", "k09": "[-74019648, -74019648, -74019648, -74019648, -74019648, -74019648]", "k10": "[13024168, 13024168, 13024168, 13024168, 13024168, 13024168]", "k11": "[2023-08-22, 2023-08-22, 2023-08-22, 2023-08-22, 2023-08-22, 2023-08-22]", "k12": "[2022-09-30 07:47:12, 2022-09-30 07:47:12, 2022-09-30 07:47:12, 2022-09-30 07:47:12, 2022-09-30 07:47:12, 2022-09-30 07:47:12]", "k13": "[2023-04-21, 2023-04-21, 2023-04-21, 2023-04-21, 2023-04-21, 2023-04-21]", "k14": "[2022-11-24 15:07:56, 2022-11-24 15:07:56, 2022-11-24 15:07:56, 2022-11-24 15:07:56, 2022-11-24 15:07:56, 2022-11-24 15:07:56]", "k15": "['g', 'g', 'g', 'g', 'g', 'g']", "k16": "['a', 'a', 'a', 'a', 'a', 'a']", "k17": "['S9JEYFrLN4zr1vX1yPUE6ovSX431nJdCuttpBUOVMrp844vBfHStO7laHNc5sI9MehAi8GbGDGV3t322DPMy7SBlquU5D7jsGISMNpX4IWbn3Yrsl', 'S9JEYFrLN4zr1vX1yPUE6ovSX431nJdCuttpBUOVMrp844vBfHStO7laHNc5sI9MehAi8GbGDGV3t322DPMy7SBlquU5D7jsGISMNpX4IWbn3Yrsl', 'S9JEYFrLN4zr1vX1yPUE6ovSX431nJdCuttpBUOVMrp844vBfHStO7laHNc5sI9MehAi8GbGDGV3t322DPMy7SBlquU5D7jsGISMNpX4IWbn3Yrsl', 'S9JEYFrLN4zr1vX1yPUE6ovSX431nJdCuttpBUOVMrp844vBfHStO7laHNc5sI9MehAi8GbGDGV3t322DPMy7SBlquU5D7jsGISMNpX4IWbn3Yrsl', 'S9JEYFrLN4zr1vX1yPUE6ovSX431nJdCuttpBUOVMrp844vBfHStO7laHNc5sI9MehAi8GbGDGV3t322DPMy7SBlquU5D7jsGISMNpX4IWbn3Yrsl', 
'S9JEYFrLN4zr1vX1yPUE6ovSX431nJdCuttpBUOVMrp844vBfHStO7laHNc5sI9MehAi8GbGDGV3t322DPMy7SBlquU5D7jsGISMNpX4IWbn3Yrsl']"}]
