[Feature](parallel-result-sink) support async fetch from multiple backends concurrently #47915
base: master
Conversation
Thank you for your contribution to Apache Doris. Please clearly describe your PR:
run buildall
    ReceiverContext context = new ReceiverContext(resultReceivers.get(i), i);
    contexts.add(context);
}
this.executor = Executors.newFixedThreadPool(resultReceivers.size());
How about using a global thread pool with unlimited size, to save thread-creation time?
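The reviewer's suggestion could look like the following sketch: a single process-wide cached pool that grows on demand and reuses idle threads, instead of allocating a fresh fixed pool per query. The class and method names here are illustrative, not Doris code.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical sketch of a shared, effectively unbounded fetch pool.
// A cached pool creates threads only when none are idle, so repeated
// queries stop paying thread-creation cost.
public class SharedFetchPool {
    private static final ExecutorService FETCH_POOL =
            Executors.newCachedThreadPool(r -> {
                Thread t = new Thread(r, "result-fetch");
                t.setDaemon(true); // do not block JVM shutdown
                return t;
            });

    public static ExecutorService get() {
        return FETCH_POOL;
    }
}
```

The trade-off is that an unbounded pool shifts backpressure elsewhere; a bounded shared pool with a queue is a common middle ground.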
and you are not closing the thread pool?
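If a pool is created per query, the cleanup the reviewer is asking about matters: the pool must be shut down when the query finishes, or its threads leak. A minimal sketch of the usual shutdown pattern (names are illustrative, not the PR's code):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative cleanup for a per-query executor: shut down in a
// finally block so threads are released even if fetching throws.
public class PoolCleanupDemo {
    public static void runAndClose(Runnable fetchTask, int nThreads)
            throws InterruptedException {
        ExecutorService executor = Executors.newFixedThreadPool(nThreads);
        try {
            executor.submit(fetchTask);
        } finally {
            executor.shutdown(); // stop accepting new tasks
            if (!executor.awaitTermination(10, TimeUnit.SECONDS)) {
                executor.shutdownNow(); // interrupt stragglers
            }
        }
    }
}
```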
BlockingQueue<Integer> readyOffsets;
int finishedReceivers = 0;

public ResultReceiverConsumer(List<ResultReceiver> resultReceivers) {
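A `readyOffsets` queue of this shape typically works as follows: each receiver offers its own index when it has a batch ready, and the consumer blocks on `take()` to serve whichever receiver finishes first, rather than polling receivers in a fixed order. A minimal standalone sketch (method names are assumptions, not the PR's API):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Demo of completion-order consumption via a blocking queue of
// receiver indices. Receivers push their index when data arrives;
// the consumer takes indices in arrival order.
public class ReadyOffsetDemo {
    static final BlockingQueue<Integer> readyOffsets = new LinkedBlockingQueue<>();

    // Called from a receiver's fetch callback when a batch is ready.
    static void markReady(int receiverIndex) {
        readyOffsets.offer(receiverIndex);
    }

    // Called by the consumer; blocks until some receiver is ready.
    static int nextReady() throws InterruptedException {
        return readyOffsets.take();
    }
}
```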
I suggest writing a unit test for this class. Here is an example written by Cursor; I did not run it, but it may help you:
ResultReceiverConsumerTest.java.txt
What problem does this PR solve?
support async fetch from multiple backends concurrently
If we fetch data from the backends one by one in a fixed order, memory control can trigger a logical deadlock, and the load across backends is also unbalanced.
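The change described above can be illustrated with a small sketch (not Doris code, names are invented): all backend fetches are started concurrently instead of sequentially, so a slow backend no longer blocks the ones queued behind it.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

// Illustration of concurrent fetch: every backend's fetch is in flight
// before any result is awaited, in contrast to a fixed-order loop that
// waits on backend i before even contacting backend i+1.
public class ConcurrentFetchDemo {
    public static List<String> fetchAll(List<String> backends) {
        ExecutorService pool = Executors.newFixedThreadPool(backends.size());
        try {
            List<CompletableFuture<String>> futures = backends.stream()
                    .map(be -> CompletableFuture.supplyAsync(
                            () -> "batch-from-" + be, pool)) // stand-in for an RPC fetch
                    .collect(Collectors.toList());
            // All fetches are already running; join gathers the results.
            return futures.stream().map(CompletableFuture::join)
                    .collect(Collectors.toList());
        } finally {
            pool.shutdown();
        }
    }
}
```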
Check List (For Author)
Test
Behavior changed:
Does this need documentation?
Check List (For Reviewer who merge this PR)