
HADOOP-19354. S3A: S3AInputStream to be created by factory under S3AStore #7214

Open. Wants to merge 6 commits into base: trunk.
Conversation

@steveloughran (Contributor) commented Dec 6, 2024

HADOOP-19354

  • Factory interface with a parameter object creation method
  • Base class AbstractS3AInputStream which all streams extend
  • S3AInputStream subclasses it and has its own factory
  • Production and test code to use it
  • Input stream callbacks pushed down to S3Store
  • S3Store to dynamically choose factory at startup, stop in close()
  • S3Store to implement the factory interface, completing final binding operations (callbacks, stats)
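The factory model in these bullets can be illustrated roughly as follows. This is a hypothetical sketch: the names (StreamParameters, AbstractObjectStream, Store) are illustrative stand-ins, not the actual Hadoop classes.

```java
// Hypothetical sketch of the factory model: a store chooses a factory at
// startup and all stream creation goes through it via a parameter object.
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

final class StreamParameters {            // parameter object for creation
  final String objectKey;
  StreamParameters(String objectKey) { this.objectKey = objectKey; }
}

abstract class AbstractObjectStream {     // base class all streams extend
  final StreamParameters parameters;
  AbstractObjectStream(StreamParameters p) { this.parameters = p; }
  abstract String streamType();
}

final class ClassicStream extends AbstractObjectStream {
  ClassicStream(StreamParameters p) { super(p); }
  String streamType() { return "classic"; }
}

final class Store {                       // chooses the factory at startup
  private final Function<StreamParameters, AbstractObjectStream> factory;

  Store(String streamType) {
    Map<String, Function<StreamParameters, AbstractObjectStream>> factories =
        new HashMap<>();
    factories.put("classic", ClassicStream::new);
    this.factory = factories.get(streamType);
  }

  AbstractObjectStream open(String key) { // creation goes through the factory
    return factory.apply(new StreamParameters(key));
  }
}
```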

How was this patch tested?

S3 London.

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

TODO

VectoredIOContext

VectoredIOContext.build() to freeze setters, add a copy()
method to copy it, which is then used to create the copy
passed down to streams.

(via a private constructor which returns a mutable version)
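A minimal sketch of that freeze-then-copy pattern; the class below is a hypothetical stand-in, not the real VectoredIOContext.

```java
// Hypothetical stand-in for the freeze-then-copy pattern:
// build() freezes the setters; copy() uses a private constructor
// to hand each stream its own mutable copy.
final class VectoredIOContext {
  private int minSeekForVectorReads;
  private boolean frozen;

  VectoredIOContext() { }

  /** Private constructor used by copy(): returns a mutable instance. */
  private VectoredIOContext(int minSeek) {
    this.minSeekForVectorReads = minSeek;
  }

  VectoredIOContext setMinSeek(int v) {
    if (frozen) {
      throw new IllegalStateException("context is frozen");
    }
    this.minSeekForVectorReads = v;
    return this;
  }

  /** build() freezes the setters. */
  VectoredIOContext build() {
    this.frozen = true;
    return this;
  }

  /** copy() creates the mutable copy passed down to streams. */
  VectoredIOContext copy() {
    return new VectoredIOContext(minSeekForVectorReads);
  }

  int getMinSeek() { return minSeekForVectorReads; }
  boolean isFrozen() { return frozen; }
}
```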

Stream capabilities

[ ] doc
[ ] add unit and integration tests through the FS.
[ ] storediag
[ ] bucket-info

IOStats

[ ] thread stats context to be saved in ObjectInputStream

Testing.

[ ] The huge file tests should be tuned so each of the different ones uses a different stream, always.
[ ] use a -Dstream="factory name" to choose factory, rather than the -Dprefetch
[ ] if not set, whatever is in auth-keys gets picked up.
[ ] ConfigurationHelper.resolveEnum() tests
[ ] VectorIO context unit tests for prefetch type

Docs

[ ] stream leaks
[ ] thread IOStats/context resetting

open issues

ITestS3AOpenCost#prefetching probe

@steveloughran (Contributor Author)

Test failure from me pushing the disk allocator down into the store while the test case was not setting the store up:

[ERROR] testInterruptSimplePut[disk-2](org.apache.hadoop.fs.s3a.scale.ITestS3ABlockOutputStreamInterruption)  Time elapsed: 2.421 s  <<< ERROR!
java.lang.NullPointerException
        at org.apache.hadoop.fs.s3a.impl.ErrorTranslation.maybeExtractChannelException(ErrorTranslation.java:267)
        at org.apache.hadoop.fs.s3a.impl.ErrorTranslation.maybeExtractIOException(ErrorTranslation.java:189)
        at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:212)
        at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124)
        at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:376)
        at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468)
        at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:372)
        at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:347)
        at org.apache.hadoop.fs.s3a.WriteOperationHelper.retry(WriteOperationHelper.java:207)
        at org.apache.hadoop.fs.s3a.WriteOperationHelper.putObject(WriteOperationHelper.java:525)
        at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.putObject(S3ABlockOutputStream.java:708)
        at org.apache.hadoop.fs.s3a.S3ABlockOutputStream.close(S3ABlockOutputStream.java:500)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:77)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
        at org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:410)
        at org.apache.hadoop.fs.s3a.scale.ITestS3ABlockOutputStreamInterruption.expectCloseInterrupted(ITestS3ABlockOutputStreamInterruption.java:406)
        at org.apache.hadoop.fs.s3a.scale.ITestS3ABlockOutputStreamInterruption.testInterruptSimplePut(ITestS3ABlockOutputStreamInterruption.java:386)
 

@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from 5a32f16 to 7d76047 Compare December 6, 2024 18:45
@apache apache deleted a comment from hadoop-yetus Jan 1, 2025
@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from a944b86 to 0f01d61 Compare January 3, 2025 17:39
@steveloughran steveloughran marked this pull request as ready for review January 3, 2025 18:08
@apache apache deleted a comment from hadoop-yetus Jan 3, 2025
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 50s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 18 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 39m 58s trunk passed
+1 💚 compile 0m 45s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 0m 35s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 0m 33s trunk passed
+1 💚 mvnsite 0m 40s trunk passed
+1 💚 javadoc 0m 41s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 0m 33s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 1m 8s trunk passed
+1 💚 shadedclient 37m 24s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 37m 45s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 29s the patch passed
+1 💚 compile 0m 36s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 0m 36s the patch passed
+1 💚 compile 0m 27s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 0m 27s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 21s /results-checkstyle-hadoop-tools_hadoop-aws.txt hadoop-tools/hadoop-aws: The patch generated 1 new + 25 unchanged - 0 fixed = 26 total (was 25)
+1 💚 mvnsite 0m 31s the patch passed
-1 ❌ javadoc 0m 30s /results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)
-1 ❌ javadoc 0m 25s /results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)
+1 💚 spotbugs 1m 6s the patch passed
+1 💚 shadedclient 37m 39s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 47s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 36s The patch does not generate ASF License warnings.
130m 4s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/8/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense codespell detsecrets xmllint compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle markdownlint
uname Linux 5978404f578e 5.15.0-124-generic #134-Ubuntu SMP Fri Sep 27 20:20:17 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 0f01d61
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/8/testReport/
Max. process+thread count 623 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/8/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from 0f01d61 to e7e454c Compare January 7, 2025 14:36
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 52s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 18 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 38m 11s trunk passed
+1 💚 compile 0m 46s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 0m 34s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 0m 32s trunk passed
+1 💚 mvnsite 0m 41s trunk passed
+1 💚 javadoc 0m 41s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 0m 32s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 1m 10s trunk passed
+1 💚 shadedclient 37m 49s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 38m 11s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 32s the patch passed
+1 💚 compile 0m 40s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 0m 40s the patch passed
+1 💚 compile 0m 28s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 0m 28s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 20s /results-checkstyle-hadoop-tools_hadoop-aws.txt hadoop-tools/hadoop-aws: The patch generated 11 new + 25 unchanged - 0 fixed = 36 total (was 25)
+1 💚 mvnsite 0m 35s the patch passed
-1 ❌ javadoc 0m 30s /patch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 26s /patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
+1 💚 spotbugs 1m 13s the patch passed
+1 💚 shadedclient 39m 4s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 49s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 36s The patch does not generate ASF License warnings.
130m 26s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/9/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense codespell detsecrets xmllint compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle markdownlint
uname Linux 7aa7731515a7 5.15.0-125-generic #135-Ubuntu SMP Fri Sep 27 13:53:58 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / e7e454c
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/9/testReport/
Max. process+thread count 529 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/9/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@mukund-thakur (Contributor) left a comment

Overall I like the design and refactoring.
One thought: can we make minimal prefetching changes in this PR, focus on the interface and ClassicInputStream, and create a separate PR for all the prefetching stuff?

@@ -993,7 +983,7 @@ private void initThreadPools(Configuration conf) {
unboundedThreadPool.allowCoreThreadTimeOut(true);
executorCapacity = intOption(conf,
EXECUTOR_CAPACITY, DEFAULT_EXECUTOR_CAPACITY, 1);
if (prefetchEnabled) {
if (requirements.createFuturePool()) {
Contributor:

change the name to prefetchRequirements.

Contributor Author:

There are more requirements than just prefetching; e.g. if vector IO support is needed, some extra threads are added to the pool passed down.

@steveloughran (Contributor Author)

I'm just setting this up so it is ready for the analytics stream work. Making sure that prefetch is also covered is my way of validating the factory model, and of confirming that the options need to include things like asking for a shared thread pool and a per-stream thread pool, with the intent that analytics will use those too.

And once I do that, they all need a single base stream class.

For my vector IO resilience PR, once I have this PR in, I'm going to go back to #7105 and make it something which works with all object input streams

  • probe the stream for being "all in memory"; if so just do the reads sequentially, no need to parallelize.
  • if "partially in memory", give implementation that list of ranges and have them split into "all in memory" and "needs retrieval". again, in memory blocks can be filled in immediately (needs a lock on removing cache items)
  • range coalesce
  • sort by largest range first (stops the tail being the bottleneck)
  • queue for reading

read failure

  1. single range: retry
  2. merged range: complete successfully read parts
  3. incomplete parts are split back into their original ranges and reread individually in the same thread, with retries on each

The read failure stuff is essentially in my PR, so maybe we can rebase onto this, merge in and then pull up. Goal: the analytics stream gets vector IO.
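The read-failure steps above can be sketched roughly as follows; all names are hypothetical, not the PR's actual code. A failed read of a coalesced range falls back to re-reading the original sub-ranges individually, with per-range retries.

```java
// Rough sketch (hypothetical names) of the merged-range failure policy:
// a failed coalesced read is split back into its original sub-ranges,
// each reread individually with its own retries.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

final class MergedRangeFallback {
  /** Each range is {offset, length}; read.test(r) returns success. */
  static List<int[]> readMerged(int[] merged, List<int[]> originals,
      Predicate<int[]> read) {
    List<int[]> completed = new ArrayList<>();
    if (read.test(merged)) {              // happy path: one coalesced GET
      completed.add(merged);
      return completed;
    }
    for (int[] range : originals) {       // fall back to the originals
      for (int attempt = 0; attempt < 3; attempt++) {  // retry each range
        if (read.test(range)) {
          completed.add(range);
          break;
        }
      }
    }
    return completed;
  }
}
```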

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 50s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 18 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 39m 17s trunk passed
+1 💚 compile 0m 44s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 0m 35s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 0m 31s trunk passed
+1 💚 mvnsite 0m 41s trunk passed
+1 💚 javadoc 0m 41s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 0m 33s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 1m 8s trunk passed
+1 💚 shadedclient 37m 31s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 37m 53s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 29s the patch passed
+1 💚 compile 0m 36s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 0m 36s the patch passed
+1 💚 compile 0m 27s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 0m 27s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 21s /results-checkstyle-hadoop-tools_hadoop-aws.txt hadoop-tools/hadoop-aws: The patch generated 11 new + 25 unchanged - 0 fixed = 36 total (was 25)
+1 💚 mvnsite 0m 32s the patch passed
-1 ❌ javadoc 0m 29s /patch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 25s /patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
+1 💚 spotbugs 1m 7s the patch passed
+1 💚 shadedclient 37m 7s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 45s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 36s The patch does not generate ASF License warnings.
129m 2s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/10/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense codespell detsecrets xmllint compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle markdownlint
uname Linux 6f6ef8b7b272 5.15.0-124-generic #134-Ubuntu SMP Fri Sep 27 20:20:17 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / c35c915
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/10/testReport/
Max. process+thread count 608 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/10/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

this.ioStatistics = streamStatistics.getIOStatistics();
this.inputPolicy = context.getInputPolicy();
streamStatistics.inputPolicySet(inputPolicy.ordinal());
this.boundedThreadPool = parameters.getBoundedThreadPool();


I see boundedThreadPool is used in S3AInputStream but not in S3APrefetchingInputStream, can we keep boundedThreadPool local to S3AInputStream?

Contributor Author:

Each stream can declare what it wants thread-pool-wise and we will allocate those to it; if a stream doesn't want a pool, it doesn't get one.
That bounded thread pool passed down is the semaphore pool we also use in uploads. It takes a subset of the shared pool, has its own pending queue, and blocks the caller thread when that pending queue is full.

If the analytics stream doesn't currently need it, don't ask for any.

But I do want the vector IO code to be moved out of S3AInputStream so it can work with the superclass and all streams get it. Those vector reads also want a bounded number of threads.
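The semaphore-pool behaviour described above can be illustrated generically. This is a simplified, hypothetical sketch of the idea, not the real SemaphoredDelegatingExecutor.

```java
// Generic illustration of the "semaphore pool" idea: a wrapper that caps
// how many tasks one stream can have active in a shared executor, and
// blocks the caller when the cap is reached.
import java.util.concurrent.Executor;
import java.util.concurrent.Semaphore;

final class BoundedExecutor {
  private final Executor shared;          // the shared pool underneath
  private final Semaphore permits;        // per-stream cap

  BoundedExecutor(Executor shared, int maxActive) {
    this.shared = shared;
    this.permits = new Semaphore(maxActive);
  }

  void submit(Runnable task) {
    permits.acquireUninterruptibly();     // caller blocks when cap reached
    shared.execute(() -> {
      try {
        task.run();
      } finally {
        permits.release();
      }
    });
  }
}
```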


/**
* A stream of data from an S3 object.
* The blase class includes common methods, stores


Nit: spelling base

* This must be re-invoked after replacing the S3Client during test
* runs.
* <p>
* It requires the S3Store to have been instantiated.
* @param conf configuration.
@rajdchak commented Jan 16, 2025:

@param conf is no longer required

* @param sharedThreads Number of shared threads to included in the bounded pool.
* @param streamThreads How many threads per stream, ignoring vector IO requirements.
* @param createFuturePool Flag to enable creation of a future pool around the bounded thread pool.
*/


@param vectorSupported missing

@@ -845,7 +826,7 @@ private S3AFileSystemOperations createFileSystemHandler() {
@VisibleForTesting
protected S3AStore createS3AStore(final ClientManager clientManager,
final int rateLimitCapacity) {
return new S3AStoreBuilder()
final S3AStore st = new S3AStoreBuilder()


Nit: rename variable to meaningful name

@steveloughran (Contributor Author)

@rajdchak thanks for the comments, will address

I do want to pull up the vector IO support, with integration with prefetching and caching.

For the prefetch/caching stream we'd ask for the requested ranges to be split up into

  1. ranges which were wholly in memory: satisfy immediately in current thread (or copier thread?)
  2. ranges which have an active prefetch to wholly satisfy the request: somehow wire prefetching up so as soon as it arrives, range gets the data.
  3. other ranges (not cached, prefetched, or only partially in cache): coalesce as needed, then retrieve; notify the stream that these ranges are being fetched, so there is no need to prefetch them

It'd be good to collect stats on cache hit/miss here, to assess the integration of vector reads with ranges. When a list of ranges comes down, there is less need to infer the next range and prefetch, and I'm not actually sure how important caching becomes. This is why setting Parquet up to use vector IO already appears to give speedups comparable to the published analytics stream benchmarks.

What I want is the best of both worlds: prefetch of row groups from stream inference, and when vector reads come in, satisfy those by returning current/active prefetches, or retrieve new ranges through ranged GET requests.

#7105 is where that will go; I've halted that until this is in. And I'll only worry about that integration with prefetched/cached blocks with the analytics stream.

@ahmarsuhail (Contributor) left a comment

Thanks @steveloughran, looks good to me overall. Just need to allow for the ClientManager to be passed into the factory.

: 0);
// create an executor which is a subset of the
// bounded thread pool.
final SemaphoredDelegatingExecutor pool = new SemaphoredDelegatingExecutor(
Contributor:

Just a clarifying question, what is the benefit of creating a new SemaphoredDelegatingExecutor per stream vs just creating this once?


ok I think I get it, this is basically a way to ensure a single stream instance does not use up too many threads.

@steveloughran (Contributor Author)

@ahmarsuhail will look at it. This is just a rebase, ready for review; the last failure seems to be the VM rather than the code.

@ahmarsuhail (Contributor)

@steveloughran do you want to merge this PR directly into trunk, or should it go in via our feature branch?

ahmarsuhail added a commit that referenced this pull request Jan 27, 2025
@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from 88ee1d2 to b5346a1 Compare January 27, 2025 18:16
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 18m 11s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 18 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 41m 21s trunk passed
+1 💚 compile 0m 42s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 0m 34s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 0m 33s trunk passed
+1 💚 mvnsite 0m 40s trunk passed
+1 💚 javadoc 0m 41s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 0m 33s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 1m 7s trunk passed
+1 💚 shadedclient 38m 24s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 38m 45s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 30s the patch passed
+1 💚 compile 0m 37s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 0m 37s the patch passed
+1 💚 compile 0m 28s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 0m 28s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 20s /results-checkstyle-hadoop-tools_hadoop-aws.txt hadoop-tools/hadoop-aws: The patch generated 1 new + 14 unchanged - 11 fixed = 15 total (was 25)
+1 💚 mvnsite 0m 32s the patch passed
-1 ❌ javadoc 0m 30s /patch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 25s /patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
+1 💚 spotbugs 1m 7s the patch passed
+1 💚 shadedclient 38m 19s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 0m 32s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 35s The patch does not generate ASF License warnings.
148m 18s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/13/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense codespell detsecrets xmllint compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle markdownlint
uname Linux 908740910b43 5.15.0-130-generic #140-Ubuntu SMP Wed Dec 18 17:59:53 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / b5346a1
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/13/testReport/
Max. process+thread count 607 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/13/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@steveloughran (Contributor Author)

Do you think we should fall back if a stream factory fails to load? If factories depend on third-party libraries, those libraries may not be deployed across the cluster.

Good: something works
Bad: you don't know what you've got.

We can/should add an IOStatistics gauge which indicates which stream is in use, and serve it up in both the FS and the stream.

@ahmarsuhail (Contributor)

@steveloughran personally I think we should throw the failure and not have a fallback. Users of both the prefetching input stream and AAL will expect performance benefits from using them, and if failures are not visible, it'll lead people to think those streams aren't any faster.

@steveloughran (Contributor Author)

@ahmarsuhail +1

Now, an unrelated issue: it looks to me like the JUnit changes associated with the Jersey update have stopped tests being discovered in hadoop-aws. I'm rebasing this PR onto the commit before that one just so I can make progress.

Can you check out and build trunk and tell me whether your run of the hadoop-aws unit tests executes any tests, or whether it's my setup? (It happens across both git clones I have of the repo.)

@steveloughran steveloughran changed the title HADOOP-19354. S3AInputStream to be created by factory under S3AStore HADOOP-19354. S3A: S3AInputStream to be created by factory under S3AStore Jan 28, 2025
@ahmarsuhail (Contributor)

@steveloughran I just hit the same issue on my CRT PR, unable to run tests :(

[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO]
[INFO] --- failsafe:3.0.0-M1:integration-test (sequential-integration-tests) @ hadoop-aws ---
[INFO]
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO]
[INFO] --- enforcer:3.5.0:enforce (depcheck) @ hadoop-aws ---
[INFO] Rule 0: org.apache.maven.enforcer.rules.dependency.DependencyConvergence passed
[INFO] Rule 1: org.apache.maven.enforcer.rules.dependency.BannedDependencies passed
[INFO]
[INFO] --- failsafe:3.0.0-M1:verify (default-integration-test) @ hadoop-aws ---
[INFO]
[INFO] --- failsafe:3.0.0-M1:verify (sequential-integration-tests) @ hadoop-aws ---

@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from b5346a1 to 745492d Compare January 28, 2025 16:32
S3 InputStreams are created by a factory class, with the
choice of factory dynamically chosen by the option

  fs.s3a.input.stream.type

Supported values: classic, prefetching, analytics.
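Assuming a standard Hadoop configuration file, selection might look like this illustrative snippet (the option name and values are from the commit message above; the file placement is the usual core-site.xml convention):

```xml
<!-- Illustrative: choose the S3A input stream implementation -->
<property>
  <name>fs.s3a.input.stream.type</name>
  <value>prefetching</value>
</property>
```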

S3AStore

* Manages the creation and service lifecycle of the chosen factory,
  as well as forwarding stream construction requests to the chosen factory.
* Provides the callbacks needed by both the factories and input streams.
* StreamCapabilities.hasCapability(), which is
  relayed to the active factory. This avoids the FS having
  to know what capabilities are available in the stream.
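As a sketch of how selecting a stream type might look in a site configuration (illustrative values only; only the property name `fs.s3a.input.stream.type` comes from this PR):

```xml
<!-- illustrative snippet: select the analytics stream factory -->
<property>
  <name>fs.s3a.input.stream.type</name>
  <value>analytics</value>
  <description>Input stream type: classic, prefetching or analytics.</description>
</property>
```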
@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from 745492d to 9c8e753 Compare January 28, 2025 16:33
Ability to create custom streams (type = custom), which
reads class from "fs.s3a.input.stream.custom.factory".
This is mainly for testing, especially CNFE and similar.

Unit test TestStreamFactories for this.

ObjectInputStreams save and export stream type to assist
these tests too, as it enables assertions on the generated
stream type.

Simplified the logic related to the old prefetch-enabled flag.

If fs.s3a.prefetch.enabled is true, the prefetch stream is returned
and the stream.type option is not used at all. Simpler logic, simpler
docs, fewer support calls.
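A minimal sketch of that precedence rule (class and method names are illustrative, not the actual S3A code):

```java
/** Illustrative precedence check: the legacy prefetch flag wins outright. */
public class StreamTypeResolution {
  enum StreamType { CLASSIC, PREFETCHING, ANALYTICS }

  static StreamType resolve(boolean prefetchEnabled, StreamType configuredType) {
    // if fs.s3a.prefetch.enabled is set, fs.s3a.input.stream.type is ignored
    return prefetchEnabled ? StreamType.PREFETCHING : configuredType;
  }

  public static void main(String[] args) {
    System.out.println(resolve(true, StreamType.ANALYTICS));   // PREFETCHING
    System.out.println(resolve(false, StreamType.ANALYTICS));  // ANALYTICS
  }
}
```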

Parameters supplied to ObjectInputStreamFactory.bind converted
to a parameter object. Allows for more parameters to be added later
if ever required.

ObjectInputStreamFactory returns more requirements to
the store/fs. For this reason
  StreamThreadOptions threadRequirements();
is renamed
  StreamFactoryRequirements factoryRequirements()

VectorIO context changes
* Returned in factoryRequirements()
* existing configuration-reading code moved into
  StreamIntegration.populateVectoredIOContext()
* Streams which don't have custom vector IO, e.g. prefetching,
  can return a minimum seek range of 0.
  This disables range merging in the default PositionedReadable
  implementation, ensuring they are only asked for
  data which will actually be read, leaving prefetch/cache code
  to know exactly what is needed.
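A sketch of why a minimum seek range of 0 disables merging (a simplification of the general vectored-read merge rule, not the actual Hadoop implementation):

```java
/** Simplified merge rule: two sorted ranges are merged only when the gap
 *  between them is non-negative and smaller than the minimum seek distance. */
public class RangeMerge {
  static boolean shouldMerge(long firstEnd, long secondStart, long minSeek) {
    long gap = secondStart - firstEnd;
    // with minSeek == 0 no gap can satisfy gap < 0, so ranges never merge
    return gap >= 0 && gap < minSeek;
  }

  public static void main(String[] args) {
    System.out.println(shouldMerge(100, 150, 1024)); // true: small gap, merge
    System.out.println(shouldMerge(100, 150, 0));    // false: merging disabled
  }
}
```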

Other
 * Draft docs.
 * Stream capability declares stream type
   & is exported through FS too.
   (todo: test, document, add to bucket-info)
 * ConfigurationHelper.resolveEnum() supersedes
   Configuration.getEnum() with
     - case independence
     - fallback is a Supplier&lt;Enum&gt; rather than a simple
       value.
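A minimal sketch of such a resolver (the signature and the fallback-on-unmatched-name behaviour are assumptions for illustration; the real ConfigurationHelper.resolveEnum() works against a Configuration and may differ):

```java
import java.util.Locale;
import java.util.function.Supplier;

/** Case-independent enum lookup with a lazily-evaluated fallback. */
public class EnumResolution {
  static <E extends Enum<E>> E resolveEnum(
      Class<E> enumClass, String value, Supplier<E> fallback) {
    if (value == null || value.trim().isEmpty()) {
      return fallback.get();   // Supplier: default only computed on demand
    }
    try {
      // case independence: normalize before looking up the constant
      return Enum.valueOf(enumClass, value.trim().toUpperCase(Locale.ROOT));
    } catch (IllegalArgumentException e) {
      return fallback.get();   // assumption: unmatched names fall back too
    }
  }

  enum StreamType { CLASSIC, PREFETCHING, ANALYTICS }

  public static void main(String[] args) {
    System.out.println(resolveEnum(StreamType.class, "Prefetching", () -> StreamType.CLASSIC));
    System.out.println(resolveEnum(StreamType.class, null, () -> StreamType.CLASSIC));
  }
}
```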

Change-Id: I2e59300af48042df8173de61d0b3d6139a0ae7fe
@apache apache deleted a comment from hadoop-yetus Jan 30, 2025
@apache apache deleted a comment from hadoop-yetus Jan 30, 2025
@steveloughran

  • big new version, lots of changes
  • not compatible with anyone rebasing, but I think this is stabilising now. Sorry!
  • this PR is based on the last hadoop-trunk where the tests ran

Not fully tested yet. I want to have the stream type passed down as a -D option.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 51s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 1s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 19 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 6m 4s Maven dependency ordering for branch
+1 💚 mvninstall 36m 58s trunk passed
+1 💚 compile 19m 35s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 17m 42s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 4m 39s trunk passed
+1 💚 mvnsite 2m 32s trunk passed
+1 💚 javadoc 2m 7s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 1m 38s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 3m 53s trunk passed
+1 💚 shadedclient 39m 49s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 40m 16s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 32s Maven dependency ordering for patch
+1 💚 mvninstall 1m 27s the patch passed
+1 💚 compile 18m 34s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 18m 34s the patch passed
+1 💚 compile 17m 29s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 17m 29s the patch passed
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 4 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 4m 28s /results-checkstyle-root.txt root: The patch generated 15 new + 13 unchanged - 12 fixed = 28 total (was 25)
+1 💚 mvnsite 2m 29s the patch passed
-1 ❌ javadoc 0m 50s /results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0)
-1 ❌ javadoc 0m 47s /results-javadoc-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga generated 3 new + 0 unchanged - 0 fixed = 3 total (was 0)
+1 💚 spotbugs 4m 8s the patch passed
+1 💚 shadedclient 40m 31s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 20m 28s hadoop-common in the patch passed.
+1 💚 unit 0m 51s hadoop-aws in the patch passed.
+1 💚 asflicense 1m 3s The patch does not generate ASF License warnings.
255m 42s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/16/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint markdownlint
uname Linux 3a4c42bd7f8d 5.15.0-125-generic #135-Ubuntu SMP Fri Sep 27 13:53:58 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / eadf0dd
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/16/testReport/
Max. process+thread count 3140 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/16/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

Change-Id: Ib2053402752b05e8388396909251f6cd59bd9cb7
To avoid confusion when output stream support is pushed down:

* Input stream capabilities are exported via {inputStreamHasCapability()};
  hasCapability() always returns false.

Change-Id: I5836e0c14a6a781b32888baae178883ba47aab9c
TODO: capability probe tests in fs.
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 33s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 19 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 6m 38s Maven dependency ordering for branch
+1 💚 mvninstall 42m 9s trunk passed
-1 ❌ compile 6m 54s /branch-compile-root-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt root in trunk failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ compile 5m 43s /branch-compile-root-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt root in trunk failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-0 ⚠️ checkstyle 0m 28s /buildtool-branch-checkstyle-root.txt The patch fails to run checkstyle in root
-1 ❌ mvnsite 0m 31s /branch-mvnsite-hadoop-common-project_hadoop-common.txt hadoop-common in trunk failed.
-1 ❌ mvnsite 0m 31s /branch-mvnsite-hadoop-tools_hadoop-aws.txt hadoop-aws in trunk failed.
-1 ❌ javadoc 0m 31s /branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-common in trunk failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 31s /branch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in trunk failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 32s /branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-common in trunk failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ javadoc 0m 30s /branch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in trunk failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ spotbugs 0m 31s /branch-spotbugs-hadoop-common-project_hadoop-common.txt hadoop-common in trunk failed.
-1 ❌ spotbugs 0m 31s /branch-spotbugs-hadoop-tools_hadoop-aws.txt hadoop-aws in trunk failed.
+1 💚 shadedclient 5m 54s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 6m 25s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 21s Maven dependency ordering for patch
-1 ❌ mvninstall 0m 23s /patch-mvninstall-hadoop-common-project_hadoop-common.txt hadoop-common in the patch failed.
-1 ❌ mvninstall 0m 23s /patch-mvninstall-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
-1 ❌ compile 0m 23s /patch-compile-root-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt root in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javac 0m 23s /patch-compile-root-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt root in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ compile 0m 23s /patch-compile-root-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt root in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ javac 0m 23s /patch-compile-root-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt root in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 21s /buildtool-patch-checkstyle-root.txt The patch fails to run checkstyle in root
-1 ❌ mvnsite 0m 23s /patch-mvnsite-hadoop-common-project_hadoop-common.txt hadoop-common in the patch failed.
-1 ❌ mvnsite 0m 15s /patch-mvnsite-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
-1 ❌ javadoc 0m 24s /patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-common in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 22s /patch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 15s /patch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-common in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ javadoc 0m 24s /patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ spotbugs 0m 23s /patch-spotbugs-hadoop-common-project_hadoop-common.txt hadoop-common in the patch failed.
-1 ❌ spotbugs 0m 24s /patch-spotbugs-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
+1 💚 shadedclient 5m 37s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 0m 24s /patch-unit-hadoop-common-project_hadoop-common.txt hadoop-common in the patch failed.
-1 ❌ unit 0m 23s /patch-unit-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
+0 🆗 asflicense 0m 23s ASF License check generated no output?
78m 29s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/18/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint markdownlint
uname Linux b7e2b98184e2 5.15.0-125-generic #135-Ubuntu SMP Fri Sep 27 13:53:58 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 7749ad3
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/18/testReport/
Max. process+thread count 175 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/18/console
versions git=2.25.1 maven=3.6.3
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 22m 37s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 19 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 7m 20s Maven dependency ordering for branch
+1 💚 mvninstall 36m 48s trunk passed
+1 💚 compile 19m 38s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 18m 21s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 4m 32s trunk passed
+1 💚 mvnsite 2m 31s trunk passed
+1 💚 javadoc 2m 4s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 1m 38s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 3m 49s trunk passed
+1 💚 shadedclient 38m 33s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 39m 0s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 31s Maven dependency ordering for patch
+1 💚 mvninstall 1m 26s the patch passed
+1 💚 compile 18m 42s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 18m 42s the patch passed
+1 💚 compile 17m 50s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 17m 50s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 4m 34s root: The patch generated 0 new + 13 unchanged - 12 fixed = 13 total (was 25)
+1 💚 mvnsite 2m 30s the patch passed
+1 💚 javadoc 2m 1s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 1m 40s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 4m 7s the patch passed
+1 💚 shadedclient 39m 43s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 1m 47s hadoop-common in the patch passed.
+1 💚 unit 3m 16s hadoop-aws in the patch passed.
+1 💚 asflicense 1m 2s The patch does not generate ASF License warnings.
261m 14s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/17/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint markdownlint
uname Linux bf5d43b660eb 5.15.0-125-generic #135-Ubuntu SMP Fri Sep 27 13:53:58 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 845adb8
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/17/testReport/
Max. process+thread count 581 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/17/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

StreamFactoryRequirements takes a list of requirement flags, and
moves off boolean flags. (incompatible change).

A new flag indicates that the stream may issue requests outside
an audit span. This is used when initializing the audit manager
so out-of-span requests are never rejected.
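A sketch of such a flag-based requirements object (the flag names here are illustrative, not necessarily the exact ones in the patch):

```java
import java.util.EnumSet;

/** Illustrative flag-based requirements object replacing boolean setters. */
public class FactoryRequirements {
  enum Requirement { EXPECT_UNAUDITED_GET_REQUESTS, REQUIRES_FUTURE_POOL }

  private final EnumSet<Requirement> flags;

  FactoryRequirements(Requirement... requirements) {
    this.flags = EnumSet.noneOf(Requirement.class);
    for (Requirement r : requirements) {
      flags.add(r);
    }
  }

  /** Adding a new requirement later only needs a new enum constant. */
  boolean requires(Requirement r) {
    return flags.contains(r);
  }

  public static void main(String[] args) {
    FactoryRequirements req =
        new FactoryRequirements(Requirement.EXPECT_UNAUDITED_GET_REQUESTS);
    System.out.println(req.requires(Requirement.EXPECT_UNAUDITED_GET_REQUESTS)); // true
    System.out.println(req.requires(Requirement.REQUIRES_FUTURE_POOL));          // false
  }
}
```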

The requirement on vector IO is removed, instead set the #of
required threads in VectoredIOContext to zero (may revisit this)

VectoredIOContext moved to streams package and build() operation
makes immutable

* factory.bind() can throw an IOE

Change-Id: If2e121e6d3c0c6d19ac6f4b8452752a8bfafd3d1

@ahmarsuhail ahmarsuhail left a comment


Thanks @steveloughran,

I had a question about serviceStop()/close(): it's not clear at which point the stream factories are actually closed. For AAL this is quite important, as the individual stream does not hold any resources, but the factory does, so we need to call close() on the factory to release its resources on FS close.

I was also thinking about the lazy-creation discussion, and wondering if we should make the factory init lazy in S3AStore.serviceInit() and only create it on the first executeOpen(), rather than asking implementations' bind() to be lazy. I know this isn't trivial, so maybe not worth it.

@@ -4354,22 +4257,25 @@ public void close() throws IOException {
protected synchronized void stopAllServices() {
try {
trackDuration(getDurationTrackerFactory(), FILESYSTEM_CLOSE.getSymbol(), () -> {
closeAutocloseables(LOG, store);
closeAutocloseables(LOG, getStore());

@steveloughran can you explain the lifecycle of things with the Service stuff?

We're doing closeAutocloseables(LOG, getStore()); here, but Store doesn't actually implement AutoCloseable, so I'm not sure if we need to do something here (call serviceStop()?).

I also need to figure out what changes to make to close our s3SeekableInputStreamFactory. Prior to this change, I was passing our AAL factory into this closeAutocloseables() call, which would invoke s3SeekableInputStreamFactory.close().

@Override
public boolean hasPathCapability(final Path path, final String capability) {
switch (toLowerCase(capability)) {
case StreamCapabilities.IOSTATISTICS:

AAL doesn't have IOStatistics yet, so this should probably not return true without checking.

Comment on lines +76 to +77
case StreamCapabilities.IOSTATISTICS:
case StreamStatisticNames.STREAM_LEAKS:

Similar to the above: I think we should let the stream factories define what capabilities they have, rather than doing it in the parent class, as AAL does not have IOStatistics or stream-leak tracking right now.

* and {@code start()}.
* @param factoryBindingParameters parameters for the factory binding
*/
void bind(FactoryBindingParameters factoryBindingParameters);

Discussed offline: this should throw IOException, as createAsyncClient() throws IOException.

@github-actions github-actions bot added the build label Feb 5, 2025
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 53s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 xmllint 0m 0s xmllint was not available.
+0 🆗 markdownlint 0m 0s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 22 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 6m 33s Maven dependency ordering for branch
+1 💚 mvninstall 37m 6s trunk passed
+1 💚 compile 22m 58s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 20m 38s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 4m 35s trunk passed
+1 💚 mvnsite 2m 32s trunk passed
+1 💚 javadoc 2m 5s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 1m 38s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 3m 49s trunk passed
+1 💚 shadedclient 38m 59s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 39m 28s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 32s Maven dependency ordering for patch
+1 💚 mvninstall 1m 29s the patch passed
+1 💚 compile 21m 20s the patch passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javac 21m 20s the patch passed
+1 💚 compile 19m 48s the patch passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 javac 19m 48s the patch passed
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 2 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 4m 35s /results-checkstyle-root.txt root: The patch generated 7 new + 14 unchanged - 12 fixed = 21 total (was 26)
+1 💚 mvnsite 2m 30s the patch passed
-1 ❌ javadoc 0m 50s /patch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 47s /patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
+1 💚 spotbugs 4m 11s the patch passed
+1 💚 shadedclient 40m 2s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 3m 8s hadoop-common in the patch passed.
+1 💚 unit 3m 19s hadoop-aws in the patch passed.
+1 💚 asflicense 1m 3s The patch does not generate ASF License warnings.
251m 47s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/19/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint markdownlint
uname Linux f6409ac6e5d0 5.15.0-125-generic #135-Ubuntu SMP Fri Sep 27 13:53:58 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 88d31d4
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/19/testReport/
Max. process+thread count 644 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/19/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

This is really hard to do; I've had to roll back the initial
attempt because it mandated a loop in init:

auditor -> request factory -> store -> stream -> requirements

Proposed: allow the request factory to have its handler callback
updated after creation.

It is nominally possible to select a stream factory through maven

-Dstream=prefetch

However, this isn't being picked up, as can be seen with runs of
-Dstream=custom
-Dstream=unknown

These MUST fail; they currently don't.

Change-Id: I8343c8c9bf0a8cd1b353c7c8f4cecf7f569a4a28
@steveloughran steveloughran force-pushed the s3/HADOOP-19354-s3a-inputstream-factory branch from 88d31d4 to 677eb50 Compare February 6, 2025 16:20
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 51s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 1s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 xmllint 0m 1s xmllint was not available.
+0 🆗 markdownlint 0m 1s markdownlint was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 22 new or modified test files.
_ trunk Compile Tests _
+0 🆗 mvndep 6m 49s Maven dependency ordering for branch
+1 💚 mvninstall 35m 57s trunk passed
+1 💚 compile 19m 14s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 compile 17m 36s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 checkstyle 4m 41s trunk passed
+1 💚 mvnsite 2m 35s trunk passed
+1 💚 javadoc 2m 7s trunk passed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04
+1 💚 javadoc 1m 38s trunk passed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
+1 💚 spotbugs 3m 50s trunk passed
+1 💚 shadedclient 39m 36s branch has no errors when building and testing our client artifacts.
-0 ⚠️ patch 40m 4s Used diff version of patch file. Binary files and potentially other changes not applied. Please rebase and squash commits if necessary.
_ Patch Compile Tests _
+0 🆗 mvndep 0m 32s Maven dependency ordering for patch
-1 ❌ mvninstall 0m 21s /patch-mvninstall-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
-1 ❌ compile 17m 38s /patch-compile-root-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt root in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javac 17m 38s /patch-compile-root-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt root in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ compile 16m 48s /patch-compile-root-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt root in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ javac 16m 48s /patch-compile-root-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt root in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ blanks 0m 0s /blanks-eol.txt The patch has 2 line(s) that end in blanks. Use git apply --whitespace=fix <<patch_file>>. Refer https://git-scm.com/docs/git-apply
-0 ⚠️ checkstyle 4m 35s /results-checkstyle-root.txt root: The patch generated 6 new + 15 unchanged - 12 fixed = 21 total (was 27)
-1 ❌ mvnsite 0m 44s /patch-mvnsite-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
-1 ❌ javadoc 0m 50s /patch-javadoc-hadoop-tools_hadoop-aws-jdkUbuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.txt hadoop-aws in the patch failed with JDK Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04.
-1 ❌ javadoc 0m 45s /patch-javadoc-hadoop-tools_hadoop-aws-jdkPrivateBuild-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.txt hadoop-aws in the patch failed with JDK Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga.
-1 ❌ spotbugs 0m 41s /patch-spotbugs-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
+1 💚 shadedclient 41m 52s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 5m 17s hadoop-common in the patch passed.
-1 ❌ unit 0m 42s /patch-unit-hadoop-tools_hadoop-aws.txt hadoop-aws in the patch failed.
+1 💚 asflicense 1m 1s The patch does not generate ASF License warnings.
236m 49s
Subsystem Report/Notes
Docker ClientAPI=1.47 ServerAPI=1.47 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/20/artifact/out/Dockerfile
GITHUB PR #7214
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets xmllint markdownlint
uname Linux 732b363fd39c 5.15.0-130-generic #140-Ubuntu SMP Wed Dec 18 17:59:53 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 677eb50
Default Java Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.25+9-post-Ubuntu-1ubuntu120.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_432-8u432-gaus1-0ubuntu220.04-ga
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/20/testReport/
Max. process+thread count 607 (vs. ulimit of 5500)
modules C: hadoop-common-project/hadoop-common hadoop-tools/hadoop-aws U: .
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-7214/20/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.
