diff --git a/.github/workflows/rust-test.yml b/.github/workflows/rust-test.yml
index 87547f3..031151b 100644
--- a/.github/workflows/rust-test.yml
+++ b/.github/workflows/rust-test.yml
@@ -48,7 +48,7 @@ jobs:
         run: sudo apt-get install -y libkrb5-dev libgsasl-dev
 
       - name: build and lint with clippy
-        run: cargo clippy --tests --features kerberos,token,integration-test,rs
+        run: cargo clippy --tests --features kerberos,token,integration-test
 
       - name: Check docs
         run: cargo doc
@@ -66,7 +66,7 @@ jobs:
         run: cargo check --features object_store
 
       - name: Check all features
-        run: cargo check --features kerberos,token,object_store,rs,integration-test
+        run: cargo check --features kerberos,token,object_store,integration-test
 
   test:
     strategy:
@@ -109,4 +109,4 @@ jobs:
           echo "$GITHUB_WORKSPACE/hadoop-3.3.6/bin" >> $GITHUB_PATH
 
       - name: Run tests
-        run: cargo test --features kerberos,token,object_store,integration-test,rs
+        run: cargo test --features kerberos,token,object_store,integration-test
diff --git a/README.md b/README.md
index 154115d..3c2b47c 100644
--- a/README.md
+++ b/README.md
@@ -72,7 +72,7 @@ cargo build --features token,kerberos
 The tests are mostly integration tests that utilize a small Java application in `rust/mindifs/` that runs a custom `MiniDFSCluster`. To run the tests, you need to have Java, Maven, Hadoop binaries, and Kerberos tools available and on your path. Any Java version between 8 and 17 should work.
 
 ```bash
-cargo test -p hdfs-native --features token,kerberos,rs,intergation-test
+cargo test -p hdfs-native --features token,kerberos,integration-test
 ```
 
 ### Python tests