
#389 Add Spark application id to jobs that fail before running a task. #391

Merged 2 commits from feature/389-spark-app-id-for-failed-jobs into main on Apr 5, 2024

Conversation

yruslan (Collaborator) commented Apr 5, 2024

Closes #389

Adds the Spark application id to the notification even for pipelines that fail before running any tasks.

[Screenshot 2024-04-04 at 15:33:42]
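The idea behind the fix can be sketched as follows: capture the Spark application id into the pipeline state as soon as the session exists, rather than only when a task runs, so failure notifications still have it. This is a minimal, hypothetical illustration, not the actual code from this PR; `SparkSessionStub`, `PipelineState`, and `initPipeline` are stand-in names (real code would read `spark.sparkContext.applicationId`).

```scala
// Hypothetical sketch of the approach, not the PR's implementation.
object AppIdSketch {
  // Stand-in for a SparkSession; avoids a Spark dependency in this sketch.
  final case class SparkSessionStub(applicationId: String)

  // Minimal pipeline state carrying an optional Spark application id.
  final case class PipelineState(sparkAppId: Option[String] = None) {
    def withSparkAppId(id: String): PipelineState = copy(sparkAppId = Some(id))
  }

  // Record the id at startup, before any task executes, so it is
  // available to notifications even if every task fails to start.
  def initPipeline(spark: SparkSessionStub, state: PipelineState): PipelineState =
    state.withSparkAppId(spark.applicationId)

  def main(args: Array[String]): Unit = {
    val state = initPipeline(SparkSessionStub("app-20240405-0001"), PipelineState())
    println(state.sparkAppId.getOrElse("unknown"))
  }
}
```

Capturing the id eagerly means the notification builder only ever reads it from the state object and never has to reach back into a possibly failed Spark session.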


github-actions bot commented Apr 5, 2024

Unit Test Coverage

| File | Coverage [88.23%] |
| --- | --- |
| PipelineNotification.scala | 100% 🍏 |
| PipelineNotificationDirector.scala | 100% 🍏 |
| PipelineNotificationBuilderHtml.scala | 90.46% 🍏 |
| PipelineStateImpl.scala | 83.27% 🍏 |
| AppRunner.scala | 81.05% 🍏 |
| Total Project Coverage | 83.34% 🍏 |

@yruslan yruslan marked this pull request as ready for review April 5, 2024 13:25
@yruslan yruslan merged commit a522924 into main Apr 5, 2024
8 checks passed
@yruslan yruslan deleted the feature/389-spark-app-id-for-failed-jobs branch April 5, 2024 13:26
@yruslan yruslan mentioned this pull request Apr 16, 2024
Labels: None yet
Projects: None yet

Development
Successfully merging this pull request may close these issues.

Spark application Id is not shown when no tasks are executed
1 participant