Spring Kafka Integration, Partial J2735 2024 Compatibility, and Developer Experience Enhancements #562
Merged
radishco merged 318 commits into usdot-jpo-ode:develop from CDOT-CV:cdot-release_2025-q1 on Jan 23, 2025
Conversation
…ation objects (#113)

- Migrate kafka configuration values to ODEKafkaProperties. This is the first part of the work needed to separate OdeProperties into multiple, more manageable chunks. It introduces the usage of application.yaml and @Autowired in tests to more fully utilize the Spring framework.
- Remove the @Component annotation and accept linter errors. Adding @Component to the *Receivers caused double instantiations and port-binding issues.
- Migrate from application.properties to application.yaml and introduce the host-ip config value.
- Add a make restart target for easier, faster restarts that don't require a full image rebuild.
- Add missing getHostId test for OdeKafkaProperties.
- Move OdeKafkaProperties to the kafka package.
- Correctly declare the restart target as .PHONY in the Makefile.
- Move OdeKafkaPropertiesTest to the correct test package.
- Add tests to OdePropertiesTest for hostIP and remove unused imports and variables.
- Move hostID from OdeKafkaProperties to AppContext to keep the scope of the new class small and directed.
- Correct application.yaml's disabled-topics block.
- Correct formatting in Asn1*JSONTest.java files and fix SonarLint errors. All swaps of argument positions in the mentioned tests align the calls with the definition assert*(expected, actual). Before the swap, the tests called assert*(actual, expected), which produces the same outcome when tests pass but is confusing when they fail, because the failure message swaps the actual and expected values.
- Replace a complex if-else nest with simpler switch statements in the decoded data router.
- Extract BSM properties from OdeProperties.
- Correct application.yaml indentation for the bsm block.
- Reformat GenericReceiver.
- Convert if-else to switch and pull logic out of nested if-blocks where possible.
- Define the topics structure in application.yaml.
- Convert MessagePublisher to an interface and implement it in the existing publishers.
- Add ImporterDirectoryWatcher properties and the necessary refactors to related classes.
- Complete the refactoring of UdpServicesController with the introduction of UDPReceiverProperties.
- Add spring.http.multipart config values to application.yaml.
- Add a TODO for @mcook42 to update README.md with corrected instructions once the OdeProperties refactor is completed.
- Remove the remaining references to kafka properties from OdeProperties.
- Remove dead test code and add a missing constructor param to ImporterDirectoryWatcherTest.java.
- Finalize the application.yaml topics structure and config classes.
- Update asn1decoder flows with the new configuration objects.
- Update asn1encoder flows with the new configuration objects.
- Refactor FileSystemStorageService to use the new configuration properties.
- Delete unused BSMReceiverPropertiesTest.java.
- Correct file-related controllers and watchers for the new configuration objects.
- Update TimDepositController with the new topics configs.
- Correct config usage in ByteArrayPublisher.
- TimTransmorgrifierTest corrections.
- LogFileToAsn1CodecPublisherTest corrections.
- Revert "TimTransmorgrifierTest corrections" (reverts commit 2a27569).
- Trivial JMockit test fixes via @Inject annotations.
- Update the test application.yaml.
- Test application.yaml and OdeKafkaPropertiesTest correction.
- Explicitly initialize OdeDataPublisher in OdeDataPublisherTest to avoid JMockit limitations with Set mocking.
- Adjust ImporterDirectoryWatcherTest.java to confirm directory creation instead of trying to run .run(). It doesn't make much sense to test the scheduledExecutor when we already have a test suite for the ImporterProcessor called by the scheduled executor. JMockit doesn't play nicely (easily) with Spring configuration objects, so in the interest of time this test suite was refactored to confirm the necessary directories are created without JMockit.
- OdeStringPublisherTest corrections to avoid JMockit Set-mocking limitations.
- Replace the unintentionally deleted license header in MessagePublisher.
- Remove unnecessary comments from UdpServicesController.
- Add missing tests for the new configs in the kafka package.
- Reformat LogFileToAsn1CodecPublisher for readability.
- Pull RSUProperties up and refactor tests for clarity in OdePropertiesTest and TimTransmogrifierTest.
- Remove redundant @Value annotations from OdeKafkaProperties.
- Add the missing AbstractUdpReceiverPublisher slf4j annotation.
- Typo fixes, renames, and comment corrections for LogFileToAsn1CodecPublisher and RSUProperties.
- Remove a dead/incorrect comment from TimDepositController.
- Pull kafka env vars into OdeKafkaProperties.
- Update SerializableMessageProducerPoolTest to use Spring and JUnit Jupiter for consistent and easier testing.
- Standardize bean injection of RSU in OdeProperties.
- Allow use of a fixed-instant clock in DateTimeUtils to enable usage in unit tests.
- Extract RsuProperties from OdeProperties and pull the DATA_SIGNING_ENABLED_RSU var into application.yaml.
- Extract SecurityServicesProperties and add SecurityServicesPropertiesValidator validation.
- Test SecurityServicesProperties and add default values from the comment in sample.env.
- Implement CustomConversionServiceConfig to provide conversion of empty environment variables for Integer properties.
- Remove unused KAFKA_TYPE from application.yaml.
- Correct warnings in the Dockerfile (maintainer -> label, and no-cache).
- Add **/.*iml to .gitignore.
- Add validation to OdeKafkaProperties after a runtime exception was thrown during testing.
- Remove all dead references to OdeProperties from various files.
- Update README.md and UserGuide.md with the new app configuration instructions. The dead/outdated configuration table was removed from UserGuide.md in favor of the self-documenting configuration objects and their Validation classes. This keeps the code and the available configuration values up to date automatically, instead of requiring developers to maintain both code and documentation.
- Asn1CommandManagerTest and SecurityServicesPropertiesValidator test updates for better validation.
- Add the @Import annotation to InetPacketSenderTest to ensure EventLogger is available during test runs.
- Correct the OdeKafkaProperties validation.
- Remove the unused ppm.properties file.
- Undo the duplicate-code extraction into a function; it makes the code less clear and is untested, so it's best to return it to how it was.
- Undo star imports in SnmpSession.
- Use assertThrows instead of catching and asserting the exception class in TimTransmogrifierTest.
- Correct a typo in a TimDepositController comment; remove getters and setters from SerializableMessageProducerPool.
- Add the slf4j annotation to ImporterProcessor.
- Use the correct environment variable names for tim-ingest-monitoring in application.yaml.
- Replace EventLogger with the slf4j logger in InetPacketSenderTest.
- Replace EventLogger with the slf4j logger in InetPacketSender.
- Remove unused imports from InetPacketSenderTest.
- Move the OdeTimJsonTMCFiltered topic from OdeProperties to JsonTopics after the merge.
- Move the compression type into OdeKafkaProperties.
- Use defaults from MessageProducer in OdeKafkaProperties for consistency.
- OdeTimJsonTopologyTest: remove unnecessary public access modifiers.
- Remove the old git artifact RSUProperties, superseded by RsuProperties.
SSM Processing Fix
This commit message repeats the configuration-refactor commits listed above, then adds the following new work:

- Create a POC test for MapReceiver to prove we can send a UDP message in integration tests.
- Produce to the correct kafka topic using a test container in MapReceiverIntegrationTest.
- Use embedded kafka to confirm the UDP map receiver sends messages to topic.OdeRawEncodedMAPJson via kafka.
- Read test cases in from a file and compare produced messages with expected inputs (tests not passing; a static time provider is needed).
- Naively pull TestCase into its own class with a static deserialization method.
- Throw InvalidPayloadException when decoding for better debugging and code flow; use the slf4j log annotation.
- Complete initial approval testing for MapReceiverIntegrationTest, with more tests to be added.
- Add a "valid MAP with minimum data" test case and tidy up the test code.
- Use the EmbeddedKafka Spring annotation to allow Spring to manage the bean lifecycle between test runs.
- Use EmbeddedKafka in the Asn1Decode*JSONTests to reduce runtime from ~60s each to under 1s.
- Undo changes to testing log levels.
- Set a static schema version in the setup method to make test runs consistent between the IDE and the CLI.
- Rename to MapReceiverTest for consistency with the other kafka-using tests.
- Remove unused test-containers dependencies from pom.xml.
- Undo debugging changes in MapReceiver and MapReceiverTest.
- Remove unused CSV test data.
- Add JSONEncodedMAP_to_Asn1DecoderInput_Validation.json as valid input data for approval testing, thanks to @drewjj.
- Introduce the SupportedMessageType enum and throw StartFlagNotFoundException instead of returning a magic string from stripDot2Header.
- Add approval tests to the Asn1DecodeMAPJSON flow.
- Improve the javadoc for stripTrailingZeros.
- Rename TestCase.java to ApprovalTestCase for clarity of intent and move TestUDPClient to a new testUtilities package.
- Use static constants for topic names in Asn1DecodeMAPJSONTest.
- Add new approval tests for the Asn1DecodedDataRouterApprovalTest suite.
- Add approval tests for Asn1DecodedDataRouter (MapTx and MapJson).
- Unify tests in Asn1DecodedDataRouterApprovalTest and prepend TEST to topic names in Asn1DecodeMAPJSONTest for unique topic naming, avoiding topic-name conflicts across tests.
- Set the kafka broker address to localhost:9093 in the test application.yaml so the EmbeddedKafka broker doesn't fail to start when a kafka broker is already running in a dev container on 9092.
- Set the kafka broker address to localhost:4242 in the test application.yaml so the EmbeddedKafka broker doesn't fail to start when a kafka broker is already running in a dev container on 9092 or 9093.
- Use ASN.1-encoded hex strings as inputs for MapReceiverTest instead of plain JSON objects.
Co-authored-by: Matt Cook <[email protected]>
Fixed JSON annotations to enable ODE processing of circle geometries
…AsnConverterTest`
- Unify tests in Asn1DecodedDataRouterApprovalTest and prepend TEST to topic names in Asn1DecodeMAPJSONTest for unique topic naming, avoiding topic-name conflicts across tests.
- Add spring.application.name to application.yaml to support Spring Boot kafka consumer and producer automatic naming.
- Add Asn1DecodeMAPJSONListener and KafkaConsumerConfig to use a Spring Kafka listener for Asn1DecodeMAPJSONTest. The test is not passing, but it proves that the new listener consumes the message as expected.
- Comment out outputSchemaVersion in OdeProperties and add the @Import annotation for BuildProperties.class to allow usage of SpringBootTest annotations. BuildProperties wasn't found unless mvn package was run prior to testing; that's slow and unintuitive, so the annotation fixed the issue. Commenting out outputSchemaVersion allowed setting OdeProperties as a Component instead of a Configuration class, since that's essentially what it is, and we don't need the output schema served from a bean in the long run; it can be used as a @Value-annotated field wherever needed instead.
- Use EmbeddedKafkaHolder for a single EmbeddedKafka instance across test cases and migrate Asn1DecodeMAPJSONTest to use it.
- Implement KafkaProducerConfig and XMLOdeObjectSerializer to support producing OdeObjects. Not correct for the use case, but it proves we can use KafkaTemplates alongside hand-rolled producers.
- Set the kafka broker address to localhost:9093 in the test application.yaml so the EmbeddedKafka broker doesn't fail to start when a kafka broker is already running in a dev container on 9092.
- Set the kafka broker address to localhost:4242 in the test application.yaml so the EmbeddedKafka broker doesn't fail to start when a kafka broker is already running in a dev container on 9092 or 9093.
- Set the kafka.consumer and kafka.producer properties in application.yaml to the defaults from MessageConsumer and MessageProducer to enable correct startup connections. This proves we can run Spring Kafka alongside our hand-rolled kafka implementations!
- Increase the TimIngestTrackerProperties default interval from 1 to 10 seconds for test runs to reduce log noise.
- Remove Asn1DecodedDataRouter from Asn1DecodedDataRouterApprovalTest since SpringBootTest now loads the beans into the context automatically.
- Use ASN.1-encoded hex strings as inputs for MapReceiverTest instead of plain JSON objects.
- Replace the hand-rolled producer in MapReceiver with a KafkaTemplate.
- Simplify test setup by using the SpringBootTest annotation and the EmbeddedKafkaBrokerHolder in MapReceiverTest.
- Use the correct test group name when creating the test consumer in MapReceiverTest.
- Comment out MAP routing in Asn1DecodedDataRouter to prove Asn1DecodedDataRouterApprovalTest fails before the Spring Kafka consumer-producer is implemented.
- Move all Topics configuration objects into a subpackage named topics.
- Move Asn1DecodeMAPJSONListener to the kafka.listeners package.
- Add UnsupportedDataTypeException to prevent acknowledgment of MAP consumption by Asn1DecodedDataRouter.
- Add a filtering consumer factory to support processing of MAP messages in Asn1DecodedDataListener (also implemented). By filtering out messages that are not MAP messages, we allow processing of non-MAP messages via the current Asn1DecodedDataRouter paths while also letting Asn1DecodedDataListener process the MAP messages from the same topic (topic.Asn1DecoderOutput).
- Remove commented-out code and update the javadoc for the filter strategy in KafkaConsumerConfig.
- Remove routeMAP in favor of explicitly throwing UnsupportedDataTypeException.
- Change the test application.yaml bootstrap-server port from 9093 to 4242 to match the tests.
- Properly serialize data before publishing in Asn1DecodeMAPJSONListener.
- Move all TopicsTests under the kafka.topics test package for consistency.
- Move Asn1DecodeMAPJSONTest to the kafka package for consistency with what it's testing.
- Use constructor injection instead of field injection in Asn1DecodeMAPJSONListener.
- Remove references to Asn1DecodeMAPJSON from Asn1DecodeMAPJSONTest to prepare for the deletion of Asn1DecodeMAPJSON.
- Delete Asn1DecodeMAPJSON.java and all references now that this data flow has migrated to the Spring Kafka implementation.
- Only create one partition per topic in EmbeddedKafkaHolder.
- Add Awaitility as a test dependency to support async testing.
- Replace the EmbeddedKafka annotation with EmbeddedKafkaHolder for all Asn1Decode*JSONTests.
- Delete the obsolete AsnCodecMessageServiceControllerTest.java. Any test annotated with SpringBootTest will already confirm we can instantiate the AsnCodecMessageServiceController as expected.
- Remove UnsupportedDataTypeException.java from the consumption flows and let kafka's consumer groups manage the separate offsets.
- Inject test queue names with the SpringBootTest annotation in Asn1DecodeMAPJSONTest to support unique test topic naming.
- Inject test queue names with the SpringBootTest annotation in MapReceiverTest to support unique test topic naming and port assignment.
- Delete the disabled AsnCodecRouterServiceControllerTest.java because bean instantiation is already tested by any SpringBootTest-annotated tests.
- Make OdeTimJsonTopology a Spring Component and inject it to properly manage the streams lifecycle. Before this change, whenever we ran any SpringBootTest tests in sequence we would get an INVALID_STATE error and fail tests unrelated to OdeTimJsonTopology. By turning OdeTimJsonTopology into a Component, Spring can manage the singleton lifecycle of OdeTimJsonTopology and its kafka streams.
- Add consumer ids to the listeners and move from property injection to constructor injection in Asn1DecodedDataListener.
- Inject topic names into Asn1DecodedDataRouterApprovalTest to keep the topics under test separate between tests.
- Remove unnecessary annotations from Asn1DecodeMAPJSONTest.
- Convert properties to a local variable in the OdeTimJsonTopology constructor.
- Add the KAFKA_TYPE env variable into OdeKafkaProperties.
- Add ConfluentProperties to OdeKafkaProperties.
- Default to an empty string for kafkaType in OdeKafkaProperties to prevent NPE errors.
- Replace OdeTimJsonTopology.java with the Spring Kafka-provided KStream via KafkaStreamConfig. The Spring Kafka-managed implementation provides SmartLifecycle management, so we don't need to manually manage the start/cleanup steps of the kstreams lifecycle. Before this implementation, our SpringBootTest integration tests could not run in sequence because the streams would not be in a valid state between tests.
- Add confluent properties to the consumer and producer configuration conditionally.
- Implement disabledTopic filtering in the publish steps of Asn1DecodedDataListener and Asn1DecodeMAPJSONListener.
- Correct indentation in TimDepositController.
- Revert "correct indentation in TimDepositController" (reverts commit d12d12c).
- Revert "replace OdeTimJsonTopology.java with Spring Kafka provided KStream via KafkaStreamConfig" (reverts commit 29a7fa1).
- Revert "convert properties to local variable in OdeTimJsonTopology constructor" (reverts commit 9e2b332).
- Revert "make OdeTimJsonTopology a spring Component and inject to properly manage streams lifecycle" (reverts commit 1427534).
- Make streams private final instead of static in OdeTimJsonTopology to better manage the lifecycle.
- Use OdeKafkaProperties.Confluent to set the stream props for OdeTimJsonTopology, removing reliance on late-stage props validation.
- Make the stream properties local to the OdeTimJsonTopology constructor.
- Pass the topic into OdeTimJsonTopology, add a state listener, and make start() internal.
- Use Awaitility to await states in OdeTimJsonTopologyTest and remove testQuery, as it is not unit testable; it requires an integration test with multiple kafka topic interactions.
- Configure a default timeout for Awaitility in Asn1DecodeMAPJSONTest to prevent flakiness.
- Remove the slf4j annotation from MessageProcessor where it is not needed.
- Remove setStaticSchemaVersion from OdeProperties in favor of hard-coding in OdeMsgMetadata for now.
- Remove the null string key in the send method of Asn1DecodedDataListener.
- Replace the environment variable lookup with the configuration value for kafkaType in OdeTimJsonTopology.
- Delete the unnecessary OdePropertiesTest.java.
- Reformat OdeKafkaPropertiesValidatorTest.
- Correctly initialize embedded kafka and the needed topics in the Asn1Decode*JSONTest files.
- Add the missing confluent configuration to odeDatProducerFactory.
- Add the missing confluent configuration to odeDataConsumerFactory.
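The commits above lean heavily on Spring's embedded Kafka broker for integration tests. A minimal sketch of that pattern, assuming spring-kafka-test and JUnit Jupiter on the classpath; the class name, topic name, and fixed port are illustrative, not the actual ODE test code:

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.context.EmbeddedKafka;

@SpringBootTest
// A fixed, non-default port avoids colliding with a real broker already
// running in a dev container on 9092/9093 (the motivation cited above).
@EmbeddedKafka(partitions = 1, topics = {"TEST.topic.Asn1DecoderOutput"}, ports = 4242)
class ListenerSketchTest {

  @Autowired
  KafkaTemplate<String, String> template;

  @Test
  void publishesToEmbeddedBroker() {
    // Produce a message to the embedded broker; a real test would then
    // consume the downstream topic and assert on it, e.g. with Awaitility.
    template.send("TEST.topic.Asn1DecoderOutput", "<OdeAsn1Data>...</OdeAsn1Data>");
  }
}
```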
Switching AdoptOpenJDK builder to Temurin
Signed-off-by: dmccoystephenson <[email protected]>
Signed-off-by: dmccoystephenson <[email protected]>
…-bsm.json` Signed-off-by: dmccoystephenson <[email protected]>
…Use5 nullable Signed-off-by: dmccoystephenson <[email protected]>
Signed-off-by: dmccoystephenson <[email protected]>
… explicitly disable calling super Signed-off-by: dmccoystephenson <[email protected]>
Signed-off-by: dmccoystephenson <[email protected]>
…PATs, MAPs & TIMs Signed-off-by: dmccoystephenson <[email protected]>
dmccoystephenson changed the title from "Spring Kafka Integration, J2735 2024 Compatibility, and Developer Experience Enhancements" to "Spring Kafka Integration, Partial J2735 2024 Compatibility, and Developer Experience Enhancements" on Jan 21, 2025
Signed-off-by: dmccoystephenson <[email protected]>
Support Renamed Fields in J2735 2024 BSM Structures
…or-partial-j2735-support Updated 2025 Q1 Release Notes to Reflect Partial J2735 2024 Compatibility
mcook42 force-pushed the cdot-release_2025-q1 branch from 67d3322 to a9ed9bb on January 22, 2025 at 18:22
The DE_Elevation data element represents the geographic position above or below the reference ellipsoid (typically WGS-84). The number has a resolution of 1 decimeter and represents an asymmetric range of positive and negative values. Any elevation higher than +6143.9 m is represented as +61439. Any elevation lower than -409.5 m is represented as -4095. If the sending device does not know its elevation, it shall encode the Elevation data element with -4096.
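The clamping rules above can be sketched as a small encoder. `Elevation.encode` is a hypothetical helper written for illustration, not part of the ODE codebase:

```java
public class Elevation {
  // J2735 DE_Elevation sentinel for "elevation unknown"
  static final int UNKNOWN = -4096;

  /** Encode an elevation in meters to the DE_Elevation integer (0.1 m units). */
  static int encode(Double meters) {
    if (meters == null) {
      return UNKNOWN;                      // sending device does not know its elevation
    }
    long units = Math.round(meters * 10.0); // 1 decimeter resolution
    if (units > 61439) {
      return 61439;                         // anything above +6143.9 m clamps to +61439
    }
    if (units < -4095) {
      return -4095;                         // anything below -409.5 m clamps to -4095
    }
    return (int) units;
  }
}
```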
Signed-off-by: dmccoystephenson <[email protected]>
…ub.com:CDOT-CV/jpo-ode into mcook42/hotfix/set-big-decimal-serialization-2
…s://github.com/CDOT-CV/jpo-ode into mcook42/hotfix/set-big-decimal-serialization-2
…erialization-2 fix: force BigDecimal serialization to NUMBER format
@dmccoystephenson Is this PR ready for review?
Yes, this PR is now ready for review! @dan-du-car
Note: The |
radishco approved these changes on Jan 23, 2025
PR Details
Description
Problem
Earlier versions of the JPO-ODE faced various challenges related to functionality, maintainability, and compatibility. One major issue was the use of a custom Kafka client, which hindered seamless integration with Spring-based applications.
Additionally, the schemas and models needed to be updated to reflect field renamings in J2735 2024.
The developer experience also faced friction due to outdated documentation and insufficient support for development environments, making it difficult for new developers to onboard.
Configuration management was another pain point, with Kafka properties and environment variables tightly coupled, adding unnecessary complexity.
Furthermore, performance bottlenecks, such as potential data loss in UDP payload handling, were present.
Solution
To address these problems, these changes introduced a range of improvements. The integration process was streamlined by migrating from a custom Kafka client to Spring Kafka, providing better compatibility with Spring applications. Additionally, several services, including the Asn1DecodedDataRouter, UdpReceivers, and AsnCodecMessageServiceController, were refactored to leverage Spring Kafka.
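The shape of that refactor can be sketched as a Spring Kafka listener replacing a hand-rolled consumer loop. This is an illustrative example, not the actual ODE implementation; the class name and output topic are assumptions, while topic.Asn1DecoderOutput and the move to constructor injection come from the PR's commit log:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class DecodedDataListener {

  private final KafkaTemplate<String, String> kafkaTemplate;

  // Constructor injection, as standardized in this PR
  public DecodedDataListener(KafkaTemplate<String, String> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
  }

  // Spring Kafka manages polling, deserialization, and consumer-group
  // offsets; the listener only handles one decoded message at a time.
  @KafkaListener(id = "DecodedDataListener", topics = "topic.Asn1DecoderOutput")
  public void listen(String payload) {
    // ... transform the decoded ASN.1 payload, then republish downstream
    kafkaTemplate.send("topic.OdeMapJson", payload); // hypothetical output topic
  }
}
```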
These changes also focused on compatibility with renamed fields in J2735 2024 by updating BSM/TIM schemas and POJOs. The schema version has been incremented to 8 to accommodate updates to output messages, including breaking changes in the output of TIMs. Additionally, inconsistencies between the TIM and Map schemas have been resolved.
Developer experience was improved with enhanced onboarding documentation, updated README files, a Makefile, and support for devcontainers. The introduction of a Checkstyle linter and GitHub Actions caching for continuous integration (CI/CD) workflows further boosted development efficiency.
Configuration complexity was reduced by extracting Kafka properties and environment variables into Spring Configuration properties. Additionally, alerts were introduced for missing TIM deposits and environment variables. Performance issues were addressed by adding compression and resolving data loss problems in UDP payload processing, thanks to updates to the UdpHexDecoder logic.
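The extraction pattern described here binds related Kafka settings from application.yaml (and therefore from environment variables) into one typed object instead of scattered @Value lookups. A minimal sketch, with an assumed prefix and example fields rather than the exact ODE class:

```java
import org.springframework.boot.context.properties.ConfigurationProperties;

// Registered via @EnableConfigurationProperties or @ConfigurationPropertiesScan;
// Spring Boot populates the fields from ode.kafka.* properties.
@ConfigurationProperties(prefix = "ode.kafka")
public class OdeKafkaProperties {

  private String brokers;          // e.g. set from an ODE_KAFKA_BROKERS env var
  private String producerType;     // "sync" or "async" (illustrative)
  private String compressionType;  // compression moved into this class per the PR

  public String getBrokers() { return brokers; }
  public void setBrokers(String brokers) { this.brokers = brokers; }

  public String getProducerType() { return producerType; }
  public void setProducerType(String producerType) { this.producerType = producerType; }

  public String getCompressionType() { return compressionType; }
  public void setCompressionType(String compressionType) { this.compressionType = compressionType; }
}
```

A companion Validator bean (as the PR adds for several properties classes) can then reject missing or malformed values at startup rather than at first use.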
Finally, producer retries and logging were optimized to improve reliability, and a generated TMC TIM topic was added to enhance message handling.
Note on Compatibility Limitations with J2735 2024
Schema updates have been applied to increment the version of output messages and provide partial compatibility with J2735 2024.
However, several limitations remain.
These limitations will be addressed in future updates.
Related Issue
No related GitHub issues.
Motivation and Context
These updates address emerging challenges and the growing demand for efficient, scalable transportation data management. Adopting Spring Kafka replaces the custom Kafka client, simplifies configuration, enhances integration with Spring, and ensures scalability for new services.
Compliance with renamed fields in J2735 2024 was a key focus, updating BSM/TIM schemas and POJOs to ensure reliable data ingestion & processing.
Developer experience improvements included refined documentation, better tooling, and streamlined CI/CD pipelines to ease onboarding and boost efficiency. Configuration management was simplified by centralizing Kafka properties and environment variables within Spring Configuration.
Performance and stability enhancements addressed issues like UDP data loss and processing inefficiencies, optimizing components like the UdpHexDecoder and adding compression for better reliability & efficiency.
How Has This Been Tested?
These changes were thoroughly tested to ensure the quality of the improvements. Schema and data-flow validation were conducted through approval tests for the MAP data flows and schema. Unit tests were executed to validate updates to the TIM and Map schemas, resolving inconsistencies.
In terms of integration, tests were run to verify the proper functioning of Spring Kafka integration across several services, such as the UdpReceivers, Asn1DecodedDataRouter, and AsnCodecMessageServiceController.
Additionally, the new logging and producer configurations were tested in a GCP development environment to ensure that they performed reliably under varying conditions.
For developer tools, updates to the README, Makefile, and devcontainer were verified to ensure smooth onboarding, while the configuration of the Checkstyle linter in both local environments and CI/CD pipelines was also validated.
To assess performance and stability, tests were carried out to confirm that the fixes for UDP payload handling eliminated data loss, and regression testing was performed to ensure that Kafka configurations remained backward compatible and stable.
Types of changes
Checklist:
ODE Contributing Guide