PO SAAS
The existing system involved a purchasing / procurement system sending a file of purchase orders and line items to a fulfillment request management system. The request management system processed these requests and generated corresponding requests in the enterprise request management system, where they were worked to completion. The new enterprise request management system is far more capable than the existing one, and the decision was made to integrate the purchasing system directly with it. As with most real-world systems, very little documentation existed for the existing integrations, so the development teams needed to discover, design, and prototype a new integration, with the goal of delivering a design document to be used by subsequent development teams.
The existing fulfillment request management system was written in 1999 and took its feeds from mainframe and mid-range class machines, which predominantly used fixed-field-length files. Little documentation existed for this feed, so sample files were copied to the development system and, using what documentation did exist and by reading the source code, a data profile was developed for the existing feed.
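Fixed-field-length records carry no delimiters; each field is recovered by slicing at known offsets. A minimal sketch of that kind of parsing, assuming a hypothetical record layout (the real layout was recovered from sample files and source code):

```python
# Hypothetical fixed-width layout: (field name, start offset, width).
# The actual field names, offsets, and widths are assumptions for
# illustration -- not the recovered production layout.
LAYOUT = [
    ("po_number",   0, 10),
    ("line_item",  10,  4),
    ("part_number", 14, 12),
    ("quantity",   26,  6),
    ("unit_price", 32, 10),
]

def parse_record(line: str) -> dict:
    """Slice one fixed-width record into named, trimmed fields."""
    return {name: line[start:start + width].strip()
            for name, start, width in LAYOUT}

record = parse_record("PO000123450001WIDGET-42A  0000250000012999")
```

Profiling a batch of such parsed records (field lengths, fill rates, value distributions) is what allows a data profile to be built even when the feed is undocumented.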
Using the existing data model as a starting point, the design team developed a new model which the purchasing system could generate and place in an internal SFTP file drop on a regular basis. The toolkit was used to create a new batch configuration which connected to the SFTP site, read in the records, and generated a data profile and data quality report.
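The data profile summarizes what the feed actually contains: how often each field is populated, how long values get, and which values dominate. A minimal sketch of that kind of profiling pass (the field names are simply whatever the parsed records carry; this is an illustration, not the toolkit's actual report format):

```python
from collections import Counter

def profile(records: list[dict]) -> dict:
    """Build a simple per-field data profile: record count,
    fill rate, maximum value length, and value frequencies."""
    stats: dict = {}
    for rec in records:
        for field, value in rec.items():
            s = stats.setdefault(field, {"count": 0, "filled": 0,
                                         "max_len": 0, "values": Counter()})
            s["count"] += 1
            if value:                      # non-empty counts as filled
                s["filled"] += 1
                s["max_len"] = max(s["max_len"], len(value))
            s["values"][value] += 1
    return stats

rows = [{"po_number": "PO001", "quantity": "10"},
        {"po_number": "PO002", "quantity": ""}]
report = profile(rows)
```

A report like this makes gaps visible immediately, e.g. a field that is empty in half the records, which is exactly the kind of finding fed back to the purchasing system team.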
The data quality report applied validation rules to the data stream and notified the purchasing system team when data validation rules were violated. This gave the design team important information about how much validation the integration components needed to perform. The process was iterative: data from the purchasing system was sent regularly, the toolkit was used to inspect the data and generate validation exceptions, and feedback was given to the purchasing system team for remediation. The result was a configuration file of simple validation rules, which was placed in the design documentation. The design documentation now contained the physical data model of the source system and its validation rules.
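Validation rules of this kind pair a field with a predicate and a message, and every violated rule becomes an exception in the quality report. A hedged sketch, with hypothetical rules (the real rules lived in the toolkit's configuration file, not in code):

```python
import re

# Hypothetical rules for illustration: (field, message, predicate).
RULES = [
    ("po_number", "must match POnnnnn",
     lambda v: bool(re.fullmatch(r"PO\d{5}", v or ""))),
    ("quantity", "must be a positive integer",
     lambda v: (v or "").isdigit() and int(v) > 0),
]

def validate(record: dict) -> list[str]:
    """Return one exception message per violated rule; an empty
    list means the record passed all validation rules."""
    return [f"{field}: {message} (got {record.get(field)!r})"
            for field, message, ok in RULES
            if not ok(record.get(field))]

errors = validate({"po_number": "PO12345", "quantity": "0"})
```

Collecting these messages per batch, rather than rejecting on the first failure, is what lets the whole report go back to the source team in one remediation cycle.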
At this point, the file drop integration was at the prototype stage. Problems with batch jobs, file transfers, account access and permissions, error reporting, and notifications were resolved, and data was flowing from the purchasing system through the SFTP servers and into a proxy (the Coyote DX job) for the enterprise request management system. Using a few different writers in the batch job, the team began to experiment with web service calls into the development instances of the enterprise request management system to test different options for creating fulfillment requests. The results were documented and used by the design teams in the creation of the production systems.
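Each writer experiment amounts to mapping a validated purchase-order record into a request payload and POSTing it to the target system. A minimal sketch of the idea; the endpoint URL and the field names on both sides of the mapping are assumptions for illustration, not the enterprise system's actual API:

```python
import json
import urllib.request

def to_fulfillment_request(po: dict) -> dict:
    """Map a validated purchase-order record to a hypothetical
    fulfillment-request payload.  Both field vocabularies here
    are illustrative assumptions."""
    return {
        "shortDescription": f"Fulfill PO {po['po_number']} line {po['line_item']}",
        "item": po["part_number"],
        "quantity": int(po["quantity"]),
    }

def post_request(payload: dict, url: str) -> None:
    """POST the payload to the (hypothetical) request-management endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        resp.read()

payload = to_fulfillment_request(
    {"po_number": "PO00012345", "line_item": "0001",
     "part_number": "WIDGET-42A", "quantity": "25"})
```

Swapping out `to_fulfillment_request` while keeping the transport the same is how different payload options can be tested against the development instances.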
The development team for the enterprise request management system had enough validated design information and a working prototype to quickly (in a single two-week iteration) create a production integration with only two Quality Control defects, both related to the source system sending invalid data. New validation rules were added to the prototype, the documentation was updated, and the fix entered the next iteration.
- Concepts
- Features
- Transform Engine
- Quick Start
- Configuration
- Secrets Vault
- Readers
  - List of Readers
  - Custom Readers
- Writers
  - List of Writers
  - Custom Writers
- Filters
  - Accept
  - Reject
  - Custom Filters
- Tasks
  - List of Tasks
  - Custom Tasks
- Validators
  - List of Validators
  - Custom Validators
- Listeners
  - List of Listeners
  - Custom Listeners
- Transforms
  - List of Transforms
  - Custom Transforms
- Mappers
- Context
- Databases
- Templates
- Logging
- Encryption
- Usage
- Expressions
- Examples