PO SAAS

Overview

The existing system involved a purchasing / procurement system sending a file of purchase orders and line items to a fulfillment request management system. The request management system processed these requests and subsequently generated requests in the enterprise request management system, where they were worked to completion. The new enterprise request management system is far more capable than the existing system, and the decision was made to integrate the purchasing system directly with it. As with most real-world systems, very little documentation existed for the current integrations, so the development teams needed to discover, design, and prototype a new integration, with the goal of delivering a design document to be used by subsequent development teams.

Discover Existing Data Models

The existing fulfillment request management system was written in 1999 and took its feeds from mainframe and mid-range class machines which predominantly used fixed field length files. Little documentation existed for this feed, so sample files were copied to the development system and, using what documentation was available along with the source code, a data profile was developed for the existing feed.
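
As an illustration of the kind of feed involved, the sketch below parses a fixed field length purchase order record in Java. The column offsets and field names are assumptions for the example; the real layout was recovered from the sample files and the 1999 system's source code.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Minimal sketch: parse a fixed field length purchase order feed.
// The column offsets below are hypothetical, not the actual layout.
public class FixedLengthFeedReader {

  record PoLine(String poNumber, String lineNumber, String quantity, String description) {}

  static PoLine parse(String line) {
    return new PoLine(
        slice(line, 0, 10),    // PO number
        slice(line, 10, 15),   // line item number
        slice(line, 15, 23),   // quantity
        slice(line, 23, 63));  // item description
  }

  // Guard against short records so malformed lines surface as data, not exceptions.
  static String slice(String line, int start, int end) {
    if (line.length() <= start) return "";
    return line.substring(start, Math.min(end, line.length())).trim();
  }

  public static void main(String[] args) throws Exception {
    List<String> lines = Files.readAllLines(Path.of(args[0]));
    lines.stream().map(FixedLengthFeedReader::parse).forEach(System.out::println);
  }
}
```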

Create New Data Model

Using the existing data model as a starting point, the design team developed a new model the purchasing system could generate and place in an internal SFTP file drop on a regular basis. The toolkit was used to create a new batch configuration which connected to the SFTP site, read in the records and generated a data profile and data quality report.
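
To show what a data profile of such a feed gathers, here is a minimal sketch, not the toolkit itself: per-field record counts, value length ranges, and empty-field counts. The Map-based record shape and field handling are assumptions for illustration.

```java
import java.util.*;

// Sketch of the statistics a data profile collects for each field in the feed.
public class DataProfile {

  static class FieldStats {
    long count, empty;
    int minLen = Integer.MAX_VALUE, maxLen;

    void observe(String value) {
      count++;
      if (value == null || value.isBlank()) { empty++; return; }
      minLen = Math.min(minLen, value.length());
      maxLen = Math.max(maxLen, value.length());
    }
  }

  final Map<String, FieldStats> stats = new LinkedHashMap<>();

  void observe(Map<String, String> record) {
    record.forEach((field, value) ->
        stats.computeIfAbsent(field, f -> new FieldStats()).observe(value));
  }

  void report() {
    stats.forEach((field, s) -> System.out.printf(
        "%-16s count=%d empty=%d length=%d..%d%n",
        field, s.count, s.empty, s.minLen == Integer.MAX_VALUE ? 0 : s.minLen, s.maxLen));
  }
}
```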

The data quality report involved applying validation rules to the data stream and notifying the purchasing system team when those rules were violated. This gave the design team important information about the amount of validation the integration components needed to perform. The work was iterative: data from the purchasing system was sent regularly, the toolkit was used to inspect the data and generate validation exceptions, and feedback was given to the purchasing system team for remediation. The result was a configuration file with simple validation rules, which was placed in the design documentation. The design documentation now contained the physical data model of the source system and its validation rules.
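
The sketch below illustrates the kind of simple rules captured in that configuration (required fields, numeric quantities, maximum lengths) and how violations can be collected for feedback. The rule wording and record shape are illustrative assumptions, not the toolkit's actual validator syntax.

```java
import java.util.*;
import java.util.function.Predicate;

// Sketch of simple validation rules and violation reporting for one record.
public class ValidationRules {

  record Rule(String description, Predicate<Map<String, String>> test) {}

  static final List<Rule> RULES = List.of(
      new Rule("PO number is required",
          r -> !r.getOrDefault("poNumber", "").isBlank()),
      new Rule("Quantity must be a whole number",
          r -> r.getOrDefault("quantity", "").matches("\\d+")),
      new Rule("Description must be 40 characters or fewer",
          r -> r.getOrDefault("description", "").length() <= 40));

  // Returns the violations for one record; an empty list means the record passed.
  static List<String> validate(Map<String, String> record) {
    List<String> violations = new ArrayList<>();
    for (Rule rule : RULES) {
      if (!rule.test().test(record)) violations.add(rule.description());
    }
    return violations;
  }
}
```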

Prototyping

At this point, the file drop integration was in a prototype stage. Problems with batch jobs, file transfers, account access and permissions, error reporting, and notifications were resolved, and data was flowing from the purchasing system through the SFTP servers and into a proxy (the Coyote DX job) of the enterprise request management system. Using a few different writers in the batch job, the team began to experiment with web service calls into the development instances of the enterprise request management system to test different options for creating fulfillment requests. The results were documented and used by the design teams in the creation of the production systems.
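
For a sense of what those experiments looked like, here is a minimal sketch of a web service call that creates a fulfillment request. The endpoint URL and JSON payload are placeholders; the real calls targeted development instances of the enterprise request management system through the prototype's writers.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch of posting a purchase order line as a fulfillment request.
// The URL and payload are hypothetical placeholders.
public class FulfillmentRequestClient {

  public static void main(String[] args) throws Exception {
    String payload = """
        {"poNumber":"4500012345","lineNumber":"00010","quantity":"3"}""";

    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://erm-dev.example.com/api/requests"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(payload))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());

    // Comparing status codes and response bodies across request options
    // informed the choice of a production approach.
    System.out.println(response.statusCode() + " " + response.body());
  }
}
```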

New System Development

The development team for the enterprise request management system had enough validated design information and a working prototype to quickly (in a single two-week iteration) create a production integration with only two Quality Control defects, both related to the source system sending invalid data. New validation rules were added to the prototype, the documentation was updated, and the fix entered the next iteration.
