-
Some notes about the current implementation:
-
Please review the proposed design document addressing failed email retries.
-
Hello, I'm recapping a meeting just held with @kristin-at-theta @Dark-Knight-1313 @benjaminpaige @mdial89f @Valencia2019 @bflynn-cms. We spent about an hour discussing various approaches to sending emails. The consensus leaving the call was:

High level architecture: Email lambda subscribed to Kafka topic

Details below.

Questions and Considerations:
High level architecture options

These options speak to how we're going to organize the appliance that sends email, how it will get its template and event data, and how it's triggered.

Synchronous Invocation from lambda

This solution invokes SES from the lambda backing the action's API endpoint. For instance, responding to an RAI hits an API endpoint, which triggers a lambda. In this scenario, that lambda would generate the email from the event data and invoke SES to send it before exiting (see the sketch after the cons list below). Pros:
Cons:
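For concreteness, here's a minimal sketch of the synchronous option, assuming a Node/TypeScript lambda and the AWS SDK v3. The handler shape, the sender address, and helpers like respondToRai are illustrative, not from the codebase.

```typescript
import { SESClient, SendEmailCommand } from "@aws-sdk/client-ses";

const ses = new SESClient({});

export const handler = async (event: { body: string }) => {
  const action = JSON.parse(event.body);

  // 1. Do the action's business logic first (persist the RAI response, etc.)
  // await respondToRai(action); // hypothetical helper

  // 2. Generate and send the email in the same invocation, before returning.
  await ses.send(
    new SendEmailCommand({
      Source: "no-reply@example.com", // assumed verified SES identity
      Destination: { ToAddresses: [action.submitterEmail] },
      Message: {
        Subject: { Data: `RAI response received for ${action.packageId}` },
        Body: { Text: { Data: `An RAI response was submitted for ${action.packageId}.` } },
      },
    })
  );

  return { statusCode: 200, body: JSON.stringify({ message: "success" }) };
};
```

Note the send sits on the API request path, so SES latency or failure is felt directly by the caller.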
Decoupling with SQS or SNS

This would have the API action lambda doing its business logic for the event, then sending that event data to an SQS queue or SNS topic. The event would then be processed and sent out by a lambda built specifically for email generation and sending (see the sketch after the cons list below). Pros:
Cons:
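A rough sketch of the decoupled shape, assuming SQS and the AWS SDK v3. The EMAIL_QUEUE_URL env var and the renderAndSend helper are hypothetical.

```typescript
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";
import type { SQSEvent } from "aws-lambda";

const sqs = new SQSClient({});

// In the action lambda: enqueue the event data after business logic succeeds.
export const publishEmailEvent = async (eventData: Record<string, unknown>) => {
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: process.env.EMAIL_QUEUE_URL, // assumed env var
      MessageBody: JSON.stringify(eventData),
    })
  );
};

// In the email lambda: render and send one email per queued event.
export const emailHandler = async (event: SQSEvent) => {
  for (const record of event.Records) {
    const eventData = JSON.parse(record.body);
    // await renderAndSend(eventData); // templating + SES send, as in the synchronous sketch
  }
};
```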
Email lambda subscribed to Kafka topic

Here, a lambda built specifically to generate and send emails is created. The lambda is subscribed to the onemac Kafka topic as an event source. When the lambda is triggered by events passed to it by AWS, it generates the email from the event data and template, and sends it (see the sketch after the cons list below). Retry would entail taking a past event from package activity and sending it back to the email lambda. Pros:
Cons:
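A sketch of what the Kafka-subscribed emailer could look like, assuming an MSK-style event source mapping (the mapping delivers record values base64 encoded). renderAndSend is a hypothetical helper.

```typescript
import type { MSKEvent } from "aws-lambda";

export const handler = async (event: MSKEvent) => {
  // Records arrive grouped by topic-partition.
  for (const records of Object.values(event.records)) {
    for (const record of records) {
      const eventData = JSON.parse(
        Buffer.from(record.value, "base64").toString("utf-8")
      );
      // Pick the template for this event type, substitute the event data, send via SES.
      // await renderAndSend(eventData); // hypothetical helper
    }
  }
};
```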
Replay options

There is sometimes a need to resend an email for a past event. This can be due to send failure, user error ("I deleted the email"), or request. These options cover how to support resending an email.

Storing the static email for later recall

This solution would have the emailer lambda save a copy of the final, substituted template in a data store, perhaps S3. If an event needs an email resent, an appliance would recall that object and send it (see the sketch after the cons list below). Pros:
Cons:
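A sketch of the archive-and-recall idea using S3, assuming the AWS SDK v3. The bucket env var and key scheme are assumptions; the point is that a resend reads back the exact rendered message rather than re-deriving anything.

```typescript
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({});
const BUCKET = process.env.EMAIL_ARCHIVE_BUCKET!; // assumed env var

// Called by the emailer lambda after the template has been fully substituted and sent.
export const archiveEmail = async (eventId: string, renderedEmail: string) => {
  await s3.send(
    new PutObjectCommand({
      Bucket: BUCKET,
      Key: `emails/${eventId}.json`,
      Body: renderedEmail,
    })
  );
};

// Called by the replay appliance: recall the stored message and resend it as-is.
export const recallEmail = async (eventId: string): Promise<string> => {
  const result = await s3.send(
    new GetObjectCommand({ Bucket: BUCKET, Key: `emails/${eventId}.json` })
  );
  return await result.Body!.transformToString();
};
```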
Regenerating the email based on event data

This solution calls for a past event, like responding to an RAI, to have its data resent to the email lambda, where it's passed into a template and sent again. Pros:
Cons:
Addressee List options

There's a specific requirement of the old system that we expect to hold for the new one: on certain events, all state submitters for the package's state should be notified. The question is how best to meet that requirement.

Maintaining a list of all users for a state

Here we would probably tie a lambda to the Cognito auth event. When a state submitter logs in, we look at their states and add their email to each state for which they have access. Dynamo or another store could be leveraged to hold this mapping. The emailer lambda can recall this list when it needs it (see the sketch after the cons list below). Pros:
Cons:
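A sketch of the Cognito post-authentication trigger keeping the per-state email list in Dynamo. The table name, the custom:state attribute format, and storing emails as a string set are all assumptions about the user pool and schema.

```typescript
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, UpdateCommand } from "@aws-sdk/lib-dynamodb";
import type { PostAuthenticationTriggerEvent } from "aws-lambda";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

export const handler = async (event: PostAuthenticationTriggerEvent) => {
  const email = event.request.userAttributes.email;
  // Assumes states live in a comma-separated custom attribute.
  const states = (event.request.userAttributes["custom:state"] ?? "").split(",");

  for (const state of states.filter(Boolean)) {
    await ddb.send(
      new UpdateCommand({
        TableName: process.env.STATE_SUBMITTERS_TABLE, // assumed env var
        Key: { state },
        UpdateExpression: "ADD emails :email",
        ExpressionAttributeValues: { ":email": new Set([email]) }, // string set
      })
    );
  }

  return event; // Cognito triggers must return the event object
};
```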
Querying Cognito to get all users for a state

Here the emailer lambda would query the Cognito user pool to get the emails of all state submitters for a given state (see the sketch after the cons list below). Pros:
Cons:
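A sketch of the query-at-send-time option. ListUsers can't filter on custom attributes, so this pages through the pool and checks state membership client-side; the custom:state attribute and the USER_POOL_ID env var are assumptions.

```typescript
import {
  CognitoIdentityProviderClient,
  ListUsersCommand,
} from "@aws-sdk/client-cognito-identity-provider";

const cognito = new CognitoIdentityProviderClient({});

export const getStateSubmitterEmails = async (state: string): Promise<string[]> => {
  const emails: string[] = [];
  let paginationToken: string | undefined;

  do {
    const page = await cognito.send(
      new ListUsersCommand({
        UserPoolId: process.env.USER_POOL_ID!, // assumed env var
        PaginationToken: paginationToken,
      })
    );
    for (const user of page.Users ?? []) {
      const attrs = Object.fromEntries(
        (user.Attributes ?? []).map((a) => [a.Name, a.Value])
      );
      if ((attrs["custom:state"] ?? "").split(",").includes(state) && attrs.email) {
        emails.push(attrs.email);
      }
    }
    paginationToken = page.PaginationToken;
  } while (paginationToken);

  return emails;
};
```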
-
Ran into a snag: the submission event does not hold all the required data for the emails to be created, specifically the proposedEffectiveDate in the current work. Looking ahead, we also know we'll need the CPOC email, the SRT list, and possibly other original package details for the withdrawal action and RAI response emails. For the completion of the current story, we can either:
Long term solution: use a submission event to trigger the email sending, but query the OpenSearch data to get the details (a sketch is below). Because SEA Tool updating can be delayed, it may make sense to use the SEA Tool event as the triggering event for the emails. Though this places a dependency on SEA Tool being up for the email to go out, which, since the emails are supposed to help catch problems with SEA Tool, might not be the best idea.
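A sketch of the "query OpenSearch for the details" piece, assuming the OpenSearch JS client and documents keyed by package id. The index name and the connection/signing setup are assumptions.

```typescript
import { Client } from "@opensearch-project/opensearch";

// Real setup would also need request signing / auth appropriate to the domain.
const client = new Client({ node: process.env.OPENSEARCH_ENDPOINT! }); // assumed env var

export const getPackageDetails = async (packageId: string) => {
  const result = await client.get({
    index: "main", // assumed index name
    id: packageId,
  });
  // Fields like proposedEffectiveDate, the CPOC email, and the SRT list would be read
  // from the indexed document rather than from the triggering event.
  return result.body._source as Record<string, unknown>;
};
```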
-
Also wondering how Mako is determining the "90th Day Deadline." Current OneMAC does a calculation off of Initial Submission Date, but Initial Submission Date is another data element that does not come through the event. I am not a fan of doing the calculation in email creation, especially if we could have a centralized way to calculate the deadline date. I suppose I could put it in shared-utils? (A sketch of such a helper is below.)
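If it lands in shared-utils, the helper could look something like this. A minimal sketch assuming the rule is "90 calendar days after Initial Submission Date"; whether the count is inclusive of the submission day should be confirmed against current OneMAC.

```typescript
// shared-utils/ninetiethDayDeadline.ts (hypothetical location)
export const ninetiethDayDeadline = (initialSubmissionDate: Date): Date => {
  const deadline = new Date(initialSubmissionDate);
  deadline.setDate(deadline.getDate() + 90); // assumed: 90 calendar days after submission
  return deadline;
};

// Usage from email creation or anywhere else that needs the date:
// const deadline = ninetiethDayDeadline(new Date(pkg.initialSubmissionDate));
```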
-
There is really good conversation about this subject in #377.
-
I wanted a place to discuss our ideas for how notifications will work in micro.
Background
Notifications are critical to the function of our application. The notifications sent by our app are how CMS personnel will be alerted that something was submitted or assigned. They will notify the state submitter of events, too.
New submissions and all current post-submission actions should send one or more notifications, usually to someone at CMS and someone at the state.
Assumptions
We'll assume that the only end-user notification type is Email. While most of our solutions can support any type of notification, Email is what's historically been used and the only foreseeable ask.
Design Goals
This section will change as we discuss. This is a starting point.
Discussion...
The design goals above and ideas on how to technically approach achieving them are the main parts of this discussion. Input regarding current system pain points and user experience is welcome, and could shift our background and/or assumptions.
Please jot down any and all thoughts or ideas. This is the space for it. Thanks!
This space is reserved for the outcome of this discussion. I'm not sure of the form of the output, but I think a list of recommendations with regard to technical approach is a good target.