For applications that send mail on a regular schedule, PostageApp provides several features that automate this process and offload scheduling to PostageApp, rather than requiring any queueing work in the source application.
This makes it possible to do recurring sending even when no “crontab”-type facility is available, as is often the case on shared hosting or “serverless” deployments.
The batch schedule system works with either pre-defined content or content that will be generated at a future point in time. For pre-defined content, a complete API call must be constructed up front. For future content, a remote resource is loaded to construct the final payload, so some kind of endpoint or resource must be exposed that is accessible to PostageApp.
Each scheduled event can be identified by a UID, an arbitrary, unique string identifier that is used to refer to this scheduled event in subsequent API calls.
A complete send_message payload needs to be composed. This can include remote content that will be fetched each time the scheduled send is executed. An example payload that will execute in an hour looks like:
{
  "api_key": "__PROJECT_API_KEY__",
  "uid": "__UID__",
  "schedule": "in 1 hour",
  "metadata": {
    "arguments": {
      "recipients": [ "recipient@example.com" ],
      "headers": {
        "from": "sender@example.com",
        "subject": "Test Email"
      },
      "content": {
        "text/plain": "Example content."
      }
    }
  }
}
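As a sketch, a payload like the one above could be composed and submitted from Python. The endpoint URL below is an assumption for illustration and should be checked against the PostageApp API reference:

```python
import json
import urllib.request

# Hypothetical endpoint URL; confirm the exact path in the API reference.
API_URL = "https://api.postageapp.com/v.1.0/create_scheduled_event.json"

def build_scheduled_send(api_key, uid, schedule, recipients, headers, text):
    """Compose a scheduled-event payload wrapping a send_message call."""
    return {
        "api_key": api_key,
        "uid": uid,
        "schedule": schedule,
        "metadata": {
            "arguments": {
                "recipients": recipients,
                "headers": headers,
                "content": {"text/plain": text},
            }
        },
    }

def post_scheduled_event(payload):
    """POST the payload as JSON and return the decoded API response."""
    request = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

Separating payload construction from submission makes the payload easy to inspect or log before it is sent.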
The entire API call can be fetched dynamically as remote content by using a base template like:
{
  "$ref": "https://example.com/batch-001/send_message.json",
  "$type": "application/json"
}
Where the JSON at that destination is loaded and substituted into the document at that location. This can be done at the top level as illustrated here, or at any particular point in the document.
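The substitution semantics can be approximated in a few lines. This is an illustrative sketch, not PostageApp's actual implementation; the fetch function is injected rather than performing real HTTP requests, so the same logic can be exercised against local documents:

```python
def resolve_refs(node, fetch):
    """Recursively replace {"$ref": url} objects with the fetched document.

    `fetch` is a callable mapping a URL to a parsed JSON value; in a real
    client it would perform an HTTP GET and parse the response body.
    """
    if isinstance(node, dict):
        if "$ref" in node:
            # The entire object is replaced by the remote document, which
            # is itself resolved in case it contains further references.
            return resolve_refs(fetch(node["$ref"]), fetch)
        return {key: resolve_refs(value, fetch) for key, value in node.items()}
    if isinstance(node, list):
        return [resolve_refs(item, fetch) for item in node]
    return node
```

For example, resolving a template whose "events" entry is a reference yields a document with that entry replaced by the remote content.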
When using create_scheduled_events, multiple calls can be loaded in, either by defining several calls directly, like this:
{
  "api_key": "__PROJECT_API_KEY__",
  "events": [
    {
      "$ref": "https://example.com/batch-001/message-001.json"
    },
    {
      "$ref": "https://example.com/batch-001/message-002.json"
    }
  ]
}
The alternative is to return an array of JSON objects instead:
{
  "api_key": "__PROJECT_API_KEY__",
  "events": {
    "$ref": "https://example.com/batch-001/messages.json"
  }
}
Where that document has a top-level array. This substitution effectively replaces the { "$ref": "..." } object with whatever document is fetched from that location. The final JSON document that’s scheduled looks like:
{
  "api_key": "__PROJECT_API_KEY__",
  "events": [ "..." ]
}
New schedules can be created either via the “Scheduled” tab in the project view, or through the create_scheduled_event and create_scheduled_events API endpoints, which create one event or multiple events, respectively.
Once created, these can be followed up on using other API calls to pull down detailed results and confirm that the scheduled event ran properly.
Each time a scheduled event is executed, an instance entry describing the outcome is created. These can be fetched via the list_scheduled_event_instances endpoint for any given scheduled event UID.
Any errors that occur during processing will be noted here, along with diagnostic information to help resolve the problem.
Once created, a scheduled event can be updated through the update_scheduled_event endpoint, including its schedule, metadata, and UID.
This endpoint can also be used to temporarily suspend a scheduled event by updating the status property to suspend. To re-activate the scheduled event, restore this to active in a subsequent call.
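A small helper can make the suspend/resume toggle explicit. This is a sketch that only composes the update_scheduled_event payload; the status values "suspend" and "active" are taken from the description above:

```python
def build_status_update(api_key, uid, status):
    """Compose an update_scheduled_event payload that changes only the status.

    Per the description above, "suspend" pauses the event and "active"
    resumes it; any other value is rejected here as a safeguard.
    """
    if status not in ("suspend", "active"):
        raise ValueError("status must be 'suspend' or 'active'")
    return {"api_key": api_key, "uid": uid, "status": status}
```

Validating the status value locally catches typos before the call reaches the API.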
Note that suspended scheduled events won’t create any instances, and once re-activated will only execute on the next scheduled interval.