• This blog post was originally published on 7/07/2017 and was updated on 1/30/2019 to include information about our new and improved Events API

We love it when developers use SparkPost webhooks to build awesome responsive services. Webhooks are great when you need real-time feedback on what your customers are doing with their messages. They work on a “push” model – you create a microservice to handle the event stream.

Did you know that SparkPost also supports a “pull” model Events API that enables you to download your event data for up to ten days afterwards? This can be particularly useful in situations such as:

  • You’re finding it difficult to create and maintain a production-ready microservice. For example, your corporate IT policy might make it difficult for you to have open ports permanently listening;
  • You’re familiar with batch-type operations and running periodic workloads, so you don’t need real-time events;
  • You’re a convinced webhooks fan, but you’re investigating issues with your almost-working webhooks receiver microservice, and want a reference copy of those events to compare against.

If this sounds like your situation, you’re in the right place! Now let’s walk through setting up a really simple tool to get those events.

Design goals

Let’s start by setting out the requirements for this project, then translate them into design goals for the tool:

  • You want it to be easy to customize without programming.
  • SparkPost events are a rich source of data, but some event-types and event properties might not be relevant to you. Being selective gives smaller output file sizes, which is a good thing, right?
  • Speaking of output files, you want event data in the commonly used CSV file format. While programmers love JSON, CSV is easier for non-technical users (and results in smaller files).
  • You want to set up your SparkPost account credentials and other basic information once and once only, without having to redo them each time it’s used. Having to remember that stuff is boring.
  • You need flexibility on the event date/time ranges of interest.
  • You want to set up your local time-zone once, and then work in that zone, not converting values manually to UTC time. Of course, if you really want to work in UTC, because your other server logs are all UTC, then “make it so.”
  • You want some meaningful comfort reporting on your screen. Extracting millions of events can take some time to run, and you want to know it’s working.

Events, dear programmer, events …

Firstly, you’ll need python3, pip and git installed on your system. The easiest way to install them is given here.

For other platforms, this is a good starting point to get the latest Python download; there are many good tutorials out there on how to install it.

We’re the knights who say “.ini”

Set up a sparkpost.ini file as per the example in the GitHub README here.

Replace <YOUR API KEY> with your specific, private API key.

Host is only needed for SparkPost Enterprise service usage; you can omit it for sparkpost.com.

Events is a list, as per SparkPost Event Types; omit the line, or assign it blank, to select all event types.

Properties can be any of the SparkPost Event Properties. Definitions can split over lines using indentation, as per Python .ini file structure, which is handy as there are over seventy different properties. You can select just those properties you want, rather than everything; this keeps the output file to just the information you want.

Timezone can be configured to suit your locale. It’s used by SparkPost to interpret the event time range from_time and to_time that you give in command-line parameters. If you leave this blank, SparkPost will default to using UTC.
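Putting those settings together, a sparkpost.ini might look something like the sketch below. The section name, the API-key entry name, and all of the values here are only illustrative; the README example is the authoritative reference.

```ini
[SparkPost]
# API-key entry: the exact key name is shown in the README example
Authorization = <YOUR API KEY>
# Only needed for SparkPost Enterprise; omit for sparkpost.com
Host = api.sparkpost.com
# Omit, or leave blank, to select all event types
Events = delivery,open,click,bounce
# Indented continuation lines let the list span several lines
Properties = timestamp,type,
    message_id,rcpt_to,
    subject
# Leave blank to work in UTC
Timezone = America/New_York
```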

If you run the tool without any command-line parameters, it prints usage information.

from_time and to_time are inclusive, so for example if you want a full day of events, use times T00:00 to T23:59.

A typical run of the tool extracted just over 18 million events and took a little over two hours to complete.

That’s it! You’re ready to use the tool now. Want to take a peek inside the code? Keep reading!

Inside the code

Getting events via the SparkPost API

The SparkPost Python library doesn’t yet have built-in support for the events endpoint. In practice, the Python requests library is all we need. It provides built-in abstractions for handling JSON data, response status codes and so on, and is generally a thing of beauty.

One thing we need to take care of here is that the events endpoint is rate-limited. If we make too many requests, SparkPost replies with a 429 response code. We play nicely using the following function, which sleeps for a set time, then retries:
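Here’s a sketch of such a function. The real one in the repo is called getMessageEvents; the exact signature, names, and retry interval shown here are illustrative.

```python
import time
import requests

def get_events_page(url, api_key, params=None, snooze=10):
    """Fetch one page of events, backing off and retrying if we are rate-limited (HTTP 429)."""
    headers = {'Authorization': api_key, 'Accept': 'application/json'}
    while True:
        response = requests.get(url, headers=headers, params=params)
        if response.status_code == 200:
            return response.json()
        elif response.status_code == 429:
            # Rate-limited: sleep for a set time, then retry the same request
            time.sleep(snooze)
        else:
            # Any other status is a real error: report it and give up
            print('Error', response.status_code, ':', response.text)
            return None
```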

In practice, when using event batches of 10,000 I didn’t experience any rate-limiting responses, even on a fast client; I had to deliberately set smaller batch sizes during testing to trigger them, so you may never see rate-limiting occur at all.

Selecting the Event Properties

SparkPost’s events have over seventy possible properties. Users may not want all of them, so let’s select those via the sparkpost.ini file. As with other Python projects, the excellent ConfigParser library does most of the work here. It supports a nice multi-line feature:

“Values can also span multiple lines, as long as they are indented deeper than the first line of the value.”

We can read the properties (applying a sensible default if the setting is absent), remove any newline or carriage-return characters, and convert to a Python list in just three lines:
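Something like this, where the section name and the default property list are assumptions (the real three lines are in the repo):

```python
from configparser import ConfigParser

config = ConfigParser()
config.read('sparkpost.ini')

# Read the setting (with a fallback), strip the line breaks that
# ConfigParser keeps in multi-line values, then split on commas.
properties = config['SparkPost'].get('Properties', 'timestamp,type,message_id')
properties = properties.replace('\r', '').replace('\n', '')
fList = properties.split(',')
```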

Writing to file

The Python csv library enables us to create the output file, complete with the required header row field names, based on the fList we’ve just read:
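Roughly like this; the filename, the sample fList, and the sample event data are made up for illustration:

```python
import csv

fList = ['timestamp', 'type', 'message_id']    # the properties read from sparkpost.ini
# One page of event dicts from the API; msg_from is extra data we didn't ask for
events = [{'timestamp': '2019-01-30T12:00:00', 'type': 'delivery', 'msg_from': 'test@example.com'}]

with open('events.csv', 'w', newline='') as outfile:
    # restval='' writes a blank when an event lacks a property;
    # extrasaction='ignore' drops any event data not named in fList.
    writer = csv.DictWriter(outfile, fieldnames=fList, restval='', extrasaction='ignore')
    writer.writeheader()
    for event in events:
        writer.writerow(event)
```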

Using the DictWriter class, data is automatically matched to the field names in the output file, and written in the expected order on each line. restval="" ensures we emit blanks for absent data, since not all events have every property. extrasaction="ignore" ensures that we skip extra data we don’t want.

That’s pretty much everything of note. The tool is less than 150 lines of actual code.

Moving from message-events to the new Events API

In January 2019, I updated this tool to use the new Events API, following these migration guidelines. The new endpoint has more powerful search capabilities, but for now we’ve kept this tool functionally identical. Here are some notes on what I needed to change:

  • The events API paging mechanism is different. A cursor parameter is used, with value “initial” for the first page.
  • Instead of incrementing a “page” value, we get the cursor value for the next page from the response JSON data.

Take a look at the changes color-coded side-by-side here (thanks to GitHub). The details (a simplified sketch of the resulting paging loop follows this list):

  • Change the API path from /api/v1/message-events to /api/v1/events/message.
  • Pass a complete URL into the getMessageEvents function, rather than building the parameters internally. This is because we get everything we need from the previous response data, making the code shorter (a good thing, right?).
  • Construct the initial URL parameters outside this function, and do it once only, for the first call made.
  • Simplify the link-walking code in main (outermost scope). We don’t need to look for a links.rel.next object any more – it’s in links.next.
  • We still increment event_page each time around, but it’s just used for human-readable comfort reporting.
  • After the first page, we set the passed-in params p to None, because everything needed for the next call is already fully-formed in the url. The underlying requests library is happy with either format.
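Here’s that simplified paging loop as a sketch. The variable names, per_page value, and process() stand-in are placeholders, and the from/to time parameters are left out for brevity:

```python
import requests

api_key = '<YOUR API KEY>'                     # placeholder
api_host = 'https://api.sparkpost.com'

def process(event):
    # Stand-in for writing a CSV row
    print(event.get('type'), event.get('timestamp'))

url = api_host + '/api/v1/events/message'
p = {'cursor': 'initial', 'per_page': 10000}   # first page uses cursor=initial
event_page = 1

while url:
    res = requests.get(url, headers={'Authorization': api_key}, params=p).json()
    for event in res.get('results', []):
        process(event)
    print('Page {} done'.format(event_page))       # human-readable comfort reporting
    next_link = res.get('links', {}).get('next')   # relative URL of the next page, if any
    url = api_host + next_link if next_link else None
    p = None                                       # the next-page URL already carries its parameters
    event_page += 1
```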

It took me around an hour to make those changes and another half an hour to test them. The updated code is shorter by five lines. I then made some general project improvements:

  • Install is now much easier, using pipenv
  • The project repo now uses Travis CI automated tests

You’re the Master of Events!

So that’s it! You can now download squillions of events from SparkPost, and can customize the output files you’re getting. You’re now the master of events!

—Steve Tuck, Senior Messaging Engineer

ps: If you’re looking for more resources on APIs, check out the SparkPost Academy.
