Impact Analysis: track how first-time engagement changes user behavior


This article will help you:

  • Discover how firing one event can affect the frequency at which your users fire other events
  • Effectively and appropriately use causal inference

With Amplitude's Impact Analysis chart, you can discover how first-time engagement with one feature affects the rate of another behavior.

For example, a product manager of a music app can use Impact Analysis to see changes in the average number of times users play a song after they first discover the ability to 'favorite' songs:

impact analysis 1.png

Use the Impact Analysis chart to:

  • Learn whether discovering a feature for the first time changes how often users take another, specific action
  • Determine if users who interacted with a new or changed feature are taking certain actions more frequently, relative to the time before they first used the new feature

NOTE: This feature is available only to customers on Enterprise, Growth, and Scholarship plans.

Before you begin

Events will not appear in any Amplitude charts until your instrumentation is complete, so make sure that's done first. You should also read our article on building charts in Amplitude, which covers the basics of Amplitude's user interface, and familiarize yourself with our list of Amplitude definitions.

Finally, when working with an Impact Analysis chart, always keep in mind that correlation does not imply causation. 

Set up an Impact Analysis chart

To build an Impact Analysis chart, follow these steps:

  1. In the Events Module, select a treatment event. This is a user action that you believe may affect your users' propensity to take some key action within your product. You can select up to three treatment events.
  2. Next, select the outcome event. This is the behavior you think may have changed after users triggered the treatment event for the first time. You can select up to three outcome events.
  3. If desired, add properties to your events by clicking on + where, selecting the property name, and specifying the property value you’re interested in.
  4. In the Segmentation Module, identify the user segment you want to include in this analysis. You can import a previously saved segment by clicking Saved Segments and selecting the one you want from the list. Otherwise, Amplitude begins from the assumption that your analysis will target all users.
  5. If you do not want to import a previously saved user segment, you can start building your own by adding properties. To do so, click + where, choose the property you want to include, and specify the property value you’re interested in.
  6. You can narrow your focus even further by telling Amplitude you only want to include users who have already performed certain actions. To do so, click + perform, then choose the event you’re interested in.
  7. Use the date picker to specify the timezone, and to set the interval and timeframe for your analysis. This will specify the window of time during which Amplitude will find all users who triggered the treatment event for the first time.

    Here, "first time" is defined as the first time the user has triggered the treatment event in the number of calendar days before the beginning of the selected date range. This number will depend on the time interval you've chosen:
      • Daily: 90 calendar days
      • Weekly: 91 calendar days (or 13 weeks)
      • Monthly: 120 calendar days (or 4 months)
      • Quarterly: 360 calendar days (or 4 quarters)

For example, suppose you set the timeframe to between 10/15/2022 and 11/18/2022, with a weekly interval. The users included in the results would be everyone who triggered the treatment event within that timeframe and had NOT done so at any point between 7/17/2022 and 10/15/2022 (the 91 calendar days before the beginning of the selected time window).
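The lookback logic described above can be sketched as a simple date check. This is a minimal illustration, not Amplitude's actual implementation: the function and variable names are hypothetical, and the exact boundary handling (inclusive versus exclusive endpoints) is an assumption.

```python
from datetime import date, timedelta

# Lookback window (in calendar days) for each chart interval
LOOKBACK_DAYS = {"daily": 90, "weekly": 91, "monthly": 120, "quarterly": 360}

def is_first_time(event_dates, range_start, range_end, interval="weekly"):
    """Return True if the user triggered the treatment event inside the
    selected date range but NOT during the lookback window before it."""
    lookback_start = range_start - timedelta(days=LOOKBACK_DAYS[interval])
    fired_in_range = any(range_start <= d <= range_end for d in event_dates)
    fired_in_lookback = any(lookback_start <= d < range_start for d in event_dates)
    return fired_in_range and not fired_in_lookback

# A user who favorited on 11/1 but not during the lookback window counts as first-time
print(is_first_time([date(2022, 11, 1)], date(2022, 10, 15), date(2022, 11, 18)))   # True
# A user who also favorited on 9/1 does not
print(is_first_time([date(2022, 9, 1), date(2022, 11, 1)],
                    date(2022, 10, 15), date(2022, 11, 18)))                        # False
```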

Interpret your Impact Analysis

The Impact Analysis chart plots the outcome event on a relative n-day basis, from the time each user triggered the treatment event for the first time. Amplitude lines up each user's relative timeline for you, so you can easily see the pattern. The center line represents the day or week the users first triggered the event.

impact analysis 1.5.png

In the above example, you can see that users who favorited a song for the first time between November 1 and November 30 played an average of just over three songs or videos per day in the week after they first tried favoriting. By contrast, those same users played an average of only around two songs per day in the week before they discovered the favoriting feature.
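Conceptually, the chart aligns every user's outcome events on a relative timeline centered on their first treatment date, then aggregates each relative day across users. A rough sketch of that alignment, using hypothetical data structures rather than Amplitude's internals:

```python
from collections import defaultdict
from datetime import date
from statistics import mean

def relative_day_counts(users):
    """users: list of (first_treatment_date, [outcome_event_dates]).
    Returns {relative_day: [per-user outcome counts]}, where day 0 is the
    day each user first triggered the treatment event."""
    buckets = defaultdict(lambda: defaultdict(int))  # rel_day -> user index -> count
    for i, (t0, outcomes) in enumerate(users):
        for d in outcomes:
            buckets[(d - t0).days][i] += 1
    return {rel: list(per_user.values()) for rel, per_user in buckets.items()}

users = [
    (date(2022, 11, 3), [date(2022, 11, 2), date(2022, 11, 4), date(2022, 11, 4)]),
    (date(2022, 11, 10), [date(2022, 11, 9), date(2022, 11, 11)]),
]
counts = relative_day_counts(users)
# Average outcome count per relative day, across users who triggered it
print({rel: mean(c) for rel, c in sorted(counts.items())})  # {-1: 1, 1: 1.5}
```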

Choose your metric

Amplitude gives you a choice of four metrics when viewing an Impact Analysis chart: average, active percentage, frequency, and properties.

impact analysis 2.png

Average

When using Average, the chart's Y axis will show the mean number of times users triggered the outcome event in each n-relative-day/n-relative-week interval. Only users who triggered the outcome event at least once in that interval are counted toward the mean. Hover over each data point to see how many users triggered the outcome event at least once in each interval.

Active %

Here, the Y axis will represent the percentage of users who triggered the outcome event at least once in each n-relative-day/n-relative-week interval. The denominator includes every user who triggered any active event during that interval. Hover over each data point to see how many users triggered the outcome event at least once in each interval.

In the example below, 160,836 users were active the day after favoriting a song for the first time; 85.1% played a song or video. 

impact analysis 3.png
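The Active % value is simply the share of active users in an interval who triggered the outcome event. A trivial sketch with the figures from the example above (the outcome-user count of ~136,871 is back-calculated here for illustration, not taken from the screenshot):

```python
def active_pct(outcome_users, active_users):
    """Percentage of users active in an interval who triggered the outcome event."""
    return 100 * outcome_users / active_users

# If ~136,871 of the 160,836 active users played a song or video,
# the chart would show roughly 85.1%
print(round(active_pct(136_871, 160_836), 1))  # 85.1
```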

Frequency

With this metric, the chart's Y axis will show the distribution of the number of times users triggered the outcome event in each n-relative-day/n-relative-week interval.

In the example below, 15,085 users played four songs or videos on the seventh day after favoriting a song for the first time.

impact analysis 4.png
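The Frequency view is essentially a histogram of per-user outcome counts within each relative interval. Sketched with hypothetical per-user counts:

```python
from collections import Counter

def frequency_distribution(per_user_counts):
    """Map each outcome-event count to the number of users who had exactly
    that count in a given relative-day/week interval."""
    return Counter(per_user_counts)

# e.g. songs played per user on relative day 7
dist = frequency_distribution([4, 4, 2, 1, 4, 2])
print(dict(sorted(dist.items())))  # {1: 1, 2: 2, 4: 3}
```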

Properties

The Properties metric allows you to compute either the average or sum of an event property for a given outcome event. These calculations will encompass every instance of that outcome event triggered in each n-relative-day/n-relative-week interval. For example, you could plot the average length of all songs or videos played by users in their weeks before and after favoriting a song.
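As a sketch, the Properties metric reduces to averaging or summing a property value over every outcome event in an interval. The song lengths below are hypothetical:

```python
from statistics import mean

def property_metric(property_values, mode="average"):
    """Compute the average or sum of an event property across all outcome
    events triggered in a relative-day/week interval."""
    return mean(property_values) if mode == "average" else sum(property_values)

# e.g. song lengths in seconds for every play in the week after favoriting
lengths = [210, 185, 240, 195]
print(property_metric(lengths))          # 207.5 (average length)
print(property_metric(lengths, "sum"))   # 830   (total seconds played)
```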

Causal inference interpretation best practices

Impact Analysis helps you validate hypotheses and develop a better understanding of the relationships between user behaviors. It is not a replacement for randomized experimentation, which remains the gold standard for determining causal effects. Think of an Impact Analysis chart as a tool for deciding where to focus your experimentation program, so you can help your users engage more successfully with your product.

Here are a few things to consider before making causal conclusions:

  • Alternate hypotheses: Have you thought about other potential actions that users take around the same time they fire your treatment behavior for the first time? These actions might also be contributing to the change in the rate of the outcome behavior. If those alternative actions are instrumented, try creating other Impact Analysis charts using those actions as the treatment event. If the results look similar, you'll need to further investigate how much each treatment event is contributing to the change in outcome, through user research and randomized experiments whenever possible. 
  • User counts: If your outcome metric is highly volatile (changing dramatically between intervals), or shifts dramatically before versus after the treatment, check your user count. A user count that is too small can explain these inconsistencies: a handful of users can swing the metric in either direction, while large user counts typically have a "smoothing" effect that makes the metric more stable. Be cautious when drawing conclusions from small user counts, because they don't necessarily reflect a broader pattern.