Prerequisites
- You have access to the Data Studio
- You have reviewed an overview of your Likelihood to Buy model
- You have taken a look at the Insights page
How to read the Event weights table?
The Event weights tab lists the individual and aggregated events and shows how much each of them weighs in the scoring.
The Event weights table displays the events in the same order as the Lift graph on the Insights page.
- Event: the user-friendly name of the event performed by a user, mapped from the behavioral data of your integration (in app.madkudu.com > mapping > event mapping).
- Tip: events like "Email - At least X activities in the last X day(s)" are aggregated events. Learn more.
- Importance (in points): the weight of the event
- Lifespan (in days): the decay of the event. For example, with a lifespan of 90 days, the event stops contributing to the lead's or account's likelihood to buy score 90 days after it was performed. The maximum decay value is 90 days.
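The exact shape of the decay curve is not spelled out in the table itself. As a mental model only, the sketch below assumes a simple linear decay that reaches zero once the lifespan has elapsed; the function name and the linear formula are illustrative assumptions, not MadKudu's actual implementation.

```python
from datetime import date

def decayed_points(weight: float, event_date: date, today: date, lifespan_days: int = 90) -> float:
    """Illustrative only: remaining contribution of an event, assuming a
    linear decay that reaches 0 once the lifespan has elapsed."""
    age_days = (today - event_date).days
    if age_days >= lifespan_days:
        return 0.0  # past its lifespan, the event no longer contributes to the score
    return weight * (1 - age_days / lifespan_days)

# A 10-point event performed 30 days ago, with the default 90-day lifespan:
print(decayed_points(10, date(2024, 1, 1), date(2024, 1, 31)))  # ~6.67 points left
```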
When you click "Advanced mode", the table expands to show the historical data analysis.
You can now see the events divided into the same 4 sections, according to their Did X value (a short classification sketch follows this list):
- The events with the most statistical significance (Did X >= 100). There is enough data to draw conclusions from the lift. The weights of these events should be the most important.
- Then the events with little statistical significance (10 < Did X < 100). There isn't enough data to draw firm conclusions from the lift, but it still gives an indication of whether the events are important. The automatically suggested weights are reasonable to keep.
- Then the events with no lift available (Did X = 0). Weights assigned to these events will be taken into account by the model, but you won't be able to see their impact on the model's performance. Unless you already know that an event in this section is an important sign of conversion (a handraiser event), set its weight to 1.
- Then the events with no statistical significance (Did X < 10). There isn't enough data to draw conclusions from the lift as to whether these events are important. A reasonable approach is to set their weight to 1.
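To make this classification concrete, here is a minimal Python sketch that sorts an event into one of the four sections based on its Did X count. It simply restates the thresholds and guidance above (boundary values assumed as written); the function name and labels are illustrative.

```python
def significance_bucket(did_x: int) -> str:
    """Illustrative classification of an event by its "Did X" count,
    mirroring the four sections described above."""
    if did_x == 0:
        return "no lift available"                # weight 1 unless it is a known handraiser event
    if did_x >= 100:
        return "most statistical significance"    # trust the lift; these weights should matter most
    if did_x >= 10:
        return "little statistical significance"  # suggested weights are reasonable to keep
    return "no statistical significance"          # a weight of 1 is a reasonable default
```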
Columns related to the Event mapping:
- Activity type: when building the event mapping, we categorize events ("meta events") into different segments (Web Activity, Marketing Activity, Product Usage, Sales Activity, Email Activity...)
- Negative User Activity: when building the event mapping, we also define whether the event is more of a "negative action" (deleted account, unsubscribed from the newsletter, declined an invitation...) than a positive one (showing that the user is engaging with the product / company)
- We typically assign negative weights to negative user activities
Columns related to the Historical analysis (based on the people in the training dataset and their activity during the 3 months prior), providing information and guidance on how the factor loading and decay are, and should be, configured:
- Suggested importance: the recommended weight (in points) to attribute to this event, calculated based on the lift.
- Suggested lifespan: the recommended lifespan to attribute to this event, calculated based on how frequently the event is performed (the more frequent the event, the lower the lifespan should be).
- Average for converted: how many times this event was performed, on average, by a person who converted
- Average for non-converted: how many times this event was performed, on average, by a person who did not convert
- Did X: how many people performed this event
- If Did X >= 100 we estimate that the sample of people is large enough to derive conclusions (statistically significant)
- If Did X < 100 we estimate that the sample of people is too small to derive conclusions
- Did not do X: how many people did not perform this event
- Did X conversion rate: conversion rate of people who did this event
- Did not do X conversion rate: conversion rate of people who did not do this event
- Lift: how the conversion rate of people who did the event compares to the overall average conversion rate of the training dataset (see the worked example after this list)
- when lift > 0, someone performing this event is more likely to convert than average
- when lift < 0, someone performing this event is less likely to convert than average
- Recall conversions: proportion of converters that did this event
- Recall non-conversions: proportion of non-converters that did this event
- Average for converted * factor loading: the event weight multiplied by the average number of times a converter performed this event. This gives an idea of how many points would be assigned to someone who performs this event a typical number of times.
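To tie these columns together, the toy Python example below computes Did X, the conversion rates, the lift, and the recall columns from a small fictional training dataset. The data and the exact lift formula used here (the relative difference between the doers' conversion rate and the overall conversion rate) are assumptions made for illustration, not MadKudu's internal implementation.

```python
# Toy training dataset: (did_event, converted) for each person
people = [
    (True, True), (True, False), (True, True), (True, False),
    (False, False), (False, False), (False, True), (False, False),
]

did = [p for p in people if p[0]]
did_not = [p for p in people if not p[0]]

did_x = len(did)                                    # "Did X"
did_not_x = len(did_not)                            # "Did not do X"
did_x_cr = sum(c for _, c in did) / did_x           # "Did X conversion rate"
did_not_x_cr = sum(c for _, c in did_not) / did_not_x
overall_cr = sum(c for _, c in people) / len(people)

lift = did_x_cr / overall_cr - 1                    # > 0: doers convert more than average
recall_conversions = sum(1 for d, c in people if d and c) / sum(1 for _, c in people if c)
recall_non_conversions = sum(1 for d, c in people if d and not c) / sum(1 for _, c in people if not c)

print(did_x, did_x_cr, overall_cr, round(lift, 2), round(recall_conversions, 2))
# 4 people did the event; 50% of them converted vs 37.5% overall -> lift of about +0.33
```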
Only events performed by at least 1 lead in the past 9 months are displayed here.
For example, a Salesforce campaign with no campaign member added in the past 9 months would not show up on this page.
Amongst those, we display a maximum of 500 events.
How are the weights and decays configured?
The automatically suggested weights and decays are derived from a calculation based on the lift of the event, how many people performed the event (Did X), and how often the event is performed by people who converted (Average for converted).
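MadKudu's exact calculation is not documented here. Purely to illustrate how these three inputs could be combined into a suggestion, the sketch below uses a made-up heuristic: scale a base number of points by the lift, dampen it when Did X is small, and shorten the lifespan for frequently repeated events. Every constant and the function name are hypothetical placeholders, not the actual algorithm.

```python
def suggest_weight_and_lifespan(lift: float, did_x: int, avg_for_converted: float):
    """Hypothetical heuristic only -- NOT MadKudu's actual calculation.
    Combines the three inputs named above into a (weight, lifespan) suggestion."""
    confidence = min(did_x / 100, 1.0)           # full confidence once Did X >= 100
    weight = round(10 * lift * confidence)       # more lift -> more points (assumed scaling)
    lifespan = 90 if avg_for_converted <= 1 else max(30, round(90 / avg_for_converted))
    return weight, lifespan                      # frequent events get a shorter lifespan

# e.g. a lift of +2.0, observed for 250 people, performed ~3 times per converter
print(suggest_weight_and_lifespan(2.0, 250, 3))  # -> (20, 30)
```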
However, you can manually change the weights and decays to tweak the model according to your Sales team's feedback, your own analysis, or your business needs.
For example:
- You'd like to make sure that registering for a webinar does not automatically bump a person to a high likelihood to buy, so you would decrease the weight of the event "Registered to Webinar" to keep it under the medium/high threshold (see the Thresholds tab).
- You'd like to make sure that people who have only requested a demo are scored low after 30 days, so you would decrease the decay of the event "Requested a demo" to 30 days.
MadKudu also recommends setting a weight of 0 for catch-all events. Catch-all events are the events named 'Other [integration] activity', automatically created by MadKudu to collect all the events coming from your integration that you didn't explicitly map in your event mapping.