Adaptive discounting quickstart
In this quickstart, we'll show you how to:
- Add data representing your transaction events using a dataset.
- Add data representing your signup events using a dataset.
- Describe your Leads group formally using a cohort.
- Create a predictive model for forecasted spend using a forecast.
- Deploy your adaptive discounting predictions using a pipeline.
Let's dive in.
Uses prerelease features
This document refers to features that are not yet available for general release: Forecast. Contact support if you'd like to request early access.
Transaction event stream
Add data
Create a transaction event stream
Using CSV as an easy example
This section describes how to add data securely using a CSV file that you export from your data infrastructure.
Most Faraday users eventually update their configuration to pull data directly from their data warehouses, cloud buckets, or databases. To do that, you’ll add your source as a Connection and then choose it below instead of CSV.
For more, see our docs on Datasets and Connections.
- Extract a CSV representing your transactions from your systems, or download the sample transactions file from Farazona. Either way, save it someplace memorable.
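For reference, a minimal transactions CSV that lines up with the dataset definition below might look like this. The column names come from the POST /datasets payload further down; the values are purely illustrative:
created_at,total,shipping_address_fn,shipping_address_ln,shipping_address1,shipping_address2,shipping_city,shipping_state,shipping_postcode
2023-04-05 14:22:00,59.95,Jane,Doe,123 Main St,Apt 4,Burlington,VT,05401
2023-04-07 09:03:00,120.00,Sam,Smith,9 Elm Ave,,Montpelier,VT,05602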
First, upload your data with a POST /uploads request:
curl --request POST \
  --url https://api.faraday.ai/v1/uploads/transaction_data_files/YOUR_CSV_FILE.csv \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer YOUR_API_KEY' \
  --header 'Content-Type: application/octet-stream' \
  --data-binary "@YOUR_CSV_FILE.csv"
Then use a POST /datasets request to register the data. In this payload, identity_sets maps your CSV's name and address columns to Faraday identity fields, and output_to_streams emits one transaction event per row, using created_at as the event time and total as its value:
curl https://api.faraday.ai/datasets --json '{
  "name": "Transaction",
  "identity_sets": {
    "shipping": {
      "city": "shipping_city",
      "house_number_and_street": [
        "shipping_address1",
        "shipping_address2"
      ],
      "person_first_name": "shipping_address_fn",
      "person_last_name": "shipping_address_ln",
      "postcode": "shipping_postcode",
      "state": "shipping_state"
    }
  },
  "options": {
    "type": "hosted_csv",
    "upload_directory": "transaction_data_files"
  },
  "output_to_streams": {
    "transaction": {
      "data_map": {
        "datetime": "created_at"
      },
      "value": "total"
    }
  }
}'
Your event stream will start building in the background. You can proceed immediately with the next set of instructions. When your stream is done building, you’ll get an email.
Signup event stream
Add data
Create a signup event stream
Using CSV as an easy example
This section describes how to add data securely using a CSV file that you export from your data infrastructure.
Most Faraday users eventually update their configuration to pull data directly from their data warehouses, cloud buckets, or databases. To do that, you’ll add your source as a Connection and then choose it below instead of CSV.
For more, see our docs on Datasets and Connections.
- Extract a CSV representing your signups from your systems, or download the sample signups file from Farazona. Either way, save it someplace memorable.
First, upload your data with a POST /uploads request:
curl --request POST \
  --url https://api.faraday.ai/v1/uploads/signup_data_files/YOUR_CSV_FILE.csv \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer YOUR_API_KEY' \
  --header 'Content-Type: application/octet-stream' \
  --data-binary "@YOUR_CSV_FILE.csv"
Then use a POST /datasets request to register the data:
curl https://api.faraday.ai/datasets --json '{
  "name": "Signup",
  "identity_sets": {
    "shipping": {
      "city": "shipping_city",
      "house_number_and_street": [
        "shipping_address1",
        "shipping_address2"
      ],
      "person_first_name": "shipping_address_fn",
      "person_last_name": "shipping_address_ln",
      "postcode": "shipping_postcode",
      "state": "shipping_state"
    }
  },
  "options": {
    "type": "hosted_csv",
    "upload_directory": "signup_data_files"
  },
  "output_to_streams": {
    "signup": {
      "data_map": {
        "datetime": "created_at"
      },
      "value": "total"
    }
  }
}'
Your event stream will start building in the background. You can proceed immediately with the next set of instructions. When your stream is done building, you’ll get an email.
Leads cohort
- You'll need a Faraday account — signup is free!
Confirm your data
Event streams
Unless you’ve already created it for another quickstart or purpose, you’ll need to add the following event stream to your account:
- Signup
What’s an event stream?
Predicting a certain customer behavior requires historical examples of customers exhibiting that behavior. Faraday works best when that data comes in the form of “events” — specific actions or occurrences that happened at specific times.
Formulating data this way helps you define cohorts more expressively.
For example, a Customers cohort could be defined as the group of people who have all experienced a Transaction event at least once.
For more, see our docs on Cohorts, Events, Traits, and Datasets (which define how events and traits emerge from your data).
To verify, use a GET /streams request. Your response should look like this:
[{ "name": "Signup", "id": "$SIGNUP_STREAM_ID", ... }]
Make note of the IDs of the necessary streams.
If the required stream isn't there, follow the instructions in the Signup event stream section above, then return here to resume.
Create cohort
Create a leads cohort
Use a POST /cohorts request. Pointing stream_name at your signup event stream defines Leads as everyone who has experienced at least one signup event:
curl https://api.faraday.ai/cohorts --json '{ "name": "Leads", "stream_name": "signup" }'
Your cohort will start building in the background. You can proceed immediately with the next set of instructions. When your cohort is done building, you’ll get an email.
Forecasted spend forecast
- You'll need a Faraday account — signup is free!
Confirm your data
Event stream
Unless you’ve already created it for another quickstart or purpose, you’ll need to add the following event stream to your account:
- Transaction
What’s an event stream?
Predicting a certain customer behavior requires historical examples of customers exhibiting that behavior. Faraday works best when that data comes in the form of “events” — specific actions or occurrences that happened at specific times.
Formulating data this way helps you define cohorts more expressively.
For example, a Customers cohort could be defined as the group of people who have all experienced a Transaction event at least once.
For more, see our docs on Cohorts, Events, Traits, and Datasets (which define how events and traits emerge from your data).
To verify, use a GET /streams request. Your response should look like this:
[{ "name": "Transaction", "id": "$TRANSACTION_STREAM_ID", ... }]
Make note of the IDs of the necessary streams.
If the required event stream isn't there, follow the instructions in the Transaction event stream section above, then return here to resume.
Configure your prediction
Create a forecasted spend forecast
Use a POST /forecasts request. Setting stream_property_name to value tells Faraday to forecast the monetary value of future transaction events (the total column you mapped when registering the dataset):
curl https://api.faraday.ai/forecasts --json '{ "name": "Forecasted spend", "stream_name": "transaction", "stream_property_name": "value" }'
Your forecast will start building in the background. You can proceed immediately with the next set of instructions. When your forecast is done building, you’ll get an email.
Adaptive discounting deployment
- You'll need a Faraday account — signup is free!
Confirm your data
Unless you’ve already created it for another quickstart or purpose, you’ll need to add the following cohort to your account:
- Leads
Cohorts
What’s a cohort?
A cohort is Faraday’s term for a commercially significant group of people — for example, a brand’s customers, leads, or even “people who bought X and Y and then cancelled.”
Cohort membership is fluid — continuously computed by Faraday — and is defined by events its members must all have experienced and/or traits its members must all share.
For example, a Customers cohort could be defined as the group of people who have all experienced a Transaction event at least once.
For more, see our docs on Cohorts, Events, Traits, and Datasets (which define how events and traits emerge from your data).
To verify, use a GET /cohorts request. Your response should look like this:
[{ "name": "Leads", "id": "$LEADS_COHORT_ID", ... }]
Make note of the IDs of the necessary cohorts.
If the required cohort isn't there, follow the instructions in the Leads cohort section above, then return here to resume.
Confirm your predictions
Unless you’ve already created it for another quickstart or purpose, you’ll need to add the following prediction to your account:
- Forecast: Forecasted spend
Forecasts
What’s a forecast?
A forecast is what you use in Faraday to predict the number and/or value of events like transactions that an individual will experience over a certain timeframe.
For more, see our docs on Forecasts.
To verify, use a GET /forecasts request. Your response should look like this:
[{ "name": "Forecasted spend", "id": "$FORECASTED_SPEND_FORECAST_ID", ... }]
If the required forecast isn't there, follow the instructions in the Forecasted spend forecast section above, then return here to resume.
Deploy your predictions
Now you’ll create the pipeline necessary to deploy your predictions.
Create a pipeline for adaptive discounting
Use a POST /scopes request. The population block limits the pipeline to your Leads cohort, and the payload block attaches the forecasted spend prediction to everyone in it:
curl https://api.faraday.ai/scopes --json '{
  "name": "Adaptive discounting",
  "population": {
    "include": [ "$LEADS_COHORT_ID" ]
  },
  "payload": {
    "forecast_ids": [ "$FORECASTED_SPEND_FORECAST_ID" ]
  }
}'
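Alternatively, if you're scripting this flow, you can run the same request and capture the new pipeline's ID for the deployment step below. This sketch assumes the create response is the new resource as JSON with an id field (the same shape as the GET responses above) and that jq is installed:
ADAPTIVE_DISCOUNTING_SCOPE_ID=$(curl --silent https://api.faraday.ai/scopes \
  --header 'Authorization: Bearer YOUR_API_KEY' \
  --json '{ "name": "Adaptive discounting", "population": { "include": [ "$LEADS_COHORT_ID" ] }, "payload": { "forecast_ids": [ "$FORECASTED_SPEND_FORECAST_ID" ] } }' \
  | jq -r '.id')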
Your pipeline will start building in the background. You can proceed immediately with the next set of instructions. When your pipeline is done building, you’ll get an email.
Deploy your adaptive discounting pipeline
Deploying to CSV as an easy example
This section describes how to deploy your predictions to a CSV file that Faraday securely hosts (and continuously updates) for you to retrieve either manually or on a scheduled basis using your existing data infrastructure.
Most Faraday users eventually update their pipelines to deploy to data warehouses, cloud buckets, or databases. To do that, you’ll add your destination as a Connection and then choose it instead of Hosted CSV.
For more, see our docs on Pipelines and Connections.
Use a POST /targets request. The scope_id ties this deployment to the pipeline you just created, and "type": "hosted_csv" tells Faraday to publish the results as a hosted CSV file:
curl https://api.faraday.ai/targets --json '{
  "name": "Adaptive discounting in CSV",
  "scope_id": "$ADAPTIVE_DISCOUNTING_SCOPE_ID",
  "representation": {
    "mode": "identified"
  },
  "options": {
    "type": "hosted_csv"
  }
}'
Your pipeline will finish building in the background. You can proceed immediately with the next set of instructions. When it’s done, you’ll get an email—then you can return to this pipeline and click the Enable pipeline button to activate it.