Onboarding project | Mindflow

Ideal Customer Profile

Disclosure

🤖 ChatGPT to distill long user call transcripts into structured insights

👨‍🔬 Perplexity to quickly gather demographic, company-level, and industry-level info as needed

📔 Google NotebookLM to help me quickly structure videos, articles, and books into short learning bytes


Happy to talk more about this and share my "knowledge building gameplan".

Before we begin:

I would like to present how I'm structuring my research. I dove into the resources provided by the amazing GrowthX team to answer two questions: 1) What information do I need from the users? and 2) What questions should I ask them?

Let's break it down.

What information do I need from the users?

Source: Based on what I've learnt (so far) in the acquisition and onboarding classes and the additional resources provided.

Extract user insights that reflect:

  • Pains: Frustrations, blockers, bottlenecks to find room for improvement.
  • Delights: Things users love or find surprisingly good, to pinpoint what can inspire future iterations of the product's value proposition.
  • AHA Moments: Key moments where the value of the product “clicked” and users realized they needed Mindflow all along.
  • Product feedback: Other feedback around must-haves, feature desires, and onboarding experience.

What questions should I ask them?

Sources: As shared in the Onboarding post-reads — Eric Migicovsky's "How to Talk to Users," "The Rules for Customer Interviews," and the book The Mom Test.

From these, I derived a detailed question set that helps me drive the conversation while staying open-ended, so I get the maximum out of each user interview.

I've created this as a public Google Doc and you're welcome to use it. You would need to replace the placeholders (content in []) and it would be good to go for your use case.

User Interviews:

The numbers

  1. Total company-level user interviews done = 4
  2. Total number of users included = 23
  3. Industry spread: health-tech, transportation, luxury, and consumer goods

Below are examples of two groups of user interviews I did. The rest are included in the final ICPs but not shown here for data-security reasons.


Healthtech Scaleup

User base size: 13

  • Team A (IT Operations): Focused on infrastructure, cloud platforms, devices, and internal tooling.
  • Team B (Security Operations): Focused on threat detection, incident response, and sensitive data protection.

IT Operations Team

Pains

  • Lacked a way to test and promote workflows across environments without manual duplication
  • Access control was frustrating. Role permissions weren’t integrated with company-wide identity tools, making delegation and tracking difficult.
  • Workflow exports were manual, error-prone, and not scalable.

Delights

  • Mindflow's 4,000+ native integrations saved significant time compared to building from scratch
  • Real-time alerting on failures or flow activity via existing tools worked well.

AHA Moments

  • Design a single reusable automation and deploy it across various systems
  • Mindflow, as a central gatekeeper for sensitive operations, replacing manual approval processes, was a game changer.

Security Operations Team

Pains

  • Felt the time spent configuring automations wasn’t yet yielding ROI —maybe in the long term.
  • Complex integrations with third-party tools often failed due to missing or inadequate authentication guides

Delights

  • Extremely impressed with the AI··Chat's ability to fetch data and take actions across systems in French, English, or any other language
  • Being able to see exactly what the AI··Chat was doing (inputs, operations, and results) "made AI trustworthy."
  • "I loved that fewer technical members could do technical work, such as investigating security issues without needing to write code."

AHA Moments

  • Understood how conversational automation could replace "software-hopping" and "everything is just a chat away"
  • Large-scale datasets could be summarized and questioned to find info that really matters.


Ride-hailing, Transportation

User base size: 10+

Pains

  • Tool-specific workflows make cross-platform processes difficult.
  • Managing groups and permissions across identity and access management tools and other corporate tools is inconsistent and time-consuming.
  • Processes are documented, but the documentation quickly becomes obsolete: even a slight change to the process renders it useless

Delights

  • Strong appeal in connecting any system with an API, and in the possibility of quickly integrating new tools when the customer needs them.
  • Multiple ways of triggering automation (scheduling, webhooks, email, and manual runs) match the needs perfectly.
  • Approvals & Audits: Slack approvals and audit logs boost trust and safety and simplify the process.
  • Flow diagrams double as live documentation and are much better than the current manual documentation process


AHA Moments

  • Automation for Edge Cases: Realizing they can handle messy workflows (e.g. HubSpot reassignment, file transfers).
  • Excited by the ability to connect multiple workflows to build a reusable, collaborative automation library
  • Workflows as onboarding tools for new team members by connecting multiple tools in a single automation


Common themes observed

PAINS

  • Manual, repetitive tasks across tools
  • Disconnected systems with no central place to act
  • Limited resources and time to build or maintain automation internally
  • Tool sprawl and monitoring issues, making standardization and oversight difficult

DELIGHTS

  • Single integration layer that connects all tools
  • No-code, visual flow builder accessible to non-technical members
  • Auditability and clarity in what flows do and how they execute
  • Built-in approval feature set for critical actions

AHA MOMENTS

  • Automating what seemed impossible before
  • Empowering non-technical teams to build and manage workflows
  • Replacing complicated manual monitoring
  • Seeing automation scale across teams and regions, turning local wins into global standards



ICPs

Defining ICPs

  • ICP 1: Enterprise security teams (legacy tools, high compliance needs)
  • ICP 2: Growth-stage tech company (tool rich, resource-constrained)
  • ICP 3: Intermediate IT or security org (regional or business unit level within a large global ICP 1)
  • ICP 4: Modern SaaS-first IT teams (cloud-native, no-code loving)

Criteria | ICP 1: Enterprise Security | ICP 2: Growth-Stage Tech | ICP 3: Intermediate Org | ICP 4: Modern SaaS IT Team
Company size | 10,000+ employees | 500–3,500 employees | 2,000–5,000 employees | 200–500 employees
Company stage | Mature, global enterprise | Scaleup, Series C+ | Mid-stage org or BU in larger org | Startup or early growth
Funding | Public company | $200 million+ | Public or part of a public parent company | $50 million+
Org structure | Multi-level security team, DevSecOps team | Blended team of IT, security, and developer staff | Primarily an IT team with security folks | IT team managing everything
Decision makers | CISO, SOC Director, Security leader | VP Engineering, CIO, CTO, CISO | IT Director or Innovation leader | Head of IT or similar
Decision blockers | Procurement, internal build teams | Prioritization, build vs buy bias | Internal IT constraints, unclear ROI, global HQ validation | Budget, low awareness
Frequency of use case | Daily | Daily to weekly | Daily to weekly | Workflow-based
Goals | Centralization, visibility, reduce redundancy | Streamline internal operations, reduce tool chaos | Reduce manual processes | Empower team to do more in less time
Technical setup | Hybrid cloud, legacy-heavy | Cloud-native, microservices-based | Mix of cloud/on-prem, early infra evolution | Fully cloud-based
Automation maturity | Mixed — some custom scripts, mostly manual | Early-stage but urgent demand | Growing need, mostly scripting right now | Low-code tools or eager to use them
Sales cycle | 9–15 months | 6–9 months | 6–9 months | 1–3 months
Annual budget | $500K+ | $100K–$250K | $250K–$500K | <$50K
Motivation | Reduce alert fatigue, standardize processes | Empower fewer people to manage more systems | Cut repetitive effort, show impact | Self-serve capabilities
Organization influence | High — global policies & audits | Medium-high, usually cross-team | Regional or BU-level autonomy | Low to moderate
Preferred outreach channels | In-person events (CISO forums), partner intros | VC intros, content demos, employee referrals | Internal referrals, leader influencer | Inbound, PLG
Key AHA moment | Replace brittle code with auditable flows | Slack-triggered workflows that update tools automatically | Cross-tool integration without engineering time | Natural-language flows with API integrations


ICP Prioritization


ICP | Adoption Rate | Appetite to Pay | Frequency of Use Case | Distribution Potential | TAM (users/currency) per customer
ICP 1 | High | Very High | Very High | High | 250 users
ICP 2 | High | High | Moderate | High | 50 users
ICP 3 | Moderate | Very High | Moderate | Moderate | 150 users
ICP 4 | Low | Low | Low | Low | 10 users

From the ICP prioritization framework, we can see that ICP 1 and ICP 3 stand out as the most promising targets.

  • ICP 1 shows the highest need for the product: a very high appetite to pay, very high frequency of use case, strong adoption rate, and high distribution potential with 250 users per customer.
  • ICP 3 is also compelling due to its very high willingness to pay and relatively large TAM per customer (150 users).

JTBD and validation

NEEDS vs. WANTS


What? | Description | In Mindflow's context...
Needs | Must-have capabilities & features to perform their job effectively | Automate repetitive security tasks to save time
Wants | Nice-to-haves that improve convenience, experience, or prestige | Have a modern UI or AI··Chat interface for workflows

🔧 Core Needs

  • Automate repetitive manual tasks to reduce operational burden
  • Aggregate and correlate data across tools to reduce alert fatigue and scattered information
  • Improve visibility and control across security, IT, and engineering workflows
  • Integrate easily with existing stacks
  • Support decision-making with triggers, approvals, and human-in-the-loop automation
  • Scale automation without code, enabling non-engineers to design and run processes

✨ Common Wants

  • Use AI··Chat to interface with systems
  • Trigger automations from Slack, Notion, email, or other familiar tools
  • Collaborate across teams in automation workflows
  • Avoid vendor lock-in; use multiple SaaS tools together

Value = Mindflow helps teams do more with less—faster, safer, and without code.


JTBD & Validation


I'm skipping the validation-approach column, as it's user interviews for all cases.


Goal Priority | Goal Type | ICP | JTBD | Customer validation
Primary | Functional | ICP 1 | Standardize and automate repetitive SecOps workflows across a fragmented toolset to save time and reduce errors. | “We spend hours copying security alerts between platforms. It’s manual and tedious.”
Primary | Functional | ICP 2 | Enable smaller teams to automate workflows across their tools without needing engineering resources. | “We have ideas for automation but no time or skill to build them.”
Primary | Functional | ICP 3 | Give IT/security teams a way to industrialize key workflows like onboarding/offboarding across SaaS tools. | “We want one place to handle onboarding, offboarding, and permissions – not 15.”
Secondary | Emotional | ICP 3 | Feel confident that critical processes are correctly handled, even across complex tools. | “I worry we’ll forget to revoke access after an employee leaves, which could be a security blunder.”
Secondary | Social | ICP 1 & 2 | Show internal leaders (CIO, CISO, execs) that automation initiatives drive ROI and feel valued in my team. | “I will progress faster if I can prove time savings and security gains.”






Onboarding Teardown

Before we begin:

Mindflow is a whitelist-only SaaS. This means users can't just go to the website → create an account → start using the platform. There are 2 main reasons for this:

  1. Mindflow is enterprise only at this point and requires some onboarding setup (usually 2 calls, 30-60 mins each)
  2. Mindflow is built for security teams, and every client needs certain customizations

That said, Mindflow plans to release a community version of the platform —especially the AI··Chat— for generalist audiences in Q3'25.


About the teardown

  • The onboarding flow below represents that of an enterprise customer
  • Quotes in almost all screenshots capture what users said about that particular screen
  • Each screen is evaluated against the cognitive biases at play


Mindflow onboarding teardown

Website

#1 Homepage – Hero Section


MF_Onboarding_Web_1.png

  • What's working:
    • Clear CTA and a good blend of product on the website
    • Recognizable logos for social proof
    • Quantified impact to get a grasp of potential ROI
  • What's not working:
    • The CTA is —seemingly— one of the least attention-grabbing parts of the website
    • The most common feedback is that the HERO section looks too busy
  • Changes:
    • Make CTA, logos, and numbers a bit bigger
    • The product visual should be simpler
    • Change "Watch a video" → "See how it works"
    • Make the menu bar's font bold and a bit bigger

Bias at play

Cognitive load: The screen appears to have a lot of information —visual and text— making it difficult for visitors to comprehend the objective, especially the ones trying to quickly grasp the value of the product.


Product

#1 Login, Signup Page

The login pages can be accessed once the customer has been whitelisted.

MF_Onboarding_App_1.png

  • What's working:
    • Simple and straightforward
    • SSO — right now, you can see Google only, but it's relevant for each customer → creates more trust
  • What's not working:
    • No other visual element creates anticipation or gives a glimpse of what's coming

Bias at play


Progressive disclosure: The minimal login screen shows only the user's preferred SSO login option, reducing the information load and reserving additional information for when it matters, i.e. the next steps.


#2 Post login, first screen

This is the first screen users see when they've signed up for the first time.


MF_Onboarding_App_3.png

  • A few things that are working well here:
    • Personalized greeting: Every user sees their name upfront with the Mindflow logo, creating relatability
    • Value proposition showcase, promising to deliver on the 3 key user pains
    • There's a little "We 💜 customername" text at the bottom left —to create more joy
  • AHA Moments:
    • Most users mentioned their "love" for this screen and said they were assured they were in the right place

Biases at play


Endowment Effect: The use of the user's name front & center ("Hello Aditya Gaur 👋") and customized copy “We 💜 [customername]” inspires a feeling of ownership over the experience.

Aha! Moment: Users experience clarity here — seeing that the product understands their needs and is already structured to resolve their pain points: automate, integrate, and secure.


#3 Product – Homepage

The user is now inside the usable part of the product.


MF_Onboarding_App_4.png

  • What's working here:
    • A few basic flows are already built for the user. These flows are designed to require only basic, non-technical setup so that the user can get a preview of the value of the product
    • The environment (folder/project) name has been personalized for the user
  • What's not working:
    • The menu shelf is closed by default —for a first-time user, this could lead to confusion on where to go and where to find what they're looking for

Biases at play


Endowment Effect: Seeing “Aditya Gaur’s environment” gives users a sense of ownership over their experience in Mindflow, potentially increasing perceived value.


Goal Gradient Effect: With three simple starter flows visible, users often told me they felt like they were on a clear path to their automation journey with Mindflow.


#4 Product — first flow

When the user clicks on the first flow, they'll see this.


MF_Onboarding_App_5.png

  • What's working:
    • A visually appealing yet simple workflow example —making the product approachable
    • Well-formatted notes and documentation —addressing another user pain very early on
  • AHA moment:
    • The user gets a first taste of the no-code UI that will replace the heavy scripting and coding, i.e. their current workflows


#5 Product — second flow (AHA)

This is the second example flow; the user learns what native integration looks like. We have intentionally used "VirusTotal" —a free tool every security team uses in some manner.


MF_Onboarding_App_6.png

  • What's working:
    • The user sees a native integration at work, and it's a name they recognize —trust building and relatability
    • The user also sees that customizing workflows is actually done in natural language, not code —every type of user knows they can pull it off with Mindflow
    • This is the user's first look at the "dynamic data pill" —it lets them bring in data from other tools or steps in the workflow using a "slash command" —again, more functionality that was previously a few lines of code
  • AHA moment:
    • A lot of functionality with natural language, smart design, and quick functions

Biases at play


Aha! moment: The first flow shows users the true power of Mindflow’s no-code platform, replacing the mental load of complex scripting with a clean and visual interface.


Aesthetic-Usability Effect: The flow is visually appealing and well-structured. Because it looks intuitive, users are more likely to think it’s easy to use — even before interacting deeply.


#6 Product — AI··Chat

Mindflow's latest feature. Everybody knows ChatGPT, and that made our job easier: when a new user opens AI··Chat for the first time, they immediately know what it's about.


MF_Onboarding_App_7.png

  • What's working:
    • Clear value proposition
    • A few ready-to-use prompts tagging common tools guide users toward what they can achieve
  • What's not working:
    • A guide tip telling users how they can prompt or query their tools is missing
    • Additionally, AI··Chat requires some extra setup, which is not communicated inside the product —this created a poor experience for the user
  • Changes:
    • Implement a quick walkthrough or chat bubble through Intercom when the user lands in AI··Chat for the first time


#7 Product — Integrations (AHA)

This is Mindflow's integration library. During the user interviews, the integration library consistently became the biggest AHA moment across all ICPs.

MF_Onboarding_App_8.png


MF_Onboarding_App_9.png

  • What's working:
    • Native integrations for over 4,000 tools —full functionality for all of the customer's workflows
    • Easy configuration: Just toggle on or off

Biases at play


Social Proof: Recognizable products' logos (ex. OpenAI, Notion, Slack, HubSpot) in the integrations catalog validate the product's credibility.


Curiosity Gap: Seeing a ton of integrations encourages users to explore further —“what else can I connect?”

#8 Product + Web — Templates

All new users can find over 250 ready-to-automate templates on Mindflow's website. Here's how it works:


CleanShot 2025-03-29 at 10.41.31.gif

  • What's working:
    • Direct access to all templates from the platform
    • Simple copy-paste flow to bring the automation into the user's environment
  • AHA moment:
    • "I was able to find 60% of my use cases in templates, for the rest I was able to copy-paste and do minor edits to use them"


AHA Moments

There were 2 core AHA moments in the user onboarding journey.

Integrations Catalog

  • The user loved having access to a lot of integrations with native (full) functionality. This meant that their workflows wouldn't be impacted if they changed an existing tool or adopted new tools.
  • The integrations catalog is a winner for Mindflow and has often swayed deals in our favor.

Templates

  • Automating complex workflows with multiple tools, operators, and conditions is difficult. The templates library made the "time to value" much faster for new and existing users.
  • This also means users can build a flow once and copy-paste it across environments without any hassle while ensuring complete data security.


Activation metrics

Before we begin:

  • Mindflow hasn't yet implemented a way to understand user activation, retention, and engagement metrics.
  • The metrics have been identified and defined based on Mindflow's features, product events, and user events
  • The implementation will be done using a product like Mixpanel
  • The write-up below is a close approximation of what Mindflow will finally implement across its various customer environments


Activation Metrics

These are the metrics we plan to measure during the onboarding phase. The duration of the onboarding phase is different for each ICP. Also, it depends on the number of tools the customer uses and the number of use cases they want to automate in Phase 1 (critical automation).

Generally, the onboarding phase lasts 15-45 days. Let's assume it to be 30 days.

Key metrics to track for Mindflow:

  • Time to first flow built (T1FB): How long does a user take to create their first automation?
  • Time to first integration (T1FI): How long does a user take to toggle and set up their first integration?
  • Time to first flow executed (T1FE): How long does it take the user to run their first flow successfully?
  • % of users reaching more than 3 activated integrations: This is an essential metric, as it conveys that a user has understood how integrations work in Mindflow and is now closer to automating more complex workflows. We will measure this at:
    • Day 1 (OD1 = onboarding day 1)
    • Day 7 (OD7)
    • Day 15 (OD15)
    • Day 30 (OD30)

Mixpanel events to track (the values shown reflect a positive result):

  • flow_created_id = 1
  • integration_connected_id = 1
  • trigger_success_count = 1
  • user_signed_up = true
  • login > 1
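To make these definitions concrete, here is a minimal Python sketch of how the activation metrics could be computed from a raw event export. The event names, schema, and sample timestamps are assumptions modeled on the Mixpanel events above, not Mindflow's actual implementation.

```python
from datetime import datetime

# Hypothetical event log: (user_id, event_name, timestamp) tuples as they
# might arrive from a product-analytics export.
events = [
    ("u1", "user_signed_up",        datetime(2025, 4, 1, 9, 0)),
    ("u1", "integration_connected", datetime(2025, 4, 1, 10, 0)),
    ("u1", "flow_created",          datetime(2025, 4, 2, 9, 0)),
    ("u1", "flow_executed",         datetime(2025, 4, 2, 9, 30)),
    ("u1", "integration_connected", datetime(2025, 4, 3, 9, 0)),
    ("u1", "integration_connected", datetime(2025, 4, 5, 9, 0)),
]

def first_ts(user, name):
    """Timestamp of the user's first occurrence of an event, or None."""
    times = [ts for (u, e, ts) in events if u == user and e == name]
    return min(times) if times else None

def hours_from_signup(user, name):
    """T1FB/T1FI/T1FE-style metric: hours from sign-up to first event."""
    signup, first = first_ts(user, "user_signed_up"), first_ts(user, name)
    if signup is None or first is None:
        return None
    return (first - signup).total_seconds() / 3600

def integrations_by_day(user, day):
    """Integrations activated before onboarding day N (ODn checkpoints)."""
    signup = first_ts(user, "user_signed_up")
    return sum(
        1 for (u, e, ts) in events
        if u == user and e == "integration_connected"
        and (ts - signup).days < day
    )

print(hours_from_signup("u1", "flow_created"))   # T1FB in hours
print(hours_from_signup("u1", "flow_executed"))  # T1FE in hours
print(integrations_by_day("u1", 7) >= 3)         # OD7 checkpoint
```

The same helpers cover T1FI by passing "integration_connected", and the ODn checkpoints fall out of `integrations_by_day` with day = 1, 7, 15, 30.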


Engagement Metrics

Once the onboarding phase is complete, i.e. the user has reached Day 31, we want to understand:

  • their progressive interaction with Mindflow
  • which integrations they are using
  • success and failure rates in automation
  • how users continue interacting with Mindflow after onboarding —logins, edits, chats, etc.

Key engagement metrics for Mindflow:

  • WAU / MAU Ratio: this will help us measure stickiness week over week and month over month
    • We have already deployed Mixpanel in two customer environments; the current WAU/MAU = 38%

image.png

image.png

  • Single user session length: To understand whether users are spending meaningful time on the platform, and to identify which types of users spend more time
  • Active flows per customer: To find power users, adopters, experimenters, and more
  • AI··Chat sessions: To understand how the new feature is doing and compare it with other user actions

Mixpanel events to track:

  • flow_runs > 10
  • chat_interactions vs. flow_runs > 1
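As a sketch of how the WAU/MAU stickiness ratio could be derived from per-day active-user sets — the data below is illustrative, not from a real customer environment; real numbers would come from Mixpanel's active-user reports:

```python
# Hypothetical per-day active-user sets, keyed by day-of-month.
daily_active = {
    1: {"u1", "u2"}, 3: {"u1"}, 10: {"u2", "u3"},
    25: {"u1"}, 27: {"u1", "u4"}, 30: {"u1", "u2"},
}

def active_between(start, end):
    """Distinct users active on any day in start..end inclusive."""
    users = set()
    for day, day_users in daily_active.items():
        if start <= day <= end:
            users |= day_users
    return users

wau = len(active_between(24, 30))  # distinct users in the last 7 days
mau = len(active_between(1, 30))   # distinct users in the last 30 days
print(f"WAU/MAU = {wau / mau:.0%}")
```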


Retention Metrics

Just like in the engagement metric, we plan to use daily and monthly retention metrics.

Key retention metrics for Mindflow:

  • D31, D40, D60, D90 Retention
  • Weekly Active Users over time
  • Drop-off rates: How many users abandon the product between sign-up and first flow run or first integration activation?


User journey: milestone funnel

  • Sign-up → First Login → First Flow Built → First Flow Executed → ≥3 Integrations → WAU at Day 30
  • Failure at any point will be considered "no conversion" and would mean direct engagement from the Customer Success team
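The milestone funnel above could be computed along these lines. The step names and per-user milestone data are hypothetical; a user counts toward a step only if every earlier step was also reached, which is what makes drop-off visible.

```python
# Ordered funnel steps, mirroring the milestone funnel described above.
FUNNEL = ["signed_up", "first_login", "flow_built",
          "flow_executed", "three_integrations", "wau_day_30"]

# Illustrative per-user sets of milestones reached.
users = {
    "u1": {"signed_up", "first_login", "flow_built", "flow_executed"},
    "u2": {"signed_up", "first_login"},
    "u3": {"signed_up", "first_login", "flow_built", "flow_executed",
           "three_integrations", "wau_day_30"},
}

def funnel_counts(users):
    """Count users at each step; all earlier steps must also be done."""
    counts = []
    for i in range(len(FUNNEL)):
        required = set(FUNNEL[: i + 1])
        counts.append(sum(1 for done in users.values() if required <= done))
    return counts

for step, n in zip(FUNNEL, funnel_counts(users)):
    print(f"{step}: {n}")
```

Any user whose count stops before the last step would be flagged as "no conversion" and routed to the Customer Success team.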


Miro board here

image.png


User Cohorts

We plan to implement cohorts in two ways, by ICPs and by use cases. Here's what they will look like:

ICP cohorts:

  • Enterprise
  • Mid-market
  • Scaleup
  • Cloud-native growth

Use cases cohorts:

  • Sec: Security, SecOps, SOC
  • IT: ITOps, DevSecOps
  • Cloud: CloudOps, Infrastructure, Data
  • Business: Marketing, Sales, Productivity, AI



Activation Score

In addition to individual metrics, we plan to implement an Activation Score — a metric that helps us quickly segment activated & engaged users vs. other users.

A theoretical version of the scoring system is below; the original has been hidden for privacy.


Behavior | Score
First flow created within 5 days | +2
First integration connected < 7 days | +2
At least 3 flows executed by Day 15 | +2
>1 integration used in executed flows | +2
Total time spent > 45 mins in 14 days | +1
Triggered a webhook or used Slack/Teams | +1

In this scoring paradigm, we can assume that a score of 7+ signals strong activation.
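A minimal sketch of how this scoring table could be implemented; the rule keys and the behavior flags for the sample user are hypothetical, mapping one-to-one onto the rows above.

```python
# Each rule maps a boolean behavior flag to its weight from the table.
RULES = [
    ("first_flow_within_5d",        2),
    ("first_integration_within_7d", 2),
    ("three_flows_by_day_15",       2),
    ("multi_integration_flows",     2),
    ("time_spent_45m_in_14d",       1),
    ("webhook_or_chat_trigger",     1),
]

def activation_score(behaviors):
    """Sum the weights of the behaviors the user exhibited (max 10)."""
    return sum(weight for key, weight in RULES if behaviors.get(key, False))

# Sample user who hit everything except the 45-minute time threshold.
user = {
    "first_flow_within_5d": True,
    "first_integration_within_7d": True,
    "three_flows_by_day_15": True,
    "multi_integration_flows": True,
    "webhook_or_chat_trigger": True,
}
score = activation_score(user)
print(score, "strongly activated" if score >= 7 else "needs attention")
```

Keeping the rules as data makes it easy to re-weight behaviors or hide the real weights, as we have done here.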



