webinar · 60 min · Recorded Feb 2022

Sprints with Evan: Jumpstart your Creative Sprint process

Reza Khadjavi (CEO) and Evan Lee (Head of Creative Strategy) at Motion kick off the first in a workshop series on Creative Sprints — a structured, data-driven process for running performance creative workflows. Evan walks through the fundamentals: why teams need sprint structure, the "hat system" across paid social/management/creative roles, the difference between net-new concept sprints and iteration sprints, and a 5-step process (hypothesis → control group → brief → execute → recap). Live Motion demos illustrate analyzing thumbstop ratio vs click-to-purchase ratio and CTR vs conversion rate to diagnose creative vs landing page issues, with worked examples of Week 1 Sprint Kickoff and Recap.

What's discussed, in order

6 named frameworks

01 Creative Sprints
An ongoing, structured, data-driven approach to performance creative workflow. Two modes: New Concepts (heavy briefing, large output, tied to marketing calendar/launches) and Iteration (light briefing, focused execution, low quantity of creative assets).
presenter's own, introduced ~8:22
02 5 Steps to a Creative Sprint
presenter's own, ~28:54
03 Sprint Cadence Timeline
Brief day 1 → Creation 7–14 days → Launch (spend-dependent, typically 7–30 days) → Recap & prep. Total cycle typically 7/14/21/30/45 days.
~29:28
04 Hat System for Team Roles
Paid Social / Management / Creative Team roles treated as hats anyone can wear, rather than rigid silos.
~11:49
05 AIDA Funnel
Attention, Interest, Desire, Action — used to map creative metrics to funnel stages.
~21:54

What's actually believed — in their own words

Over the last few years creative has become more important due to Facebook consolidation, feed competition, and the attention economy (Reza, ~4:09)

· 2022

Creative and performance/media-buying teams have historically been siloed due to left-brain/right-brain differences (Reza, ~5:17)

· 2022

Most performance creative analysis is actually rooted in gut feel rather than true data analysis (Evan, ~9:12)

· 2022

The hardest part of any testing process is starting (Evan, ~10:30)

· 2022

Without structure, learnings "disappear into the abyss" after tests end (Evan, ~10:43)

· 2022

Iteration offers easier wins than net-new concept development because the lift (effort) is lower (Evan, ~20:07)

· 2022

Grouping creative data (by naming convention, post ID, or image hash) is a prerequisite to holistic creative analysis (Evan, ~21:06)

· 2022

Facebook's "truth" (what its algorithm optimizes on) differs from absolute truth; both matter for decisions (Reza, ~53:36)

· 2022

The do's and don'ts pulled from the session

Do this
  • Structure creative work into repeatable sprints with explicit cadence
  • Start every sprint with a hypothesis rooted in data, not gut feel
  • Group creative data by concept/theme before analyzing performance
  • Use thumbstop ratio × click-to-purchase ratio to find iteration candidates
  • Use CTR × click-to-purchase ratio to diagnose whether an issue is creative or landing page
  • Define a control group by aligning date ranges with when V2/V3 went live
  • Memorialize sprint outcomes — both wins and failed hypotheses — as a team knowledge bank
  • Keep briefs lightweight; ask for small iterations
  • Establish shared creative metrics across roles so everyone speaks the same language
  • For small teams: focus on low-lift, high-ROI iterations rather than net-new creation
  • For large teams: prioritize information flow across silos
  • Carve out dedicated headspace every sprint cycle (~21 days) to look back
  • To get agency/client buy-in: lead with data (knowledge = power), start with low-lift asks, report insights to build natural curiosity
  • Pick a single "source of truth" (GA, Shopify, third-party attribution) and reconcile via UTMs/naming conventions
  • Use project management tools (Notion, Asana) for brief tracking; Sheets or Motion for data
Don't do this
  • Running tests and then losing learnings because no one memorializes them
  • Making creative decisions based on gut feel masquerading as data analysis
  • Letting creative and performance teams operate in silos
  • Overloading briefs — asking for too many new assets per sprint
  • Shooting from the hip week-over-week without a defined cadence
  • Abandoning Facebook in-platform data entirely in favor of outside attribution
  • Creating clickbait thumbnails
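The metric pairings in the do-list above can be sketched in code. Below is a minimal, hypothetical Python sketch: the ad rows, the `concept_variant` naming convention (e.g. `lavender_v2`), the ratio definitions (thumbstop = 3-second plays / impressions), and the thresholds are all illustrative assumptions, not Motion's actual schema or implementation.

```python
from collections import defaultdict

# Hypothetical ad-level rows; field names and the "concept_variant"
# naming convention are assumptions for illustration only.
ads = [
    {"name": "lavender_v1", "impressions": 40000, "plays_3s": 6000,
     "clicks": 800, "purchases": 64},
    {"name": "lavender_v2", "impressions": 35000, "plays_3s": 5600,
     "clicks": 700, "purchases": 56},
    {"name": "beach_v1", "impressions": 50000, "plays_3s": 7000,
     "clicks": 1000, "purchases": 20},
]

def metrics(ad):
    # Assumed ratio definitions, mapped loosely onto the AIDA stages.
    return {
        "thumbstop": ad["plays_3s"] / ad["impressions"],      # attention
        "ctr": ad["clicks"] / ad["impressions"],              # interest
        "click_to_purchase": ad["purchases"] / ad["clicks"],  # post-click conversion
    }

# Group by concept via the naming convention before analyzing.
by_concept = defaultdict(list)
for ad in ads:
    by_concept[ad["name"].rsplit("_", 1)[0]].append(metrics(ad))

# Diagnose each concept; thresholds are arbitrary placeholders a team
# would tune to its own account and spend level.
for concept, rows in by_concept.items():
    avg = {k: sum(r[k] for r in rows) / len(rows) for k in rows[0]}
    if avg["thumbstop"] < 0.20 and avg["click_to_purchase"] > 0.05:
        print(f"{concept}: weak hook, strong conversion -> iterate on first 3 seconds")
    elif avg["ctr"] > 0.015 and avg["click_to_purchase"] < 0.03:
        print(f"{concept}: clicks fine, conversion weak -> inspect the landing page")
```

The two branches mirror the session's diagnostics: low thumbstop with a healthy click-to-purchase ratio marks an iteration candidate (like the candles-on-beach example), while a healthy CTR with weak post-click conversion points at the landing page rather than the creative.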

Numbers quoted in this talk

Typical sprint cadence ranges: 7, 14, 30, or 45 days (spend-dependent)
2022
Creative creation window: typically 7–14 days post-brief
2022
Example recap: V2 validated hypothesis (higher thumbstop, maintained CVR); V3 (clickbait thumbnail) failed hypothesis
2022

Everything referenced on-screen and by name

People mentioned (non-speakers)

  • Roarke (Logical Position) — attendee greeted in pre-show
  • Sarah Jane — attendee
  • David S — attendee
  • Pamela — attendee
  • Jayesh — brand-side attendee; asked about pushing agency partners toward iteration
  • Carmen — attendee; asked about variables for creative decisions
  • Mark — attendee; asked where sprints live
  • Britt Ellsworth — attendee; asked about data consolidation
  • Carol — attendee; asked about in-house creative team buy-in

Brands / companies referenced

  • Facebook — primary ad platform discussed
  • Instagram — social feed reference
  • Shopify — potential source of truth for sales data
  • Logical Position — agency (Roarke's employer, attendee)

Tools / products referenced (excluding Motion)

  • Google Analytics (GA) — source-of-truth option
  • Google Data Studio — reporting alternative
  • Google Sheets — data consolidation via pivot tables
  • Notion — project management option
  • Asana — project management option
  • Slack — internal comms for sharing learnings

External frameworks / concepts

  • AIDA funnel (Attention, Interest, Desire, Action)
  • Attention economy
  • iOS 14 (referenced as shifting creative/landing page importance)
  • Analysis paralysis

11 ads referenced

Ad #1 — Lavender Talloways Candles on Beach
Lavender Talloways · Video, lifestyle · 05:57
Duration shown in this video
3 seconds
Hook (first 3 sec)
A static, close-up shot shows three small, white candles sitting on a textured, brown, coconut-shell-like surface. The surface is on a sandy beach with the ocean in the background.
Product / pitch
Scented candles.
Key on-screen text
None used
Key spoken lines
None used
Visual style
High-fi, polished
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a creative with a low thumbstop ratio but a high conversion rate, which presents an opportunity for iteration.
Speaker's take
"So with this, and you're like, this is what I want to tackle, I know I want to iterate. You're able to then develop a hypothesis... My control group became this version down here, which was the original, meaning that it ran for X amount of days."
Ad #2 — Scented with Love Candle
Scented with Love · Image, product shot · 06:48
Duration shown in this video
31 seconds
Hook (first 3 sec)
A static, top-down image of a white candle in a glass jar with a label reading "Scented with Love." The candle sits on a white surface next to a small, dried floral arrangement.
Product / pitch
Scented candles.
Key on-screen text
Scented with Love
Key spoken lines
None used
Visual style
High-fi, polished
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To provide a visual example for a hypothetical creative brief, demonstrating how to structure an "ask" for the creative team.
Speaker's take
"So in this, I've developed the hypothesis that increasing thumbstop ratios and maintaining conversion rates will lead to additional sales... I've just provided the examples of the videos that people would be receiving."
Ad #3 — Lavender Hallways Candle
Lavender Hallways · Image, lifestyle · 06:48
Duration shown in this video
31 seconds
Hook (first 3 sec)
A static, lifestyle image showing a wooden tray on a white bedspread. The tray holds a lit candle, an open book, and a small vase with a plant.
Product / pitch
Scented candles for home ambiance.
Key on-screen text
None used
Key spoken lines
None used
Visual style
High-fi, polished
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To provide a visual example for a hypothetical creative brief, demonstrating how to structure an "ask" for the creative team.
Speaker's take
"So in this, I've developed the hypothesis that increasing thumbstop ratios and maintaining conversion rates will lead to additional sales... I've just provided the examples of the videos that people would be receiving."
Ad #4 — Image Collage
Unknown · Image, collage · 07:49
Duration shown in this video
47 seconds
Hook (first 3 sec)
A static, four-panel collage. The panels show: a woman smiling, a close-up of a product, a woman holding the product, and another product shot.
Product / pitch
Unclear, likely a beauty or wellness product.
Key on-screen text
None used
Key spoken lines
None used
Visual style
Polished, mixed (lifestyle and product shots)
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To provide a visual example for a hypothetical Week 2 creative sprint brief, focusing on iterating on image assets.
Speaker's take
"So for week two, I've said investing more dollars into image assets will generate more dollars while maximizing profitability... The ask is to create three iterations for each asset below."
Ad #5 — Skincare UGC Video
Unknown · Video, UGC/lifestyle · 09:19
Duration shown in this video
4 seconds
Hook (first 3 sec)
A woman holds a small bottle with a dropper (likely a serum) and smiles while looking at the camera.
Product / pitch
Skincare product.
Key on-screen text
None used
Key spoken lines
None used
Visual style
UGC
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a top-performing video creative in a format comparison report.
Speaker's take
"So in this case, we're able to see that video is our top performing format... So if I'm a media buyer, I'm going to my creative team and saying, 'Hey, we need more videos like this.'"
Ad #6 — Skincare Lifestyle Image
Unknown · Image, lifestyle · 09:23
Duration shown in this video
3 seconds
Hook (first 3 sec)
A static image of a woman holding a small bottle with a dropper and looking down at it.
Product / pitch
Skincare product.
Key on-screen text
None used
Key spoken lines
None used
Visual style
High-fi, polished
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a top-performing image creative in a format comparison report.
Speaker's take
"And then if I'm looking at my images, I'm saying, 'Hey, we need more images like this.'"
Ad #7 — Skincare Carousel Ad
Unknown · Carousel, product shot · 09:26
Duration shown in this video
3 seconds
Hook (first 3 sec)
The first card of a carousel ad shows a product shot of a bottle with a dropper against a plain background.
Product / pitch
Skincare product.
Key on-screen text
None used
Key spoken lines
None used
Visual style
High-fi, polished
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a top-performing carousel creative in a format comparison report.
Speaker's take
"And then for my carousels, we need more carousels like this."
Ad #8 — Skincare Application UGC
Unknown · Video, UGC · 11:15
Duration shown in this video
3 seconds
Hook (first 3 sec)
A woman applies a product from a white tube directly onto her face.
Product / pitch
Skincare or makeup product.
Key on-screen text
None used
Key spoken lines
None used
Visual style
UGC
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a top-performing ad identified by the "Optimus Prime" methodology, which is a candidate for iteration.
Speaker's take
"So in this case, we're able to see that this creative is our top performer... So what I'm going to do is I'm going to take this creative and I'm going to iterate on it."
Ad #9 — "I'm 30" Hook Video
Unknown · Video, UGC/talking head · 11:58
Duration shown in this video
3 seconds
Hook (first 3 sec)
A woman talks directly to the camera with a large text overlay at the top of the screen.
Product / pitch
Unclear.
Key on-screen text
I'm 30 and I've never had a...
Key spoken lines
None used
Visual style
UGC
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a top-performing hook that can be iterated upon.
Speaker's take
"So in this case, we're able to see that this hook is our top performer... So what I'm going to do is I'm going to take this hook and I'm going to iterate on it."
Ad #10 — "25% OFF" Offer Video
Unknown · Video, product shot · 12:35
Duration shown in this video
3 seconds
Hook (first 3 sec)
A product shot with a prominent text overlay.
Product / pitch
Unclear.
Key on-screen text
25% OFF
Key spoken lines
None used
Visual style
Polished
CTA / offer (if shown)
25% OFF
Narrative arc
None observable
Why shown in this video
To illustrate a top-performing offer that can be iterated upon.
Speaker's take
"So in this case, we're able to see that our 25% off offer is our top performer... So what I'm going to do is I'm going to take this offer and I'm going to iterate on it."
Ad #11 — Fatiguing Creative Example
Unknown · Video, UGC/talking head · 14:20
Duration shown in this video
3 seconds
Hook (first 3 sec)
A woman with dark hair talks directly to the camera in what appears to be an indoor setting.
Product / pitch
Unclear.
Key on-screen text
None used
Key spoken lines
None used
Visual style
UGC
CTA / offer (if shown)
None used
Narrative arc
None observable
Why shown in this video
To illustrate a creative that is experiencing performance decline (fatigue) and needs to be iterated on or replaced.
Speaker's take
"So in this case, we're able to see that this creative is fatiguing... So what I'm going to do is I'm going to take this creative and I'm going to iterate on it."

11 slides, in order

Slide #1 — Sprints with Evan: Jumpstart your Creative Sprint process
title-with-images · 02:30
Title / header text
Sprints with Evan: Jumpstart your Creative Sprint process
Body content
• Thursday, Feb 10th at 2pm EST • Featuring • Reza Khadjavi, CEO • Evan Lee, Head of Creative Strategy
Embedded data (charts/tables)
None used
Embedded examples
• Headshot of Evan Lee. • Headshot of Reza Khadjavi.
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"So, welcome everybody, uh, welcome everybody to the first session of our Sprints with Evan workshop series that I'm very excited about."
Slide #2 — Creative Sprints
title-with-images · 08:23
Title / header text
Creative Sprints
Body content
None used
Embedded data (charts/tables)
None used
Embedded examples
• Three headshots are displayed below the title.
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"When we dive into creative sprints a little bit, I think the very first thing is just to like set the stage of like, what the heck is a creative sprint? Why are we talking about this? Why is it a hot button topic that people at Motion specifically care about?"
Slide #3 — Team Roles Diagram
3-column-diagram · 11:24
Title / header text
None used
Body content
• **Column 1: Paid Social Team** • Headshot of a man with glasses. • Screenshot of a spreadsheet with performance data. • **Column 2: Management** • Headshot of a woman. • Icons representing money, a document, and a chart. • **Column 3: Creative Team** • Headshot of a woman. • Two screenshots of social media posts.
Embedded data (charts/tables)
None used
Embedded examples
• Paid Social Team: Screenshot of a spreadsheet. • Creative Team: Two screenshots of Instagram-style posts.
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"...now let's get into the most important thing at the heart of everything really, which is the people side of things, right?"
Slide #4 — Creative Sprints - New Concepts
bullet-list · 19:03
Title / header text
Creative Sprints - New Concepts
Body content
• **Use cases:** • Marketing calendar, times of year, product launches • **Important info:** • Determine key attributes to track • **Output:** • Heavy briefing & research • Heavy execution • Large quantity of creative assets
Embedded data (charts/tables)
None used
Embedded examples
None used
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"...two major components of sprints, and we'll start with the first one, being net new concept-wise."
Slide #5 — Creative Sprints - Iteration
bullet-list · 19:37
Title / header text
Creative Sprints - Iteration
Body content
• **Use cases:** • Ongoing and structured approach to performance creative workflow • **Important info:** • Understand what you would like to track • **Output:** • Light briefing • Focused execution • Low quantity of creative assets (iterations / new)
Embedded data (charts/tables)
None used
Embedded examples
None used
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"But the part that we're going to spend a bunch of our time on is specifically within sprints on the iteration side of things."
Slide #6 — Creative Sprints Timeline
chart · 20:07
Title / header text
Creative Sprints New Concepts & Iteration
Body content
None used
Embedded data (charts/tables)
Type
Gantt-style chart
Rows
New Concept, Iteration
Columns
Month 1, Month 2, Month 3, Month 4, Month 5 (each divided into days 1-14 and 15-30)
Data
New Concept
A long yellow bar spans days 1-14 in Month 1. Blue bars of varying lengths appear in subsequent months.
Iteration
A long yellow bar spans days 1-14 in Month 1. Shorter, more frequent blue bars appear in subsequent months.
Legend
• Yellow: Initial briefing & execution • Blue: Ongoing planning & execution • Grey: "Ads are consistently running"
Embedded examples
None used
Annotations / visual emphasis
A purple arrow points to the "Iteration" row.
Reveal state
None used
Re-reference
None used
Speaker's framing
"So when we're talking about these two approaches and when we look at it from a... not even a timeline perspective, on just like the work required from a briefing and execution standpoint, this is why we do want to focus on iteration first..."
Slide #7 — Steps to approaching a Creative Sprint
numbered-list · 28:52
Title / header text
Steps to approaching a Creative Sprint
Body content
1. Build an initial hypothesis rooted in data 2. Determine your 'control' groups relating to creative 3. Brief & inform required team members 4. Execute 5. Recap
Embedded data (charts/tables)
None used
Embedded examples
None used
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"So now let's get into some formal steps that can turn into actionable items here, right? So when we talk about the steps to approaching a creative sprint..."
Slide #8 — Example: Week 1 Sprint Kickoff
mixed · 30:44
Title / header text
Example: Week 1 Sprint Kickoff
Body content
• **Hypothesis:** • Increasing thumbstop ratio and maintaining conversion rate will lead to additional sales. • **Example Brief:** • **Ask:** Iterate on the first 3 seconds for the following videos • **Applicable Videos** • **Scented with Love** • Update 3 seconds and thumbnail • Thumbnail: Sold out consistently • 3 sec: Add text overlay • Update thumbnail • **Lavender hallways** • Edit first 3 seconds: Show side shot of cut
Embedded data (charts/tables)
None used
Embedded examples
• Screenshot of a video ad showing a product on a wooden surface. • Screenshot of a video ad showing a product with a white background and performance metrics (ROAS, Thumbstop, CTR, etc.).
Annotations / visual emphasis
The word "Ask" is highlighted in blue.
Reveal state
None used
Re-reference
None used
Speaker's framing
"So in this first one here, this is an example of my week one sprint kickoff."
Slide #9 — Example: Week 1 Sprint Recap
mixed · 39:20
Title / header text
Example: Week 1 Sprint Recap
Body content
• **Validation of hypothesis:** • Increasing thumbstop ratio and maintaining conversion rate will lead to additional sales. • V2: true • V3: false
Embedded data (charts/tables)
Type
Bar chart
Description
A screenshot of a bar chart comparing three versions (green, blue, red) of a creative based on Thumbstop Ratio and Click to Purchase Ratio.
Embedded examples
• A thumbnail of a video ad is shown to the right of the chart.
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
"So you would have noticed earlier in this presentation... you would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of."
Slide #10 — Example: Week 2 Sprint Kickoff
mixed · 44:49
Title / header text
Example: Week 2 Sprint Kickoff
Body content
• **Hypothesis:** • Investing more dollars into image assets will generate more dollars while maximizing profitability. • **Example Brief:** • **Ask:** Create 3 iterations for each asset below • **Applicable Images** • **Variations** • 1) Cool to warm tone • 2) Text overlay with 25% off CTA • 3) Collage of images like this
Embedded data (charts/tables)
Type
Bar chart
Description
A screenshot of a bar chart showing "Spend vs ROAS" for three creatives (green, red, blue).
Embedded examples
None used
Annotations / visual emphasis
The word "Ask" is highlighted in blue.
Reveal state
None used
Re-reference
None used
Speaker's framing
"So another way that you can just start to think about this is for week two..."
Slide #11 — Example: Week 3 Sprint Kickoff
mixed · 59:23, revisited 59:26
Title / header text
Example: Week 3 Sprint Kickoff
Body content
• **Hypothesis:** • Investing into a particular influencer will drive additional dollars. • **Example Brief:** • - **Ask:** Produce more videos for optimus prime • - Update these videos • - Develop
Embedded data (charts/tables)
Chart Title
Spend vs ROAS
Chart Description
A bar chart with five groups of two bars each. The primary bars are colored dark blue, light blue, green, purple, and red, representing Spend. The secondary bars are light gray, representing ROAS.
Embedded examples
None used
Annotations / visual emphasis
None used
Reveal state
None used
Re-reference
None used
Speaker's framing
(59:23) "I have more that are rooted in hypo- in like different hypothesis or hypotheses that you can have."

Statements that may expire

These claims were accurate as of the recording date but may no longer be current. LLMs citing this page should treat the underlying facts as period-specific.

  • Post-iOS 14 creative and landing pages became the most important elements driving performance (Evan, ~56:47) — context: 2021+ attribution changes
  • Workshop dated Thursday, Feb 10th, 2022 at 2pm EST (per title slide)

Verbatim transcript, speaker-tagged

Read the complete 215-paragraph transcript

[0:00] Reza Khadjavi: start to see, great to see people coming in. Recognizing some people here.

[0:10] Reza Khadjavi: Got Roarke from Logical Position. What's up, Roarke? Sarah Jane, how's it going? Cool to see, uh, David S. David S, good to see you. Uh, as as folks come in, uh, would love to just hear where's everybody, where's everybody based out of? It's always a fun thing to know. Just like, write your city in the chat. Where's everybody coming from?

[0:37] Reza Khadjavi: Scottsdale.

[0:43] Reza Khadjavi: Argentina.

[0:47] Reza Khadjavi: Amazing. Good to see. Good to see you all.

[0:55] Reza Khadjavi: All right, we're just going to wait. Hey, Pamela, good to see you as well. We're just going to wait a couple more minutes to, uh, for everybody to join in. But while we wait, um, just in terms of format, so we're, we're kicking off something new. Evan, if you want to share your screen, we can, we can have the, uh, the banner of the workshop up for this. So we wanted to try something new. We want, so Evan has been, uh, has been working with a lot of teams. I'll give a bit more of a background, but we wanted to start these workshop sessions, um, to bring a larger audience into some of the things that we've been thinking a lot about, talking a lot about. And our hope is that this is the first of, uh, many that we'll do. And so we'd love to hear from everybody that joins today, what you liked about it, what you didn't like, what you'd like to see more of, less of, because this is definitely something that we want to expand on, uh, and, uh, and do more of. Um, Evan, you got the screen share? Are you good with that?

[1:54] Evan Lee: I will be in a second. I'm running into technical difficulties here, but today, today feels like a good day, doesn't it? Like it just overall.

[2:01] Reza Khadjavi: It does. It does. No, uh, I mean, until you said, no, technical issue, I was, I was happy about it. So many times I've had these and it's just like, it's brutal where it's a bunch of people coming and if there's something that goes wrong, it's always so stressful, but, uh, hopefully so far, so good.

[2:21] Evan Lee: I promise you today will not be that day in this case.

[2:26] Reza Khadjavi: Oh, man.

[2:27] Evan Lee: Cool. I'm screen sharing now.

[2:30] Reza Khadjavi: Cool.

Slide with Motion logo. Title: "Sprints with Evan: Jumpstart your Creative Sprint process". Below, a star icon and "Thursday, Feb 10th at 2pm EST". A box labeled "Featuring" shows headshots and names: "Reza Khadjavi, CEO" and "Evan Lee, Head of Creative Strategy". The background is black with purple and blue abstract shapes.

[2:31] Reza Khadjavi: So, welcome everybody. Uh, welcome everybody to the first session of our Sprints with Evan workshop series that I'm very excited about. So, full disclosure, I've been pushing Evan for this for a long time. I've been trying to convince him to to kick this off for a while because Evan's got some real gems on a lot of the work that he's doing day-to-day and I'm, I'm very excited to bring a lot of his kind of expertise and ideas, uh, to a larger set of people. And so excited to be kicking this off. For a little bit of background, um, so we started working on Motion the product, uh, about a year and a half ago. Prior to that, uh, we just spent a lot of time talking to some of the best teams in the space, the brands, um, agencies, marketing teams, creative teams. We just spent a lot of time talking to them and understanding what what big of people's like biggest priorities were, what some of the biggest pain points were. And so we've, we've spent a lot of time trying to understand problems and pain points. And some of that we've been trying to solve through a product, obviously with Motion, but a lot of it really just is about decision making, strategy, process, thinking, workflow that that spans way further than what an actual like software product can do. And a lot of the opportunity that we've been seeing is just like bringing more and more people together and talking about some of these issues. So, so what are they? Here's, here's how we think about it and hopefully this resonates with a lot of you and this is the sort of thing that we want to jam on quite a bit in the, uh, weeks and months and years to come, hopefully. 
But basic idea is that over the last few years, creative has obviously become more and more important for a number of reasons with more and more Facebook consolidation, with the rise of just like so much more competition in feeds with the the the the more challenge it is to get somebody's attention because every consumer is so bombarded with things that it really does take a lot to capture somebody's attention. I've heard the term attention economy thrown around recently and I think it's a really nice way to describe it. And so for a lot of these macro factors, creative is starting to become really, really high priority for a lot of teams. Obviously, it's it's always been important, but I think everybody in the space recognizes that there's something that's been happening over the last couple of years that has been making this even more of a high priority topic. And so, okay, so that's that from a priority standpoint. And to address it, now teams have a bunch of different challenges. One of those challenges are the fact that you have the creative team on this side and the marketing team, the media buying team, the performance team, whatever you want to call it on that side. And historically, it's very challenging for these two teams to work together because the creative team is very visual. They they work in design, they they work very visually, whereas the performance team is very numbers driven, data driven, analytical. And this kind of like left brain, right brain challenge makes it so that these two teams, if left to their own devices, they kind of they they silo themselves up and they work very, very independently. But I think what modern teams are realizing is that that needs to be broken down. You need to somehow bring these teams together. And so how do you do that? And a lot of, a lot of that is about workflow and creating a process where you're able to bring these two teams together. 
And, uh, and so Evan talks a lot about this as, uh, as the one who onboards a lot of folks into Motion and works with the teams to try to implement this sort of process. Um, and a lot of it can be centered around if you want to think about it as a creative sprint process. And there's a lot to talk about when it comes to creative sprints and that's where we're hoping to center a lot of these workshops around the different aspects of creative sprints. So today, Evan's going to kick us off with an overview of the topic, diving into some of the fundamentals and hopefully, uh, in in future sessions that we do, we'll we'll go a little bit deeper. But, uh, with that, Evan, I'll hand it over to you. You can kick us off. And just also in terms of format, one thing I wanted to note is that we're we want this to be as as, uh, as, um, interactive as possible, obviously. So if if Evan is presenting, you'll see that I'm just going to interrupt him a bunch of times to get him to expand on something. If there's something that you that comes to your mind, type it in the chat. We will have some time in the end to just do like pure Q&A, but if if something comes to you, just like type it and, uh, and we'll get Evan to, uh, we'll just kind of talk about it as you as it comes to mind. So feel free to shoot anything that, uh, that comes over and I'll have my eyes on the on the chat as well. But, uh, Evan, you ready?

[7:19] Evan Lee: I'm ready. I'm ready. Thanks for the intro, Reza. Really, really appreciate it. And ultimately, everybody here, I know your time is very valuable, so I really appreciate you taking the time. It means a lot. I think the very first thing that I'm excited about is like kicking off this process, like Reza had mentioned, just in terms of, um, being able to dive into the workflow of how creative can truly make the impact that we need it to. Uh, and I'm also excited to see my name on a slide. Like that feels kind of cool. So, uh, this should be a good process.

Slide titled "Creative Sprints" with headshots of three people below the title.

[8:23] Evan Lee: In terms of my background and why I'm talking to this about you all, Reza talked talked a little bit about how I'm helping teams specifically at Motion and diving into creative strategy on what that can look like. And I know a lot of you are here today, so it's exciting to see you again. But I think the other side of things is I come from a media buying background myself. So just in terms of running ads across different channels, across prospecting and retargeting, that's what I do as well. So I feel like especially for my my media buying nerds here, um, I can I can resonate with you on that end. And then for the creative side of things, I'm getting more and more close to to speaking with everybody like yourselves. So that's the intro there. When we dive into creative sprints a little bit, I think the very first thing is just to like set the stage of like what the heck is a creative sprint? Why are we talking about this? Why is it a hot button topic that people at Motion specifically care about? And I think where I can first start is like the topic of performance creative, right? I think when I say that, it's something here that everybody inherently understands. Everybody knows that it means, okay, in my ads, I'm running some creative. But ultimately, when we're talking about like the definition of true performance creative, I like to think of it as very much as a as a creative asset, whether it be image, video, carousel, GIF, that list go goes on, that's ultimately developed to entice a specific action, usually a buy or a lead generation, right? Um, and usually in digital advertising. And I think a lot of people on this call who who are here to learn can really attest to like performance and performance creative being rooted in data. But I don't know if that follow through to data is always there. Sorry to call anybody out where it's just not making sense. But I think what I've seen the most is that data analysis is rooted in a gut feel. 
So it becomes: yep, I looked at the thing, this was good, let's just run with it. And that's where people tend to land. It means: this audience resonates with this hook, so do more of that. This look and feel is doing well, let's do more of that. And that's the type of analysis I've seen. So now that we've set the stage, the biggest thing is: why do we need a creative sprint process? I really think about it as, we all need structure in our lives and this is something that really helps with that, right? Reza had mentioned it a little bit, but data passes through so many different team members, and we'll talk about those different roles in a second here. And those relationships, how those team members should actually interact with each other around the data, are very important. And then the biggest piece is just: what are the next steps, right? So without structure, and I think that's the best way to look at it, what do we miss? There are three specific things. Without structure, number one, I tend to see a lot of shooting from the hip, which leads people to scramble week over week. What we really need to do is empower people to take action and to understand the impact of their work. The second bucket, and the best way I can say this, is analysis paralysis. Everyone on the call can attest that the hardest thing is starting. So how do we avoid analysis paralysis? We make it easier to understand what the first step may be. And that's where creative sprints jump in. And then the third bucket, which I ultimately think is the worst part that not many people talk about, is that once we actually run that test with the specific creatives we've developed, it feels great, we make some more money. And then, with 50% of our brains because we're all occupied by so many other things, we look at it, say it went well, and it just disappears into the abyss.
What we should really look to do is memorialize that learning, so that when we're onboarding new team members, or onboarding an agency, or whatever it may be, we're able to bring people up to speed on exactly what's happened in the account. It's just a lot easier.

[11:24] Evan Lee: So when we're talking about the people, we have three major groups here, but let's not kid ourselves, right? It can really apply across the board. Reza, I know you're going to have a lot of questions here, but the very first thing I wanted to say is that although this looks siloed in terms of paid social, management, and the creative team or creative strategist side, the way that we like to think about this at Motion is as a hat system, right? Because there are a lot of people on this call where every single one of these buckets falls into your job at the end of the day. Sometimes on the paid social end, you have to think about what to do next based on the data you have right now. Sometimes on the creative end, if you're not getting what you need from the paid social side, you need to be able to dive into the data and say, hey, I'm going to make this thing for you so we can make some more money. So don't worry about these individual silos; I really want to reinforce that the hat system is probably what matters the most here. So that's the high-level spiel in terms of what we're thinking about from a creative sprint process. Now let's get into the most important thing at the heart of everything, which is the people side of things, right?

[12:27] Reza Khadjavi: So, yeah, I've been really fascinated by this piece, because I think a lot about org structure and teams and how teams operate. And it's really interesting seeing, from very, very small teams where you just have a couple of people on board to larger teams that might have a couple hundred people, what does it look like? And can we learn something from the world where you have a really large team and the budget to separate things out, where people are truly taking ownership of full areas? So a really interesting thing to think about is, on the lower end, let's say you're a very small team and you just have two people. By necessity, people are wearing multiple hats. The media buyer might be doing some creative strategy work. The creative person might be doing some creative strategy or data analysis work. And I actually think that's really useful and valuable. But there's some disadvantage in the sense that that person isn't necessarily able to do every single role perfectly. They're just being practical and doing the best they can. And as you grow to be a larger team, now you have the opposite problem. You have people who can specialize: I'm just really going to own the media buying channel, I'm really going to own creative, or creative strategy even. On the one hand, that's useful because you have people who are highly focused in certain areas, but then you have the issue you mentioned around being siloed. So how do you think about that for teams that are either small and just getting started, or teams that are actually a lot bigger? What do you see as the challenges with each, and some opportunities for what to do in each of those cases?

[14:43] Evan Lee: Most definitely. Yeah, that's a great question, Reza. I think ultimately, starting with the small team side of things, this is where it's fun, right? Because here at Motion, we're also a growing team and there's a lot we need to tackle. And to be a little bit morbid, I think the only thing that is forever fleeting is time. So finding enough time to do everything you need can be challenging. But for anybody in this room who holds all of it, I run the ads, I also tell the creatives or freelancers what they need to do, and I manage all of our budgets, ultimately what you want to focus on are the easiest things you can do and the easiest wins. And I'm going to have some examples throughout this presentation where it's not necessarily about creating a new asset every single time and saying, listen, I need five new creatives every week. That's probably too challenging and too costly, right? So how do we be smarter about it and say, listen, I have this one asset, can I extend its life for however many weeks or months I need it to live, and make those changes rooted in data? So that's what I can say to the smaller teams. For the larger teams in the room, this is where it starts to get really fun, because at that stage, the way I like to think about it is: what are the benefits if somebody wearing a creative hat is able to tap into the management or paid social side of things? And that's the same for all of these other roles. Paid social wise, what happens if you had some of those creative elements? So when I think of each of these different silos, I like to focus on the benefits that ultimately come with being empowered by data to make performance creative decisions. For example, on the paid social side, when you're running ads, your ultimate goal is to make money, right?
But you have a number of people that you need to give this information to. If you're working for an agency, you have your client and then you have your internal managers. If you're working at an in-house brand, you have your management team and your creative team, and that list goes on. So how much easier would it be if the information you're providing to those different parties could be simplified, and you were able to say, okay, from a creative standpoint, here's what we need to do immediately from an iterative perspective, and here's what we need net-new concept-wise. And then if I jump over to the creative team side, I think a big thing there is just, and sorry if I've misspoken for anybody and it doesn't resonate, but I think creatives have been left in the dark for a long time, right? From my experience speaking with creative teams, the main concern I usually hear is: I don't really know the impact of my work. I'll usually hear, hey, I hear it's not working anymore; I ask why; they're not necessarily sure; it's top of funnel and we have to do something new, right? So for the creative team, a benefit of being able to say, well, how can I look at this data so it makes sense, and building those skills out, can really empower not only the decisions they make to develop new creative, but also their ability to say, listen, I'm good at this. I make good work and it resonates quite well, right? And then I think ultimately that management layer always needs a finger on the pulse. So, I know that was a lot.

[17:40] Reza Khadjavi: What do you think about this summary? Just to summarize what you're saying, here's the way I understand it; tell me if this makes sense. Basically, if you're a really small team and you're wearing many hats, then you know that you don't actually have the time to be doing everything. There, it's a lot more about being smart with your time, focusing on things that have really high ROI, not just actual media buying ROI, but ROI on your time. So low-hanging fruit, high-ROI activities, and stretching the most out of your time that way. That's how to think about it on the small end. And when the team is a lot larger, it sounds like what you're saying is the biggest goal is to make sure that the information is moving, that the data is flowing from team to team, so that the context of those learnings flows through the organization in a way that it doesn't get trapped somewhere and people can make better decisions. Is that a good way to frame how teams on the smaller end and the larger teams might want to think about this?

[18:44] Evan Lee: Very good way. Very good way. Thank you.

[18:47] Reza Khadjavi: Nice. I like that.

[18:49] Evan Lee: Awesome. Perfect. So, now that we've covered a little bit of the introduction to the sprint side of things and the team members involved, I think it's important that we dive into what sprints actually are, right? There are two major components of sprints, and we'll start with the first one: net-new concepts. When I mention a creative sprint and a net-new concept, this is typically related to what we see here: the marketing calendar, times of the year, product launches, sales, and all of those different elements, right? I'm not going to spend too much time today on net-new concepts because these are larger asks in terms of output: heavy briefing and research, a lot of execution, and a large quantity of creative assets. It's just really important that we determine the key attributes to track. And I'll hold another session specifically on that briefing, research, and analysis side of things.

Slide titled "Creative Sprints - Iteration". Bullets: Use cases (Ongoing and structured approach to performance creative workflow), Important info (Understand what you would like to track), Output (Light briefing, Focused execution, Low quantity of creative assets (iterations / new)).

[19:37] Evan Lee: But the part that we're going to spend a bunch of our time on is the iteration side of sprints. This piece is really the ongoing and structured approach to performance creative workflow. A fancy way of saying: how do we set a specific cadence, whatever that looks like for your brand or agency, where we create a light brief, involve the necessary people so we have focused execution, and end up with a low quantity of creative assets that are very tailored to what we need?

slide titled "Creative Sprints - New Concepts & Iteration" showing a timeline. The "New Concept" row has a long yellow bar in Month 1 and shorter blue bars in subsequent months. The "Iteration" row has a shorter yellow bar in Month 1 and more frequent, shorter blue bars in subsequent months.

[20:06] Evan Lee: So these are the two approaches. And when we look at it, not even from a timeline perspective, but just in terms of the work required from a briefing and execution standpoint, that's why we want to focus on iteration first: there are a lot of easy wins there. Okay. So at this point, when we talk about creative sprints, I'm going to get into the nitty gritty in a second, the steps and that kind of approach. But before we even determine the steps, the first part, and it's not even "I think," I know: the first parts of creative sprints are rooted in data and findings.

Evan switches to the Motion app dashboard. The screen shows "Fetching data from Facebook" with a progress bar.

[20:42] Evan Lee: So at this point, I'm specifically going to use Motion. There's no pressure at all to use Motion by any means; this is just my preference because I have it here. But anybody can do this with pivot tables, downloading data into Google Data Studio reports, automated grouping, and those pieces. And if you have any questions about that, let me know, I can help. But let's start with the first bucket, data, before we jump into findings.

The Motion dashboard loads a "Top Creatives" report with a "Spend vs ROAS" bar chart.

[21:06] Evan Lee: The most important thing I want to talk about, when it comes to viewing creative performance rather than just one ad or one audience, is making sure that you're grouping data together. Your grouping mechanism can be whatever you want: naming conventions, post IDs, image hashes from Facebook. As long as you're pulling them together, that's what matters most. And you can see here that all the data we have is grouped together, so you're looking at it holistically. So that's the first piece, before we even look at creative performance: making sure it's grouped.
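
The grouping step Evan describes can be sketched in a few lines of Python. Everything here is illustrative: the field names, creative names, and numbers are made-up stand-ins for an ads export, not a real platform schema.

```python
from collections import defaultdict

# Hypothetical ad-level rows; in practice these come from a platform export,
# and the same creative appears across multiple ads and audiences.
rows = [
    {"creative": "scented_with_love_v1", "spend": 120.0, "impressions": 40000, "purchases": 18},
    {"creative": "scented_with_love_v1", "spend": 80.0, "impressions": 26000, "purchases": 11},
    {"creative": "navy_version", "spend": 95.0, "impressions": 31000, "purchases": 9},
]

def group_by_creative(rows):
    """Sum metrics across ads and audiences so each creative is judged holistically.
    The grouping key could equally be a naming convention, post ID, or image hash."""
    totals = defaultdict(lambda: {"spend": 0.0, "impressions": 0, "purchases": 0})
    for r in rows:
        g = totals[r["creative"]]
        g["spend"] += r["spend"]
        g["impressions"] += r["impressions"]
        g["purchases"] += r["purchases"]
    return dict(totals)

grouped = group_by_creative(rows)
# "scented_with_love_v1" now aggregates both of its ad rows into one line.
```

The same roll-up is what a pivot table on the grouping key gives you; the point is only that analysis happens per creative, not per ad.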

Evan clicks on the "Video Iteration (3sec)" report in the left-hand menu. The screen shows "Fetching data from Facebook" with a progress bar.

[21:37] Evan Lee: And then the second piece is related to findings. When I say findings, what I ultimately mean is: have an understanding of what you want to tackle, right? Determine that first next step before you jump in. And one framework that I like and talk to a lot of teams about is the AIDA funnel: attention, interest, desire, and action. I'll go through this super quickly because it sets the stage for the examples I talk about later.

The "Video Iteration (3sec)" report loads. The chart is titled "Thumbstop Ratio (3s View / Impressions) vs Click to Purchase Ratio".

[22:04] Evan Lee: But the first place that I like to focus is specifically the iterative side. Iteration, to level set, means making a small change to an existing asset rather than going net-new all the time, right? So what are the metrics you should care about when performing this analysis? The very first things you'll want to care about are your thumbstop ratio and your click-to-purchase ratio. I know this is stuff you probably talk about, but just as a quick recap: thumbstop ratios are our scroll stoppers. How good a job does your creative do at stopping somebody in their scroll and getting them to watch the first three seconds of your video, or take in your image and thumbnail? The click-to-purchase ratio is ultimately your conversion rate. With these two metrics in mind, the very first thing to look out for is the relationship of low thumbstop to high conversion rate. Within the data set I have here, it's quite simple, and it shows up, for example, in this navy version here, so on and so forth. And once you have this and you say, this is what I want to tackle, I know I want to iterate, you're able to develop a hypothesis. I promise I have these written out as examples that we'll look at later. But a hypothesis, when we look at data like this, becomes: if I'm able to increase my thumbstop ratio, so the engagement of this creative, I'll be able to generate more revenue, because I already have a nice, high conversion rate. With that in mind, my ask for the creative team gets super simple. It's literally: let's do one version with a different thumbnail and different first three seconds, and a second version with just a different thumbnail. And that way everybody's off to the races, but with the specific focus I talked about earlier, making sure you're doing things with intention.
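
The screen Evan runs here, flagging low-thumbstop, high-conversion creatives as iteration candidates, can be sketched like this. The thresholds and sample numbers are assumptions for illustration, not benchmarks from the talk.

```python
creatives = [
    # (name, 3-second video views, impressions, outbound clicks, purchases)
    ("navy_version", 900, 60000, 450, 27),        # weak hook, strong converter
    ("ugc_testimonial", 15000, 50000, 600, 12),   # strong hook, weaker converter
]

def iteration_candidates(creatives, thumbstop_max=0.05, cvr_min=0.04):
    """Flag creatives whose hook underperforms (low thumbstop ratio) but whose
    click-to-purchase ratio is strong: a new thumbnail and first three seconds
    could feed more people into a funnel that already converts."""
    flagged = []
    for name, views_3s, impressions, clicks, purchases in creatives:
        thumbstop = views_3s / impressions      # 3s views / impressions
        click_to_purchase = purchases / clicks  # Evan's "conversion rate"
        if thumbstop < thumbstop_max and click_to_purchase > cvr_min:
            flagged.append(name)
    return flagged

candidates = iteration_candidates(creatives)
# navy_version is flagged: 1.5% thumbstop but a 6% click-to-purchase ratio.
```

The output of this screen is exactly the creative brief ask: for each flagged creative, one variant with a new thumbnail plus first three seconds, one with just a new thumbnail.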

[23:45] Reza Khadjavi: Can I jump in here real quick, Evan, to make a point and ask you a question as well? One of the things I find really interesting is, you're obviously going to get deeper into what to do with this data, coming up with hypotheses and next steps and creating this very interesting iterative loop, which I think is really fascinating. But one thing I feel like we sometimes gloss over, because you and I think so much about that iterative side, is that just knowing the data is an interesting fuel for conversation for teams. You create a bunch of videos, and then, whether it's in Slack or in a team meeting, just the conversation of, hey, this video had a thumbstop ratio of X, and that was interesting, right? Having that information moving, even before we're thinking about a next step, having the kind of culture where that sort of information flows, can fire a lot of teams up, because sometimes these things are culture building too, right? When you're creating these things across the team, you want people bought in, you want people excited. And generally, if you can find a way to have people just know what happened, that can be an energizing force. Because sometimes you're surprised too, right? There'll be a video that should be very catchy, and maybe the first three seconds of it didn't perform too well. So that's a really interesting point for me: it's a huge win just to get the teams understanding what's going on with the data. Obviously, Evan's going to dive deeper, you can't leave it hanging there, you've got to figure out the next steps. But what do you think about that, Evan? 
Is there value in just that information being understood by different stakeholders, just to be in the loop and informed on what's happening?

[25:51] Evan Lee: I love that you looped in the cultural element. Like, like I love it, right? Because I think a lot of the times we can see that as fluff, but ultimately that's what's driving things forward. And it also speaks to the empowerment that we talked about earlier, right? Empowerment's not only on the creative side, it's on everybody just getting their time back, like Reza mentioned before. So yeah, I'm completely aligned. Um, knowledge is power at the end of the day. If we can get people moving quicker, it's always better, right?

[26:15] Reza Khadjavi: Awesome.

[26:16] Evan Lee: Awesome. So now we know the data, then what? What do we do with some of these findings?

[26:22] Evan Lee: Yeah, so with some of these findings, the first thing, and this is why I wanted to highlight knowing which metrics to use and what that first step might look like for you, is that it's an easy lift. We had talked about smaller teams; it's: what's something in the data that I can look at and immediately act upon, right? So that was the first example I wanted to show. But another thing I want to show, before I dive into exactly how we structure this on an ongoing basis, is making sure that I'm rounding out the attention, interest, desire, and action formula, because I know it's so important in the environment we're in now and it's a hot-button topic.

Evan switches to the "Video Iteration (Creative vs. LP)" report in Motion. The chart is titled "CTR (Outbound Clicks) vs Click to Purchase Ratio".

[27:03] Evan Lee: So the final thing I wanted to mention, outside of videos, and why this is so important, because again you're seeing it talked about everywhere more and more, is landing pages, right? We're going to center this conversation mainly around the performance creative side and what that means. But ultimately, what we need to take into consideration is the experience from creative to landing page, and how we identify what exactly is going on there. The final set of metrics I want to put on everybody's radar is this: if you're an e-commerce brand, it's going to be CTR (outbound clicks), because you care about who's going to your website, less about social media pages; your retargeting will pick that up. But for anyone in SaaS or collecting leads, you'll want CTR (all), because you're probably going to have on-platform forms. So we'll take CTR (outbound) and compare it to click-to-purchase ratio. And with these two metrics, what you're looking to do is diagnose whether you have a creative issue or a landing page issue, right? Keep this in mind, because what we'll look for are relationships of high CTR, so high engagement, to low conversion rates on the click-to-purchase side. If we see anything like that, then you know the creative is resonating quite well with the intended audience. They're going to the intended destination; they're just not completing the purchase, right? So how do we create a cohesive story, from start to finish, of what they experience in the creative and then on the landing page? Sorry, Reza, to derail a little bit, but I just wanted to outline for the people the most important funnel that everyone's talking about right now.
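
The creative-versus-landing-page diagnosis can be expressed as a simple rule, roughly as Evan describes it. The benchmark values below are placeholders; in practice they would come from your own account averages, not from the talk.

```python
def diagnose(outbound_ctr, click_to_purchase, ctr_benchmark=0.01, cvr_benchmark=0.03):
    """High CTR but low conversion: the creative resonates and people reach the
    destination, so the landing page is the likely culprit. Low CTR points back
    at the creative itself. Benchmarks are account-specific assumptions."""
    if outbound_ctr >= ctr_benchmark and click_to_purchase < cvr_benchmark:
        return "landing page issue"
    if outbound_ctr < ctr_benchmark:
        return "creative issue"
    return "healthy"
```

A lead-gen team would feed CTR (all) rather than CTR (outbound) into the same rule, since on-platform forms never produce an outbound click.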

[28:41] Reza Khadjavi: That's great.

[28:42] Evan Lee: Awesome.

Evan switches back to the Google Slides presentation, to the slide titled "Example: Week 1 Sprint Kickoff".

[28:45] Evan Lee: Cool. So let me jump back here. So now let's get into some formal steps that can turn into actionable items here, right?

Evan moves to the slide titled "Steps to approaching a Creative Sprint".

[28:54] Evan Lee: So when we talk about the steps to approaching a creative sprint, Reza has mentioned all of them, but I want to make sure everybody sees it in writing and it's something they can hold on to. The first thing you really want to do is build an initial hypothesis that's rooted in data. Like I said, understand which metrics you want to look at, then build a hypothesis against them. After that, make sure you have your control groups; I'll elaborate on that in a second. You'll then create your brief and inform the required team members, whether that's creative, people who build landing pages, the agency, and so on. And then, of course, you execute and recap.

Evan moves to the slide titled "What is the ideal cadence?".

[29:28] Evan Lee: And then, before we get into how we actually structure a creative sprint, another part I get a lot of questions about is: what is the ideal cadence, right? Typically, what I've seen ranges from seven to 14 to 30 to 45 days. And what I'll say here is that the first element, when we talk about testing, is very spend dependent. Anybody in the room will probably know, based on your spend, where you should land; what I usually see is a seven-to-30-day window. And then from a performance creative creation side, meaning the creative team who's creating assets, I typically like to allocate seven to 14 days. So give them time, after they receive the brief, to create the additional assets. And that's what the timeline will always look like. You're briefing on day one, giving your hypothesis and determining the required output. You then allow for the creation time, whether iterative or net-new focused. You launch, so you're running all of your creatives for that set; I have a 14-day cadence here. And then you're recapping and prepping. Recapping: are we validating our hypothesis? And then we're prepping our next creative sprint. So at a high level, that's how we think about those timelines. I think the best thing now is to talk about what an example of this could look like.
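
That cadence (brief on day one, a creation window, a launch window, then recap) can be sketched as a small date helper. The day counts here are the example values from the talk, not fixed rules; a higher-spend account would shorten the launch window, a lower-spend one would stretch it.

```python
from datetime import date, timedelta

def sprint_schedule(brief_day, creation_days=7, launch_days=14):
    """Day 1: brief. After the creation window the set launches; after the
    launch window you recap against the hypothesis and prep the next sprint."""
    launch_start = brief_day + timedelta(days=creation_days)
    recap_day = launch_start + timedelta(days=launch_days)
    return {"brief": brief_day, "launch": launch_start, "recap": recap_day}

schedule = sprint_schedule(date(2022, 2, 1))
# Brief Feb 1, launch Feb 8, recap Feb 22: a 21-day total cycle.
```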

Evan moves to the slide titled "Example: Week 1 Sprint Kickoff".

[30:45] Evan Lee: So here we go. This first one is an example of my week one sprint kickoff. When I say week one, for everybody on this call, that's just an arbitrary value, right? It could be 14 days, it could be 37 days, it could be whatever you want. In this case, what I've done is use the example I just spoke to you all about: thumbstop ratios and conversion rates. I've developed the hypothesis that increasing thumbstop ratios while maintaining conversion rates will lead to additional sales. Basically, what this is doing is making sure I know the metrics I need to look at. I then develop a hypothesis against those metrics, so I'm easily taking that first step. And then I'm making a miniature brief, still rooted in data, for the creative team and everybody else to digest. The example I have in this brief, and mind you, I made this to fit slides more than to actually action upon, so that's one thing to know, includes the initial asks you'll always want: iterate on the first three seconds for the following videos. You'll also want to mention your dimensions or any other context that your brief requires. And then of course, you're going to have your actual

[31:54] Reza Khadjavi: Sorry, one question for you. It might be obvious, but just in case, let's understand this hypothesis a little bit. Why is it that an increase in thumbstop ratio would lead to additional sales? Maybe it's obvious, but let's make sure we understand it. What is it about this hypothesis that, if it's true, would lead to additional sales?

[32:26] Evan Lee: I can do that. So I'm just going to jump back again to Motion. It's just that, for me, I'm used to it.

Evan switches back to the Motion dashboard, to the "Video Iteration (3sec)" report.

[32:35] Reza Khadjavi: Basically, is it that we send more people into that funnel, right? People are scrolling past the video, but if we can stop more people, based on higher engagement in those first three seconds, we're effectively increasing the volume of people that go into this funnel that we know has been working.

[32:56] Evan Lee: Correct. The primary word is engagement, right? Ultimately, your creative is being more engaging with the intended audience, at the group level. Increasing this does exactly what Reza mentioned: if we get more engagement, and we know the people who do click are ultimately purchasing, then more engagement means more sales.
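
The funnel math behind the hypothesis can be made concrete with back-of-envelope arithmetic. All the rates here are invented for illustration; the point is only that with the downstream rates held fixed, purchases scale linearly with the thumbstop ratio.

```python
def expected_purchases(impressions, thumbstop, click_rate_of_viewers, conversion):
    """Simple multiplicative funnel: same volume in, each stage filters."""
    viewers = impressions * thumbstop             # stopped the scroll, watched 3s
    clicks = viewers * click_rate_of_viewers      # clicked through to the site
    return clicks * conversion                    # completed the purchase

# Same 100k impressions; only the thumbstop ratio changes (2% -> 3%).
baseline = expected_purchases(100_000, 0.02, 0.10, 0.05)
iterated = expected_purchases(100_000, 0.03, 0.10, 0.05)
# A 50% lift in thumbstop yields a 50% lift in purchases, holding the rest steady.
```

This is also why the brief insists on changing only the thumbnail and first three seconds: touching anything downstream would move the other rates and break the "maintaining conversion rates" half of the hypothesis.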

[33:15] Reza Khadjavi: Awesome.

[33:16] Evan Lee: Awesome.

Evan switches back to the Google Slides presentation, to the slide titled "Example: Week 1 Sprint Kickoff".

[33:21] Evan Lee: So yeah, like I was mentioning on the briefing side of things, everybody has their own techniques, and like I said, I made this to make the point more than anything. But I've been able to include the ask, you can include any additional details that are required, and then you're providing that key direction rooted in data, right? I've provided examples of the videos that people would be receiving. These are my pictures of them, of course; these are the videos we need updated. And for the "scented with love" version, it's simple. The first version: we're going to update the first three seconds and the thumbnail. For the thumbnail, I want it to be "sold out" consistently; for the three seconds, I want to add text overlays on top. And the second version is just updating the thumbnail. So it's these simple asks we can make, and this is where it comes down to whatever you want to do, right? If you want to pull from a random value prop in your value-prop doc, go ahead and do that. If you want to think of it on the fly and it fits into this format, perfect; everybody's expecting it now, so we can go ahead and do that. If you want to use a different creative's thumbnail because you see it has a nice, high thumbstop ratio, you can provide that as an example too. That just shows the flexibility, but this format will allow you to make sure everyone's in the loop and therefore execute against it. Does that make sense, Reza? Am I capturing everything?

[34:36] Reza Khadjavi: It does. And I think one other thing, too, is that once teams get into the habit of sharing information with one another, it would be a win for teams to have, you mentioned the term memorialized learnings, which I think is really important, where with a lot of these week-over-week iterations, the more shared knowledge there is as a baseline, as context across the team, the more individual team members will be able to come up with ideas and solutions that really hit the mark. For example, there's no reason why everybody on the creative and marketing teams shouldn't know which videos have the highest thumbstop ratios. In the last few months, everybody on the team roughly knows that what we did here, here, and here did really well from a thumbstop perspective. And so you have that shared understanding, that shared learning on the team, that won't show you an immediate return that day. But if you have that baseline across the team, then when somebody is coming up with an iteration, if they have an understanding of what has and hasn't been working, they incorporate that into the process, and all of a sudden you see people are able to start flying on their own because they have that shared knowledge. But when the team is shooting in the dark, it's: okay, this first three seconds didn't do well, I wonder why. I wonder what about it didn't do well. Were there other ones similar to this that did well or didn't? 
I think it's really important that, as a backdrop to a lot of these week-over-week iterations, the more context that is able to flow through the team, the more you'll see that these iterations have a really, really positive effect, if that shared understanding and learning is incorporated across the team.

[37:35] Evan Lee: Yeah, I'm 100% with you, Reza. Just to springboard off what you mentioned, I have two additional points. The first is carving out the time and head space to say: well, what are we going to do about that thumbstop ratio? That's where the sprint cadence we're talking about helps, because every 21 days you can sit down, look back, and actually give yourself the head space to ask what happened here, without it slipping by the wayside or you only going back to it randomly. The second is, internally, aligning all of your teams on which creative metrics matter most to you and then training people up. You'll notice in these screenshots that I've kept it super simple; I'm not giving anyone more information than they need, so I don't overwhelm them. It's about training somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. They become familiar with the metrics, and then slowly they'll become familiar with next steps. That's just a little context I wanted to add on top.

[38:37] Reza Khadjavi: So you've got your hypothesis, you have your learning, you've come up with an ask, and you have two very specific pieces of guidance for the creative team that are easy to action and should be relatively easy to implement. What happens from here?

[38:55] Evan Lee: This is where my version of "memorialized" comes to life, because as I hear you speak, memorialized is a broad term; it could mean something different to anybody. In my version, it's tied to the sprints and tests you're running, so you can say: I did this thing, what happened? And have that learning live on forever, right?

Evan moves to the slide titled "Example: Week 1 Sprint Recap".

[39:13] Evan Lee: So if this is our hypothesis, we now move into the next step: validating the hypothesis after we've executed. This is the recap stage. You'll see here that I have my Week 1 Recap, and the first thing in it is my validation of the hypothesis. Here I'm looking at my Lavender Hallways creative and the change I made, which produced a second version, a V2, and a V3 option. You would have noticed something earlier in this presentation, and to keep it simple, I'll go back and show everybody.

Evan briefly switches back to the "Steps to approaching a Creative Sprint" slide.

[39:43] Evan Lee: You would have noticed that I mentioned a control group, and now is the proper time to define it: the control group is the version we create the iteration off of. When we talked about the hypothesis of increasing thumbstop ratio while maintaining conversion rate, my control group became this version down here, the original, which ran for X amount of days. The second thing we wanted to do was plot version two and version three alongside the control group. And the final thing we needed to do before actually analyzing the data was make sure we aligned the date ranges with whenever version two and version three went live, because that ensures you're looking at as close to apples-to-apples as possible. Once you get to that stage, you're finally able to say to everybody: did we do a good job or not? In this case, as a reminder, what we wanted to see against the control group was an increased thumbstop ratio while maintaining the click-to-purchase ratio. We can see that V2 achieved that without any issues. Amazing, pat on the back, let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome, so creating a clickbaity thumbnail wasn't the route we should lean into. That's how we're thinking about it: before jumping week over week, 21 days over 21 days, from one sprint to the next, you want to give yourself that time to recap what happened the previous week. And just to give a little context on workflow, because I know we have brands here and agencies in the house as well: this is a great way not only to speak to your creative team and tell them, listen, you did a great job, or this is where we're at.
It's also a way, on the client end or the management end, to keep everybody in the loop and give them, again, knowledge is power, the information they need.

[41:30] Reza Khadjavi: I like what you said about memorialized, and your explanation is very specific. Maybe one way to think about it: at the high level, there's a set of information that's known team-wide, that this sort of thing works and this sort of thing doesn't. That understanding is maybe one layer of memorialized learnings, but nobody can really explain exactly why, or what it was about the change that did it, or whether it was a fluke; maybe it was some other factor. The level you're describing goes a little deeper. It isolates something out, so that when you memorialize that learning, you can be very specific about it. And if you document these and a new team member joins, they have this wealth of information they can go back to and say: we tried this, we learned this. That goes into our bank of information and makes it really easy to come up with new ideas and solutions.

[43:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[44:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[44:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[47:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[49:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[50:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[50:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[53:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[55:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[56:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[56:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[59:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:01:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:02:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:02:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see against the control group was an increased thumbstop ratio while maintaining the click-to-purchase ratio. We can see that V2 achieved that without any issues. Amazing, pat on the back, let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome, so creating a clickbaity thumbnail wasn't the route we should be leaning into. That's the way we're thinking about it: before jumping from one week-over-week or 21-day-over-21-day cycle into the next, you want to give yourself the time to recap what happened in the previous week. And just to give a little context on workflow, because I know we have brands here and agencies in the house as well: this is also a great way not only to speak to your creative team and say, "Listen, you did a great job," or "This is where we're at." On the client end or the management end, it keeps everybody in the loop and gives them the information they need, because, again, knowledge is power.
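The recap logic Evan walks through can be sketched in code: compare each variant to the control over aligned date ranges, and pass the hypothesis only if thumbstop ratio improved while click-to-purchase held. This is a minimal illustration, not Motion's actual implementation; the field names, sample numbers, and the 10% `tolerance` threshold are all assumptions for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class VariantStats:
    """Ad-level metrics, measured over the same aligned date range."""
    name: str
    impressions: int
    video_plays_3s: int  # 3-second video plays (used for thumbstop)
    link_clicks: int
    purchases: int

    @property
    def thumbstop_ratio(self) -> float:
        # Share of impressions that stopped the scroll.
        return self.video_plays_3s / self.impressions

    @property
    def click_to_purchase(self) -> float:
        # Purchases per link click.
        return self.purchases / self.link_clicks

def validate_hypothesis(control: VariantStats,
                        variant: VariantStats,
                        tolerance: float = 0.10) -> bool:
    """Pass if thumbstop improved AND click-to-purchase stayed
    within `tolerance` of the control (an assumed threshold)."""
    thumbstop_up = variant.thumbstop_ratio > control.thumbstop_ratio
    conversion_held = (
        variant.click_to_purchase
        >= control.click_to_purchase * (1 - tolerance)
    )
    return thumbstop_up and conversion_held

# Hypothetical numbers mirroring the recap in the talk.
control = VariantStats("original", 10_000, 2_000, 300, 30)
v2 = VariantStats("v2_new_hook", 10_000, 2_600, 310, 32)
v3 = VariantStats("v3_clickbait_thumb", 10_000, 2_700, 320, 18)

validate_hypothesis(control, v2)  # True: thumbstop up, conversion held
validate_hypothesis(control, v3)  # False: thumbstop up, conversion dropped
```

The key design point matches the transcript: the date ranges must be aligned before the numbers go in, so the control and the iterations are compared apples-to-apples rather than over the control's longer lifetime.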

[1:05:30] Reza Khadjavi: I like what you said about "memorialized," and your explanation is very specific. Maybe one way to think about it is that at a high level, there's a set of information that's known team-wide: hey, this sort of thing works, this sort of thing doesn't. That understanding is maybe one layer of memorialized learnings, but nobody can really explain exactly why, or what it was about the change that did it, or whether it was a fluke, maybe some other factor entirely. The level you're describing goes deeper: it isolates something out, so that when you memorialize the learning, you can be very specific about it. If you document these and a new team member joins, they have this wealth of information they can go back to and say, oh, we tried this, we learned this. That goes into your bank of information and makes it much easier to come up with new ideas and solutions that really hit the mark. For example, there's no reason why everybody on the creative and marketing teams shouldn't know which videos have the highest thumbstop ratios. If everybody roughly knows that what we did here, here, and here over the last few months did really well from a thumbstop perspective, you have that shared understanding and shared learning on the team. It won't show you an immediate return that day, but with that baseline across the team, when somebody comes up with an iteration and understands what has and hasn't been working, they incorporate that into the process, and all of a sudden you see people start flying on their own because they have that shared knowledge.
But when the team is shooting in the dark, going, "Okay, these first three seconds didn't do well, I wonder why. What about it didn't work? Were there other similar ones that did well, or didn't?", that's much harder. As a backdrop to all of these week-over-week iterations, the more context that flows through the team, the more you'll see these iterations have a really positive impact, provided that shared understanding and learning is incorporated across the team.

[1:07:35] Evan Lee: Yeah, I'm 100% with you, Reza. Just to springboard off what you mentioned, I have two additional points. The first is carving out the time and head space to ask, well, what are we going to do about that thumbstop ratio? That's where the sprint cadence we've been talking about lets you sit down every 21 days, look back, and actually give yourself the head space to ask what happened. That part's important; otherwise it just slips by the wayside and you only come back to it randomly. The second part, internally for all of your teams, is aligning on which creative metrics matter to you most and then training people up. You'll notice in these screenshots that I've kept it super simple; I'm not giving anyone more information than they need and overwhelming them. It's about training somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. They become familiar with the metrics first, and then slowly they'll become familiar with next steps. That's just a little context I wanted to add on top of that.

[1:08:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:08:55] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:11:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:13:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:14:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:14:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:17:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:19:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:20:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:20:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:23:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:25:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:26:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:26:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:29:30] Reza Khadjavi: I like what you said about "memorialized," and your explanation is very specific. Maybe one way to think about it is this: at a high level, there's a set of information that's known team-wide, that this sort of thing works and this sort of thing doesn't. That understanding is maybe one layer of memorialized learnings, but nobody can really explain exactly why, or what it was about the change that did it, or whether it was a fluke driven by some other factor. The level you're describing goes a little deeper: it isolates something out, so that when you memorialize the learning, you can be very specific about it. And if you document these and a new team member joins, they have this wealth of information they can go back to and say, oh, we tried this, we learned this. That goes into your bank of information and makes it really easy to come up with new ideas and solutions that hit the mark. For example, there's no reason why everybody on the creative and marketing team shouldn't know which videos have the highest thumbstop ratios. Everybody on the team roughly knows that what we did here, here, and here over the last few months did really well from a thumbstop perspective. That shared understanding, that shared learning on the team, won't show you an immediate return that day. But if you have that baseline across the team, then when somebody is coming up with an iteration and they understand what has and hasn't been working, they incorporate that into the process, and all of a sudden you see people start flying on their own because they have that shared knowledge.

Otherwise the team is shooting in the dark: okay, these first three seconds didn't do well, I wonder why. What about them didn't work? Were there other, similar ones that did do well, or didn't? As a backdrop to all of these week-over-week iterations, the more context that flows through the team, the more positive the impact of those iterations will be, because that shared understanding and learning is incorporated across the team.

[1:31:35] Evan Lee: Yeah, I'm 100% with you, Reza, and just to springboard off what you mentioned, I have two additional points. The first is carving out the time and head space to ask, well, what are we going to do about that thumbstop ratio? That's where the sprint cadence we've been talking about helps: every 21 days, I can look back and actually give myself the head space to ask what happened here, rather than letting it slip by the wayside or coming back to it randomly. The second is, internally across all of your teams, aligning on which creative metrics matter most to you and then training people up. You'll notice in these screenshots that I've kept things super simple. I'm not giving anyone more information than they need and overwhelming them; it's about training somebody on the things they should care about, like Reza mentioned, with thumbstop being the most important if we're talking videos. They become familiar with the metrics first, and then slowly they'll become familiar with next steps. That's just a little context I wanted to add on top.

[1:32:37] Reza Khadjavi: So you've got your hypothesis, you have your learning, you've come up with an ask, and you're offering the creative team two very specific pieces of guidance that are easy to action on and should be relatively easy to implement. What happens from here?

[1:32:54] Evan Lee: This is where my version of "memorialized" comes to life, because as I hear you speak, "memorialized" is a very broad term; it could mean something different to anybody. In my version, it's tied to the sprints and tests you're running: I did this thing, what happened? Have that learning live on forever. So if this is our hypothesis, we now move into the next step of validating it after we've executed. This is the recap stage. You'll see here that I have my Week 1 recap, and the first thing in it is my validation of hypothesis. Here I'm looking at my "lavender hallways" creative and the change I made, which produced a second version, a V2, and a V3 option. You would have noticed earlier in this presentation, and to keep it simple I'll go back and show everybody, that I mentioned a control group. Now is the proper time to define it: the control group is the version we create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio while maintaining conversion rates, my control group became this version down here, the original, which ran for X amount of days. The second thing we wanted to do was plot version two and version three alongside the control group. And the final thing we needed to do before actually analyzing the data was to align the date ranges with whenever version two and version three went live, because that ensures you're looking at something as close to apples-to-apples as possible. Once you get to that stage, you're finally able to say to everybody: did we do a good job or not?
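The recap check described above — aggregate each version over an aligned date range, then ask whether it lifted the metric you tried to move (thumbstop ratio) while holding the metric you wanted to maintain (click-to-purchase ratio) — can be sketched in a few lines. Everything here is illustrative: the variant names, the metric definitions, the 10% hold tolerance, and all the numbers are assumptions for the sketch, not figures from the webinar.

```python
def thumbstop_ratio(m):
    # Share of impressions that stopped to watch (3-second video plays / impressions).
    return m["three_sec_views"] / m["impressions"]

def click_to_purchase(m):
    # Share of link clicks that ended in a purchase.
    return m["purchases"] / m["clicks"]

def validate_hypothesis(control, variant, hold_tolerance=0.10):
    """Did the variant lift thumbstop ratio over the control while keeping
    click-to-purchase within `hold_tolerance` (relative) of the control?"""
    lifted = thumbstop_ratio(variant) > thumbstop_ratio(control)
    held = click_to_purchase(variant) >= click_to_purchase(control) * (1 - hold_tolerance)
    return lifted and held

# Illustrative data, each aggregated over the same (aligned) date range.
control = {"impressions": 100_000, "three_sec_views": 22_000, "clicks": 1_800, "purchases": 90}
v2      = {"impressions":  95_000, "three_sec_views": 27_000, "clicks": 1_700, "purchases": 88}
v3      = {"impressions":  98_000, "three_sec_views": 30_000, "clicks": 2_400, "purchases": 60}

for name, variant in [("V2", v2), ("V3", v3)]:
    verdict = "iterate further" if validate_hypothesis(control, variant) else "drop this route"
    print(name, verdict)  # V2 passes; V3 lifts thumbstop but purchase rate collapses
```

With these made-up numbers the sketch reproduces the story in the recap: V2 lifts thumbstop while holding click-to-purchase, so you keep iterating on it; V3's clickbaity thumbnail grabs attention but the clicks don't convert, so that route gets dropped.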
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:35:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:37:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:38:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:38:55] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:41:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:43:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:44:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:44:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:47:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:49:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:50:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:50:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[1:53:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[1:55:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[1:56:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[1:56:54] Evan Lee: This is where my version of "memorialized" comes to life, because as I hear you speak, "memorialized" is a very broad term; it could mean something different to anybody. In my version, it's tied to the sprints and tests you're running: being able to say, "I did this thing, what happened?" and have that learning live on. So if this is our hypothesis, it now moves into the next step of validating the hypothesis after we've executed. This is the recap stage. You'll see here that I have my Week 1 Recap, and the first thing in it is the validation of the hypothesis. What I'm looking at is my lavender hallways creative and the change I made, which produced a second version, a V2, and a V3 option. You would have noticed earlier in this presentation, and to keep it simple I'll go back and show everybody, that I mentioned a control group. Now is the proper time to give the definition: the control group is the version that we create the iterations off of. So when we talked about the hypothesis of increasing thumbstop ratio while maintaining conversion rate, my control group became this version down here, the original, which had run for X number of days. The second thing we wanted to do was plot version two and version three alongside the control group. And the final thing before actually analyzing the data is aligning the date ranges to when version two and version three went live, because that ensures you're looking at as close to apples-to-apples as possible. Once you get to that stage, you're finally able to say to everybody: did we do a good job or not?
So in this case, as a reminder, what we wanted to see against the control group was an increased thumbstop ratio while maintaining the click-to-purchase ratio. We can see that with V2 we were able to achieve that without any issues. Amazing, pat on the back, let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome, so creating a clickbaity thumbnail wasn't the route we should lean into. That's how we're thinking about it: before jumping into the next week-over-week or 21-day-over-21-day cycle, you want to give yourself that time to recap what happened in the previous week. And just to give a little context on workflow, because I know we have brands here and agencies in the house as well: this is also a great way not only to speak to your creative team and tell them "you did a great job" or "this is where we're at," but also, on the client end or the management end, to keep everybody in the loop and give them the information they need. Knowledge is power.
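The recap check Evan walks through, plotting V2 and V3 against the control over aligned date ranges and asking whether thumbstop ratio went up while click-to-purchase held, can be sketched as a small script. This is a minimal illustration, not Motion's implementation; every name, number, and the 10% tolerance band are hypothetical assumptions:

```python
from dataclasses import dataclass

@dataclass
class VariantStats:
    """Per-variant creative metrics, assumed to be pulled over the same
    date range (from each variant's go-live date) so the comparison is
    as close to apples-to-apples as possible."""
    name: str
    thumbstop_ratio: float     # e.g. 3-second video plays / impressions
    click_to_purchase: float   # e.g. purchases / link clicks

def validate_hypothesis(control: VariantStats,
                        variant: VariantStats,
                        ctp_tolerance: float = 0.10) -> bool:
    """Hypothesis: thumbstop ratio increases while click-to-purchase is
    maintained (here, 'maintained' means within a tolerance band of the
    control -- the band is an illustrative assumption)."""
    thumbstop_up = variant.thumbstop_ratio > control.thumbstop_ratio
    ctp_held = (variant.click_to_purchase
                >= control.click_to_purchase * (1 - ctp_tolerance))
    return thumbstop_up and ctp_held

# Hypothetical numbers for a "lavender hallways"-style recap.
control = VariantStats("V1 (control)", 0.22, 0.045)
v2 = VariantStats("V2 (new hook)", 0.31, 0.047)
v3 = VariantStats("V3 (clickbait thumbnail)", 0.35, 0.028)

for v in (v2, v3):
    verdict = "validated" if validate_hypothesis(control, v) else "not validated"
    print(f"{v.name}: hypothesis {verdict}")
```

With these made-up numbers, V2 passes (thumbstop up, click-to-purchase held) and V3 fails (thumbstop up, but click-to-purchase collapsed), mirroring the outcome Evan describes in the recap.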

[1:59:30] Reza Khadjavi: I like what you said about "memorialized," and your explanation is very specific. Maybe one way to think about it: at a high level there's a set of information that's known team-wide, "this sort of thing works, this sort of thing doesn't," and that's one layer of memorialized learnings. But nobody can really explain exactly why, or what it was about the change that did that, or whether it was a fluke, maybe some other factor. The level you're describing goes a little deeper: it isolates something out, so when you memorialize that learning you can be very specific about it. And if you document these and a new team member joins, they have this wealth of information they can go back to and say, "oh, we tried this, we learned this." That goes into your bank of information and makes it really easy to come up with new ideas and solutions that really hit the mark. So for example, there's no reason everybody on the creative and marketing teams shouldn't know which videos have the highest thumbstop ratios. Everybody on the team roughly knows what did really well from a thumbstop perspective over the last few months. That shared understanding, that shared learning, won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with an iteration, they incorporate what has and hasn't been working into the process, and all of a sudden you see people start flying on their own because they have that shared knowledge.
But when the team is shooting in the dark, it's "okay, these first three seconds didn't do well, I wonder why, I wonder what it was about them that didn't do well, were there other similar ones that did do well or didn't?" I think it's really important, as a backdrop to all of these week-over-week iterations, that the more context flows through the team, the more positive the impact of those iterations will be, provided that shared understanding and learning is incorporated across the team.

[2:01:35] Evan Lee: Yeah, I'm 100% with you, Reza. Just to springboard off what you mentioned, I have two additional points. The first is carving out the time and head space to ask, "well, what are we going to do about that thumbstop ratio?" That's where the sprint cadence we're talking about comes in: every 21 days you can look back and actually give yourself the head space to ask what happened, rather than letting it slip by the wayside or going back to it randomly. The second is, internally across all of your teams, aligning on which creative metrics matter most to you and then training people up. You'll notice in these screenshots that I've kept it super simple; I'm not giving anyone more information than they need, so as not to overwhelm them. It's about training somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. They become familiar with the metrics first, and then slowly they'll become familiar with next steps. That's just a little bit of context I wanted to add on top of that.

[2:02:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:02:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:05:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:07:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:08:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:08:55] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:11:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:13:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:14:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:14:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:17:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:19:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:20:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:20:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:23:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
Whereas when the team is shooting in the dark — "okay, the first three seconds didn't do well, I wonder why; were there similar ones that did or didn't?" — you lose that. As a backdrop to all of these week-over-week iterations, the more context that flows through the team, the more positive the impact of those iterations will be, so long as that shared understanding and learning is incorporated across the team.

[2:25:35] Evan Lee: Yeah, I'm 100% with you, Reza. To springboard off of that, I have two additional points. The first is carving out the time and head space to ask, "well, what are we going to do about that thumbstop ratio?" That's where the sprint cadence we've been talking about helps: every 21 days, I can look back and actually give myself the head space to ask what happened here, rather than letting it slip by the wayside and coming back to it randomly. The second is, internally across all of your teams, aligning on which creative metrics matter most to you and then training people up. You'll notice in these screenshots that I've kept things super simple — I'm not giving anyone more information than they need and overwhelming them. It's about training somebody on the things they should care about — like Reza mentioned, thumbstop being the most important if we're talking videos. They become familiar with the metrics first, and then slowly with the next steps. That's just a little context I wanted to add on top.

[2:26:37] Reza Khadjavi: So you've got your hypothesis, you have your learning, you've come up with an ask, and you're offering the creative team two very specific pieces of guidance that are easy to action on and should be relatively easy to implement. What happens from here?

[2:26:54] Evan Lee: This is where my version of "memorialized" comes to life. As I hear you speak, "memorialized" is a broad term — it could mean something different to anybody — but in my version it's tied to the sprints and tests you're running: I did this thing, what happened? Have that learning live on forever. So if this is our hypothesis, we now move into the next step of validating it after we've executed. This is the recap stage. You'll see here that I have my Week 1 recap, and the first thing in it is the validation of the hypothesis. I'm looking at my Lavender Hallways creative and the change I made, which produced a second version, a V2, plus a V3 option. You'll have noticed earlier in this presentation — and to keep it simple, I'll go back and show everybody — that I mentioned a control group. Now is the proper time to define it: the control group is the version we created the iteration off of. So for the hypothesis of increasing thumbstop ratio while maintaining conversion rates, my control group became this version down here, the original, which ran for X number of days. The second thing we wanted to do was plot version two and version three alongside the control group. And the final thing before actually analyzing the data was aligning the date ranges to when version two and version three went live, because that ensures you're looking at as close to apples-to-apples as possible. Once you get to that stage, you're finally able to say to everybody: did we do a good job or not?
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.
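The three recap steps Evan walks through — pick the control, plot V2 and V3 alongside it, and align the date ranges so the comparison is apples-to-apples — could be sketched like this. The field names, dates, and numbers are hypothetical, not taken from the demo:

```python
from datetime import date

# Hypothetical daily stats per ad version: {day: (thumbstop, click_to_purchase)}.
# The control (original) ran longer; V2/V3 launched a week later.
stats = {
    "control": {date(2022, 2, d): (0.22, 0.045) for d in range(1, 22)},
    "v2":      {date(2022, 2, d): (0.31, 0.046) for d in range(8, 22)},
    "v3":      {date(2022, 2, d): (0.19, 0.030) for d in range(8, 22)},
}

def aligned_averages(stats: dict) -> dict:
    """Average each version's ratios over only the days all versions were live,
    so the control isn't judged on a period the variants never saw."""
    shared_days = set.intersection(*(set(days) for days in stats.values()))
    out = {}
    for version, days in stats.items():
        rows = [days[d] for d in sorted(shared_days)]
        out[version] = (
            sum(r[0] for r in rows) / len(rows),  # avg thumbstop ratio
            sum(r[1] for r in rows) / len(rows),  # avg click-to-purchase ratio
        )
    return out

for version, (ts, ctp) in aligned_averages(stats).items():
    print(f"{version}: thumbstop={ts:.2%}, click-to-purchase={ctp:.2%}")
```

Intersecting the live dates is the "apples-to-apples" step: without it, the control's extra early-February days would be mixed into its averages while the variants only report on later days, and the comparison would be confounded by whatever else changed over that window.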

[2:29:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:31:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:32:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:32:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:35:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:37:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:38:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:38:55] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:41:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:43:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:44:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:44:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:47:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[2:49:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[2:50:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:50:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
In this case, as a reminder, what we wanted to see against the control group was an increased thumbstop ratio while maintaining the click-to-purchase ratio. We can see in V2 here that we achieved that without any issues. Amazing, pat on the back, let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome, so a clickbaity thumbnail wasn't the route we should lean into. That's how we're thinking about it: before jumping from one week-over-week or 21-day-over-21-day cycle into the next, you want to give yourself time to recap what happened in the previous one. And just to give a little context on workflow, because I know we have brands here and agencies in the house as well: this is a great way not only to go back to your creative team and say, listen, you did a great job, or here's where we're at; it also keeps the client or management side in the loop and gives them the information they need, because knowledge is power.
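The recap check described here (pull each version's stats over aligned date ranges, then ask whether a variant lifted thumbstop ratio without hurting click-to-purchase) can be sketched in a few lines. The metric definitions below are common industry ones rather than quotes from the talk, and the 10% tolerance and all the numbers are illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import date

# Assumed metric definitions (not stated verbatim in the webinar):
#   thumbstop ratio        = 3-second video plays / impressions
#   click-to-purchase rate = purchases / outbound clicks

@dataclass
class AdStats:
    name: str
    start: date          # launch date; pull stats from here for apples-to-apples
    impressions: int
    three_sec_plays: int
    clicks: int
    purchases: int

    @property
    def thumbstop(self) -> float:
        return self.three_sec_plays / self.impressions

    @property
    def click_to_purchase(self) -> float:
        return self.purchases / self.clicks

def validate_hypothesis(control: AdStats, variant: AdStats) -> bool:
    """Hypothesis: the variant lifts thumbstop ratio while holding
    click-to-purchase roughly steady (within 10% of control)."""
    lifted = variant.thumbstop > control.thumbstop
    held = variant.click_to_purchase >= 0.9 * control.click_to_purchase
    return lifted and held

# Hypothetical numbers for a "lavender hallways" style recap:
control = AdStats("V1 (control)", date(2022, 1, 3), 100_000, 18_000, 2_000, 60)
v2 = AdStats("V2 (new hook)", date(2022, 1, 10), 100_000, 26_000, 2_100, 64)
v3 = AdStats("V3 (clickbait thumbnail)", date(2022, 1, 10), 100_000, 27_000, 2_500, 40)

for v in (v2, v3):
    status = "validated" if validate_hypothesis(control, v) else "not validated"
    print(f"{v.name}: {status}")
```

With these made-up numbers, V2 passes (thumbstop up, click-to-purchase held) while V3 fails: its thumbnail drives more stops and clicks, but the click-to-purchase rate collapses, which mirrors the V3 outcome discussed above.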

[2:53:30] Reza Khadjavi: I like what you said about memorialized, and your explanation is very specific. Maybe one way to think about it is this: at a high level, there's a set of information that's known team-wide, that this sort of thing works and this sort of thing doesn't. That shared understanding is one layer of memorialized learnings, but nobody can really explain exactly why, or what it was about the change that did it, or whether it was a fluke driven by some other factor. The level you're describing goes a bit deeper: it isolates something out, so that when you memorialize the learning, you can be very specific about it. If you document these and a new team member joins, they have this wealth of information they can go back to: we tried this, we learned this, it goes into our bank of information, and that makes it much easier to come up with new ideas and solutions that really hit the mark. For example, everybody on the creative and marketing teams should know which videos have the highest thumbstop ratios. Over the last few months, everybody roughly knows that what we did here, here, and here did really well from a thumbstop perspective. That shared learning won't show you an immediate return that day, but with that baseline across the team, when somebody comes up with an iteration, they understand what has and hasn't been working, they incorporate that into the process, and all of a sudden you see people start flying on their own because they have that shared knowledge.
But when the team is shooting in the dark, wondering why the first three seconds didn't do well, what it was about them, and whether similar creatives did or didn't perform, you lose that. As a backdrop to all of these week-over-week iterations, the more context that flows through the team, the more positive an effect those iterations will have, as long as that shared understanding and learning is incorporated across the team.

[2:55:35] Evan Lee: Yeah, I'm 100% with you, Reza, and to springboard off what you mentioned, I have two additional points. The first is carving out the time and head space to ask: well, what are we going to do about that thumbstop ratio? That's where the sprint cadence we're talking about helps. Every 21 days, I can look back and actually give myself the head space to ask what happened here, rather than letting it slip by the wayside or coming back to it randomly. The second is, internally for all of your teams, aligning on which creative metrics matter most to you and then training people up. You'll notice in these screenshots that I've kept things super simple; I'm not giving anyone more information than they need and overwhelming them. It's about training somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. They become familiar with the metrics first, and then slowly they become familiar with the next steps. That's just a little context I wanted to add on top.

[2:56:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[2:56:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[2:59:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[3:01:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[3:02:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[3:02:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[3:05:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[3:07:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[3:08:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[3:08:55] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[3:11:30] Reza Khadjavi: So maybe I I I like what you said about memorialized and and your your explanation is is very, very specific. Maybe one way to think about it is like, you know, you have at the high level, there's a set of information that's like known team wide that hey, this sort of thing works, this sort of thing doesn't work. And you have that understanding, that's maybe one layer of memorialized learnings. Um, but it nobody can really explain exactly why or what is it about the change that did that or was it like a fluke, for example? Like maybe maybe it was some other factor. But the level that you're describing is like it's going a little bit deeper. It's it's isolating something out so that when you when you memorialize that learning, you can be very specific with with the learning. And so if you if you document these and you have a new new team member joining, they have this like wealth of information that they can go back and say, oh, we tried this, we learned this, this now goes into our like bank of information that makes it really easy to to come up with new ideas and solutions that really hit the mark. So for example, everybody on the creative and the marketing team, there's no reason why they shouldn't know, for example, which which videos have the highest thumbstop ratios. Like in the last few months, everybody on the team roughly knows that like these, what we did here, here and here from a thumbstop perspective did really well. And so you have that shared understanding, that shared learning on the team that won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with with an iteration, if they if they have an understanding of what has been working well, what has not been working well, those kind of things like you incorporate that into the process and all of a sudden you see people are able to to start flying on their own because they have that shared knowledge. 
But when the team is shooting in the dark and being like, okay, this first three seconds didn't do well, I wonder why. I wonder what about it was that didn't didn't do well. Were there other ones that are similar to this that did do well or didn't? Um, I think it's really important that as like a backdrop to a lot of these week over week iterations, the more context that is able to flow through the team, the more you'll see that the the impact of these iterations will will have a really, really positive effect if that shared understanding and learning is, uh, is incorporated across the team.

[3:13:35] Evan Lee: Yeah, I'm 100% with you, Reza. And I think like just to springboard off of the two you mentioned, I just have like two additional points there. So the first part is just like carving out the time and head space to be able to say, well, what are we going to do about that thumbstop ratio, right? So I think that's where that sprint cadence we're talking about now allows you to sit down and say, okay, every 21 days, I can look back and actually give myself the head space to say what happened here, right? Like I think that part's important without it just kind of slipping by the wayside and you going back to it randomly, all of that. And then the second part is just internally for all of your teams, it's just aligning on like what creative metrics matter to you the most and then training people up. You'll notice in these screenshots that I've added, I've just kept it super simple. Like I'm not giving anyone like more information than they need to overwhelm them. It's just like how do we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. But they just become familiar with the metrics and then slowly they'll become familiar with next steps. So that's just a little bit of a context that I just wanted to add on top of there too.

[3:14:37] Reza Khadjavi: So, so you got your hypothesis, you have your learning, you've come up with an ask, you have two very specific, um, guidance that you're offering to the creative team and very easy to action on, should be relatively easy to implement. And then, uh, what happens from here?

[3:14:54] Evan Lee: This is where my version of memorialized comes to life because I think memorialized like as I hear you speak is a very, uh, like broad term. It could mean something for anybody. But in my version, it's like related to the sprints and tests that you're running to be able to say, I did this thing, what happened? Have this learning live on forever, right? So if this is our hypothesis, this now moves into the next step of like validating our hypothesis after we've executed. So this is the recap stage. So you'll see here that I have my week one recap and the first thing that I have is just my validation of hypothesis. So here, what the assumption is is I'm looking at, um, my lavender hallways creative and the change I made, which produced a second version, so a V2, and produced a V3 option. So you would have noticed earlier in this presentation and just to make it simple, I'll go back to show everybody. You would have noticed that I had mentioned a control group, right? So the control group is now the proper time to give you the definition, is the version that we would be able to create the iteration off of. So when we talked about the hypothesis of increasing thumbstop ratio but maintaining conversion rates, my control group became this version down here, which was the original, meaning that it ran for X amount of days, right? Then the second thing that we wanted to do is be able to say, okay, now version two and version three, let's plot those in addition to what my control group was. And then the final thing that we needed to do before actually analyzing the data is making sure that we align the date ranges with whenever version two and version three went live. Because that part will ensure you're looking as close to apples to apples as possible. And once you get to that stage, you're finally able to say to everybody, did we do a good job or not? 
So in this case, as a reminder, what we wanted to see from the control group was an increased thumbstop ratio, but maintaining the click to purchase ratio. So we can see in V2 here, we were able to achieve that without any issues. Amazing. Pat on the back. Let's continue down that iteration path. Whereas with V3, the thumbnail change didn't lead to the same outcome. So click creating like a clickbaity thumbnail wasn't the route that we should be leading into. So this is the way that we're thinking about before jumping into week over week, 21 day over 21 day, going one to the other, you want to give yourself that time to be able to recap what has happened from that previous week. And just to just give a little bit of context on workflow because I know we have brands here, I know we have agencies in the house as well. Like this is also a great way to not only speak to your creative team and just get them to say like, listen, you did a great job or this is where we're at. It's also on the client end or the management end just to keep everybody in the loop and giving them, again, knowledge is power, the information they need.

[3:17:30] Reza Khadjavi: I like what you said about memorialized, and your explanation is very specific. Maybe one way to think about it is this: at a high level, there's a set of information that's known team-wide, that this sort of thing works and this sort of thing doesn't. That understanding is maybe one layer of memorialized learnings, but nobody can really explain exactly why, or what it was about the change that did that, or whether it was a fluke. Maybe it was some other factor. The level you're describing goes a little bit deeper. It's isolating something out so that when you memorialize that learning, you can be very specific with it. And if you document these and a new team member joins, they have this wealth of information they can go back to and say, oh, we tried this, we learned this. That goes into our bank of information and makes it really easy to come up with new ideas and solutions that hit the mark. For example, everybody on the creative and marketing team, there's no reason they shouldn't know which videos have the highest thumbstop ratios. In the last few months, everybody on the team roughly knows that what we did here, here, and here did really well from a thumbstop perspective. And so you have that shared understanding, that shared learning on the team. It won't show you an immediate return that day, but if you have that kind of baseline across the team, then when somebody is coming up with an iteration, if they understand what has been working well and what hasn't, they incorporate that into the process, and all of a sudden you see people start flying on their own because they have that shared knowledge.
But when the team is shooting in the dark, going, okay, these first three seconds didn't do well, I wonder why. What was it about them that didn't work? Were there other ones similar to this that did do well, or didn't? I think it's really important, as a backdrop to a lot of these week-over-week iterations: the more context is able to flow through the team, the more you'll see these iterations have a really positive impact, if that shared understanding and learning is incorporated across the team.
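The "bank of information" Reza describes can be as simple as a structured learnings log. A minimal sketch, assuming a record schema (creative / hypothesis / change / result / takeaway) that is my own illustration, not a Motion feature:

```python
# A minimal sketch of a shared "bank of learnings". The schema below
# (hypothesis / change / result / takeaway) is an illustrative assumption,
# not something prescribed in the webinar.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Learning:
    creative: str
    hypothesis: str
    change: str
    result: str
    takeaway: str
    logged: date = field(default_factory=date.today)

learnings = [
    Learning(
        creative="Lavender Hallways V2",
        hypothesis="Bigger opening text lifts thumbstop without hurting conversion",
        change="Enlarged first-3-second text overlay",
        result="Thumbstop up, click-to-purchase flat",
        takeaway="Keep iterating on opening text size",
    ),
]

# A new team member can query the bank instead of shooting in the dark:
wins = [l for l in learnings if "Keep iterating" in l.takeaway]
print(f"{len(wins)} validated route(s) to build on")
```

The point is less the tooling than the habit: every sprint recap appends one record, so the "why" behind each result survives team turnover.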

[3:19:35] Evan Lee: Yeah, I'm 100% with you, Reza. And just to springboard off of what you mentioned, I have two additional points there. The first is carving out the time and headspace to be able to say, well, what are we going to do about that thumbstop ratio? That's where the sprint cadence we're talking about allows you to sit down and say, okay, every 21 days I can look back and actually give myself the headspace to ask what happened here, without it just slipping by the wayside or you only going back to it randomly. And the second, internally for all of your teams, is aligning on which creative metrics matter most to you and then training people up. You'll notice in these screenshots I've added, I've kept it super simple. I'm not giving anyone more information than they need and overwhelming them. It's about how we train somebody on the things they should care about, like Reza mentioned, thumbstop being the most important if we're talking videos. They become familiar with the metrics first, and then slowly they'll become familiar with next steps. So that's just a little bit of context I wanted to add on top of there.

[45:35] Speaker 2: Yeah, I dumped it in the chat too. One other thing, if we don't have questions, just to maximize learning for everybody here: if there's a process win you may have implemented on your team, stuff like this, small wins that had a really good impact on your team and on your process, we'd love for you to come up and tell us about it. I'm sure other people here would benefit from that too. So if there's anything you and your team have implemented recently in this problem area that has done well for you, we'd love for you to come up and share it.

[46:23] Speaker 2: These things are no fun if they're not interactive. It's got to be at least one person. Come on. Come on, team on the client side. Okay, so Jayesh is our first question here. On the client side: tips on pushing our agency partners to engage in this iterative practice for ongoing optimization. So if you're a brand, or you're the client, and you want to push your agency partners to engage in this, what are some tips or things you've seen work well there, Evan? Great question, by the way.

[46:57] Evan: Awesome. Jay, it's good to see you, man. Ultimately, as cheesy as it sounds, it's the knowledge-is-power thing. With these hypotheses (I can send the list out afterward), just understanding which data points to look at makes it a lot easier to have those conversations. If we know we're looking at thumbstop ratios and we see some of them down, on the client side you need to push your agency: why are these down, and why aren't we doing anything about it? The data piece ensures you're actually able to hold them accountable to that. So that's the first piece, the data. The other side is making sure there are accountability measures in place, because ultimately you want your agency to test on an ongoing basis and keep you informed. If you notice slippage, you're the client at the end of the day: how do you make sure you're getting exactly what you need? Even talking about what you should do together every couple of weeks could be a really good initial conversation.

[47:57] Reza: Evan, I've got a couple more questions. Carmen, Mike, sorry I missed them; I was mostly watching the chat, not the Q&A box. I'll just read them to you, Evan. What other variables do you use to make creative decisions, and why?

[48:16] Evan: Great question. Carmen, it's good to see you too; we talked the other day. The big thing for me is that I'll always tie it to the bottom line, whether that bottom line is coming from Facebook, from GA, or from a third-party attribution software. That's why you'll see me compare things like thumbstop ratio to the bottom line, or CTR to the bottom line. Beyond that, it's metrics like CTRs, CPMs, and, if you're looking at videos, percentage of watch time. I like ThruPlays a lot as well; looking at your cost per ThruPlay gives you an idea of retention. Off the top of my head, those are the ones that come to mind.
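[Editor's note: the metrics Evan lists can all be computed from raw ad stats. A minimal sketch, assuming illustrative field names (not Motion's or Meta's actual API fields) and the common definition of thumbstop ratio as 3-second video views over impressions:]

```python
# Sketch of the creative metrics Evan mentions, computed from one ad's
# raw stats. All field names and numbers here are hypothetical.

def creative_metrics(ad: dict) -> dict:
    impressions = ad["impressions"]
    return {
        # Thumbstop ratio: share of impressions that stopped the scroll.
        "thumbstop_ratio": ad["video_3s_views"] / impressions,
        # Click-through rate: link clicks / impressions.
        "ctr": ad["link_clicks"] / impressions,
        # CPM: cost per 1,000 impressions.
        "cpm": ad["spend"] / impressions * 1000,
        # Cost per ThruPlay, a rough retention signal.
        "cost_per_thruplay": ad["spend"] / ad["thruplays"],
        # Tie everything back to the bottom line: cost per purchase.
        "cost_per_purchase": ad["spend"] / ad["purchases"],
    }

ad = {"impressions": 40_000, "video_3s_views": 9_000, "link_clicks": 480,
      "thruplays": 3_200, "purchases": 24, "spend": 600.0}
m = creative_metrics(ad)
print(f"thumbstop {m['thumbstop_ratio']:.1%}, CTR {m['ctr']:.2%}, "
      f"CPM ${m['cpm']:.2f}, CPA ${m['cost_per_purchase']:.2f}")
```

The point of keeping these in one function is Evan's pairing: every attention metric sits next to the bottom-line number it gets compared against.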

[49:02] Reza: Next question for you.

[49:04] Evan: Okay.

[49:05] Reza: No, I think that covers it; that's a good summary. I do think it's very much related to being hypothesis-first. I really like thinking about it from the standpoint of: what's a hypothesis, a question, something we have in mind that we think might be true? Then the question becomes: what actual data variables can help us find the answer? Sometimes we're lucky and the variables are very easy to find. The thumbstop piece is a really simple one. Even without looking at the thumbstop-ratio KPI, you might have the hypothesis that people are scrolling past my video, and if we can capture attention better, that would be good, because we know we'd then have a higher chance to convert them later on. It might be related to the different influencers you work with, or, zooming out a little, to concepts; concepts are really interesting, things like unboxing or UGC. Anything worthy of being a genuine hypothesis is a really good starting point. Then the question becomes: what variables do I need to track down in order to validate and investigate that hypothesis? Evan mentioned the analysis-paralysis problem early on, which I think is real. When you just swim in data, you can end up asking: wait, what am I looking for? I'm staring at all these numbers and I've lost track of what I'm trying to find.
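[Editor's note: as a rough sketch of that hypothesis-first loop, here is one way a hypothesis like "UGC concepts capture attention better" could be checked against the data. The ad names, concept tags, and numbers are hypothetical; in practice the concept would come from your naming convention:]

```python
from collections import defaultdict

# Hypothetical ad rows; the "concept" tag would be parsed from your
# own naming convention, not a platform field.
ads = [
    {"name": "ugc_unboxing_v1",  "concept": "ugc",    "video_3s_views": 5_200, "impressions": 20_000},
    {"name": "ugc_testimonial",  "concept": "ugc",    "video_3s_views": 4_400, "impressions": 18_000},
    {"name": "studio_product_a", "concept": "studio", "video_3s_views": 2_800, "impressions": 19_000},
]

# Pool 3-second views and impressions per concept, then compare the
# pooled thumbstop ratios: the variable chosen to test the hypothesis.
totals = defaultdict(lambda: [0, 0])  # concept -> [3s views, impressions]
for ad in ads:
    totals[ad["concept"]][0] += ad["video_3s_views"]
    totals[ad["concept"]][1] += ad["impressions"]

for concept, (views, imps) in totals.items():
    print(f"{concept}: thumbstop ratio {views / imps:.1%}")
```

Starting from the hypothesis keeps the query this narrow: one grouping, one ratio, instead of swimming in every column of the export.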
So I really like that stepping out: think about the problem first, then go to the data and see whether you can pin down the insights that help you prove or disprove the hypothesis. Another question here, from Mark: where do these creative sprints live? In Notion, in other apps? Where would people run these sprints?

[51:34] Reza: Asana, sorry.

[51:35] Evan: Yeah, I think ultimately it's going to live within the project management tools that already exist in your infrastructure, your client's infrastructure, or that mutually agreed system. That's where you'll track and complete the project management side, uploading your brief and the asks, so it becomes a solid element there. For the actual data-output side, the easiest option, of course, is Motion, but you can just do it in Sheets. Having a sheet with all of your important metrics living there can be the best place for it.

[52:10] Reza: And then the last question, from Britt Ellsworth. Great question, hot topic: where do you consolidate all the data from platforms and third-party tracking? What do we do there, Evan?

[52:23] Evan: I love these ones because, in all honesty, I don't think anybody has the right answer to that one right now, which makes it tricky. The biggest thing is trust: determining what your bottom-line decision-making source will be. Am I going to listen to GA, with whatever attribution model I use there? A different third-party software? What it says on Shopify? Once you know what you're trusting, third-party attribution software can do the grouping for you, so see what it offers on that front. But once you've chosen, it's a lot of the pivot-tabling I was mentioning: you download the data from that source (if it's GA, straight from there), and then, based on your UTMs or naming convention, you bring things together and consolidate, "if this, then that." That way you're looking at everything against that source of truth. I hope that helps; I know this is a hard one to answer.
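[Editor's note: a minimal sketch of the "if this, then that" consolidation Evan describes: join ad-platform spend with conversions from your chosen source of truth on a shared UTM key. Column names, UTM values, and numbers are all illustrative:]

```python
# Hypothetical exports. In practice these would be downloads from the
# ad platform and from your trusted source (e.g. a GA report).
platform_rows = [
    {"utm_content": "ugc_unboxing_v1",  "spend": 300.0},
    {"utm_content": "studio_product_a", "spend": 250.0},
]
truth_rows = [
    {"utm_content": "ugc_unboxing_v1",  "purchases": 14},
    {"utm_content": "studio_product_a", "purchases": 6},
]

# Index the trusted conversions by the shared UTM/naming key.
purchases_by_utm = {r["utm_content"]: r["purchases"] for r in truth_rows}

report = []
for row in platform_rows:
    purchases = purchases_by_utm.get(row["utm_content"], 0)
    report.append({
        "utm_content": row["utm_content"],
        "spend": row["spend"],
        "purchases": purchases,
        # Bottom line against the source of truth: cost per purchase.
        "cpa": row["spend"] / purchases if purchases else None,
    })

for r in report:
    print(r["utm_content"], r["cpa"])
```

This is the same join a spreadsheet pivot table or VLOOKUP on the UTM column performs; the naming convention is what makes the rows line up.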

[53:21] Reza: I want to toss in an idea I read on Twitter recently, which I found an interesting way to think about this. There's the question of what is truth, and then the question of what Facebook thinks is truth. That's quite an important distinction, because whatever Facebook thinks is true is what gets fed into its algorithm to optimize on. It adds another layer of complexity: if we make decisions based on data from outside Facebook, there's a risk, because Facebook won't be making decisions based on that data; it optimizes on the data available to its algorithm. So when thinking about optimization and performance, it's valuable to put different pieces of data together to arrive at an answer about truth. But suppose Facebook said something didn't work well while GA and everywhere else said it did: that doesn't necessarily mean you can double down on it and see the same results, because as far as Facebook is concerned, it didn't do well. It adds a bit more complexity, but it's also hard to completely abandon in-platform data for that reason, because that's what Facebook is going to use for optimization.

[55:05] Evan: I'll also say, and this could be a hot topic, that the source of truth matters for determining what matters to you, because it also depends on what the issue is in terms of how Facebook is optimizing: how your campaign structure is set up and all of those elements. I know I'm going deep here, so for anyone purely on the creative side this might not land, but say you have five ads within a single ad set, plus three other ad sets with five ads each in the campaign, and a lot of your creatives aren't getting spend. Ultimately you might just want to game the system: pull out the ad you want to test and put it in its own campaign, accepting the audience overlap, at least at the top of funnel. Elements like that also come into play, which is why this is a tricky answer.

[55:53] Reza: Yeah, attribution is a black hole. It's a tough one, but everyone's working through it. Any other questions we can answer in the last few minutes?

[56:05] Evan: I see one from Carol in the chat: have you ever run into issues where the client has an in-house creative team that isn't pushing enough performance creative out because it's not a priority for them? And how do you explain the value of sprints and iterative testing to get them on board? I'll answer to the best of my ability based on my account-management experience at an agency. I also know Jayesh is here on the brand side, so his insight would be very impactful too, as would anyone else's on the brand side willing to jump in. But to this specific point: start at the high level and work your way in. First, outline the situation post iOS 14, with creative and landing pages becoming the most important elements driving performance. Then, where you can get wins with your client is making them understand you're not asking for the world. You want to do this iterative thing, and to start off, all it means on their creative team's plate is two iterations on one video they've already made, changing only three seconds. Your goal there is to get some wins: being able to say, see, these changes got you more dollars, and they generated more engagement that then benefited our retargeting. It's about making sure the task, and the lift associated with it, is actually easy for the brand rather than a headache that requires them to chase buy-in.
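[Editor's note: the "show the win" step Evan suggests is often quoted as a relative lift. A tiny sketch with hypothetical numbers for the original hook versus the three-second iteration:]

```python
def relative_lift(control: float, variant: float) -> float:
    """Relative improvement of the variant over the control."""
    return (variant - control) / control

# Hypothetical result: the three-second hook change lifts thumbstop ratio.
control_thumbstop, iterated_thumbstop = 0.20, 0.25
print(f"thumbstop lift: {relative_lift(control_thumbstop, iterated_thumbstop):+.0%}")
```

A single headline number like this is usually an easier sell to a reluctant in-house team than a full metrics dashboard.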

[57:29] Reza: I'll add one thing to that. Making the ask almost undeniably simple is a really great, lightweight way to get people on board. Another way to build that buy-in is to start incorporating data into the conversation. Even if you have a secret agenda to push people toward a more iterative creative process, the first step might just be reporting: hey, did you know this? Did you know that happened? Once you start bringing those insights into conversations, people's curiosity kicks in: well, if this is true, maybe we should try something. You're not even suggesting that you iterate; you're just showing the data, here's some of what we found, and that invites people to contribute ideas: okay, if this is what we're seeing, maybe we should try that. So incorporating data and insights into your conversations is probably a good way to prepare the groundwork for people to want to do more of this iterative creative testing. And then, yes, making it easy and lightweight to do seals it. With that, we're up on time. Really appreciate everybody joining. We'll have some sort of survey question after this, and we'd really love to hear your thoughts; we want to do a lot more of these, shaped by what you'd like to see more of and less of.
I think we got through all the slides, or did we not get to some of them, Evan?

[59:22] Evan: I have more that are rooted in different hypotheses you can have.

[59:27] Reza: That's good. Maybe we can round them up in an email.

[59:31] Evan: Yeah, I think that's the biggest thing. And if any other questions come up, feel free to reach out; I've got your back. We can go into exactly what kinds of data relationships you care about and all of that.

[59:44] Reza: All right, thanks everybody. Thank you so much for attending, and be on the lookout for the next one of these sometime soon. [VISUAL: The screen share stops. The two speakers are shown in a split-screen view. Evan is on the left, Reza is on the right.] Take care.

[59:53] Evan: Amazing, appreciate it. Thank you, everybody. Have a great day. Bye.

[59:57] Reza: Bye.