From A to B

Author: Shiva Manjunath


Description

Conversations with interesting people across the digital marketing spectrum about all things experimentation.
63 Episodes
Finding clients is hard enough - finding GOOD ones? Feels damn near impossible. I enlisted the help of Tas Bober to help navigate this complex field of getting client work, and explored topics related to CRO/client work as well!

We got into:
- Bringing confidence to the client relationship (you don't always have to cave in to EVERY little freaking thing they want you to do)
- Running from bad clients when they don't pass the vibe check (and how to spot shitty clients BEFORE they pay you)
- How to talk to clients (and stakeholders) appropriately without talking DOWN at them (hint: treat them like they're children. It works.)

Timestamps:
00:00 Episode Start
4:12 Detecting Client Red Flags BEFORE Signing A Contract With Them
8:46 What To Do When A Client Gives You The 'Ick'
16:07 Importance of Setting (And Enforcing) Boundaries
20:30 Setting Success Metrics for Engagement Upfront
29:46 When "Leading With Data" Doesn't Make Sense...?
46:20 Tas' BIGGEST Pro Tip (Don't Just Skip To This Timestamp Ya Leeches)

Go follow Tas Bober on LinkedIn: https://www.linkedin.com/in/tasbober/
Go check out her podcast "Notorious B2B": https://www.linkedin.com/company/notorious-b2b/posts/
Also go follow Shiva Manjunath on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
People are finding websites more and more manipulative. And we don't mean this word lightly. Why is this the case? In large part, it's because of the shitty tactics people are using on sites. But really, it's because websites care far more about short-term sales and metrics than about LTV. And that leads to hacks, not long-term gains. Made With Intent released a REALLY great report which went into detail on this, and I had the man, myth, and legend David "Zedd" Mannheim on the pod to chop it up and discuss that report (among other things).

We got into:
- Why People Feel Like Websites are Manipulative
- How Popular 'Best Practice' Tactics Are Probably Hurting You Way More Than Helping
- Tips and Tricks on Maximizing Your Popups (WITH DATA)

Timestamps:
00:00 Episode Start
5:05 People Feel Websites Are Too Manipulative
11:12 Websites Seem to Prefer Sales Over CX
14:40 What Are Pop Ups Getting Wrong?
19:02 Marketers Are More Focused On Numbers Than Humans
26:56 Does Social Proof Still Work In 2025?
31:18 Scarcity Messaging
35:55 Discounting Isn't A Strategy
42:02 Segmentation Is One Of The Most Important Things To Focus On

Go follow David Mannheim on LinkedIn: https://www.linkedin.com/in/davidleemannheim/
Download the full report here (it's a REALLY good read): https://www.madewithintent.ai/the-intent-gap-report
Also go follow Shiva Manjunath on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
We back out here with the amazing Emma Travis to dig into 'quality vs. quantity' when it comes to experimentation programs. The conversation Ben Labay keeps trying to force, but with a bit more nuance to it. There is a time and place for quality. But there is a time and place for velocity and scaling your program up in terms of the number of tests run. How do you know when to do it? And what are the triggers to get you.... ahem.... From A to B. Emma "Biscuits and Gravy" Travis joined me on this episode to go DEEP into this topic. Emma was sick and still trooped through this episode - give her all the flowers!!!

We got into:
- When to focus on velocity vs. when to focus on 'quality' of experiments
- Tips on auditing your program to know if you're too focused on quality vs. quantity
- Smart ways to focus on research when resources are limited

Timestamps:
00:00 Episode Start
3:57 Is Velocity An Appropriate Tracking Metric For Experimentation Maturity?
9:23 A Velocity Increase Doesn't Always Mean A Decline In Quality
11:51 Experimentation Velocity Is Like Flying A Plane
19:19 When To Scale Velocity Vs. When To Focus On Quality
24:33 Run A Heuristic Audit Of Your CRO Program (*Bonus Points For Doing It In Miro)
28:00 What Defines "Quality" In A Test?
36:35 How Do You Communicate A Decline In Velocity Knowing You're Increasing Quality?
40:35 Good Ways To Use AI To Help You Speed Up Research

Go follow Emma Travis on LinkedIn: https://www.linkedin.com/in/emma-travis/
Also go follow Shiva Manjunath on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
How many of you have seen a website teardown and thought "Dude, you literally have 0 data, this is just a vibes analysis?"

I have that thought regularly. And it got me thinking about heuristic audits as a whole, and trying to understand what makes a good heuristic audit good, and a bad heuristic audit bad. I enlisted the help of my mortal enemy Rishi Rawat to explore this. Vibes were immaculate, and he actually said some smart things in this episode (shocker - I know).

We got into:
- The critical missing ingredients to take your heuristic audit from 'shit' to not only trustworthy, BUT valuable
- Where many heuristic audits fail to be meaningfully impactful (despite being positioned as such)
- An exploration of "expert" vs. "non-expert" heuristic audits (and why both are incredibly valuable)

Timestamps:
00:00 Episode Start
2:08 What Is a Heuristic Audit (With Help from Ryan Thomas)
8:53 Good Heuristic Audits Are Systematic in Nature
12:22 The "Goal" of the Heuristic Audit May Be More Important Than the Process
16:20 The Importance Of Data In Informing The Heuristic Audit
22:01 Rishi Actually Says Something Smart
27:07 Heuristic Audits Should Highlight BOTH The Good AND The Bad
35:29 Many Heuristic Audits Don't Simulate Emotion (Yet They Should....)

Go follow Rishi Rawat on LinkedIn: https://www.linkedin.com/in/rawatrishi/
Go check out the homie Ryan Thomas: https://www.linkedin.com/in/ryancharlesthomas/
Also go follow Shiva Manjunath on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
And go get your free ticket for Women in Experimentation - you might even be entered to win some From A to B merch!: https://tinyurl.com/FromAtoB-WIE
Problematic stakeholders could be convinced if they... just got data. Right? Nope. Sometimes it's deeper than that. You gotta understand what makes them tick. Is it data? Is it optics? Maybe you should be running heuristic audits on your stakeholders instead of your website...

For the second time EVER, someone else has changed my mind on something. Leave it to "Kingpin" Finn McKenty, who's apparently gone from Punk Rock MBA to Finn McKenty PhD, to change my mind.

We got into:
- How YouTubers are more data driven than your own CEO (lol)
- Why you should be running heuristic audits not JUST on your website, but on the stakeholders you interact with (and tips to do so effectively)
- Finn gives some general life advice on learning to let go (important when many product folks and CROs don't have the autonomy to actually impact anything)

Timestamps:
00:00 Episode Start
2:46 The analogy of CRO and "Gym" goes so deep
6:25 Even YouTubers are data driven
11:20 People who don't buy into "experimentation" just optimize for different metrics than you
14:28 Psychology of UXers vs. Product/CRO (Finn low key is a psychologist now)
20:01 Running heuristic audits on... stakeholders? (yes - it's a good idea)
25:07 Optimization sometimes means optimizing for 'helping people' (not metrics)
30:16 Sometimes, CROs gotta play the politics game
35:13 Finn offers sage advice in learning how to let go (CROs need to hear this)
49:06 Preach: Samuele Mazzanti

Go follow Finn McKenty on LinkedIn: https://www.linkedin.com/in/finnmckenty/
And go subscribe to his newsletter: https://finnmckenty.beehiiv.com/
Go check out Samuele Mazzanti's post too: https://tinyurl.com/FromAtoB-Samuele
Also go follow Shiva Manjunath on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
And go get your free ticket for Women in Experimentation - you might even be entered to win some From A to B merch!: https://tinyurl.com/FromAtoB-WIE
Do you REALLY know what cookies are? Like really, REALLY know? What about GDPR? What about PII?

I know the words. But what do they REALLY mean? I enlisted the help of Eddie "The Techie" Aguilar to help me simplify some of these complex topics, and help me create meaningful next steps on how to address PII concerns and other marketing-related issues in data collection.

We got into:
- Simplified definitions of cookies, data collection, GDPR, etc. (I'm stupid and like hearing things simplified from smart people)
- First vs. third party cookies (and what it means for your marketing program)
- A/B testing and the importance of NOT collecting PII in your testing tools

Timestamps:
00:00 Episode Start
2:31 What is a Cookie?
7:41 How Cookies Have Been Used Maliciously (Lack of Consent)
9:51 First Party vs. Third Party Data
13:11 Opting Out of Cookies (Explained)
14:45 GDPR
28:20 A/B Testing and Cookies
37:30 PII and A/B Testing

Go follow Eddie Aguilar on LinkedIn: https://www.linkedin.com/in/whoiseddie/
Also go follow Shiva Manjunath on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
And go get your free ticket for Women in Experimentation - you might even be entered to win some From A to B merch!: https://tinyurl.com/FromAtoB-WIE
It happens rarely, but somehow Linda was able to change Shiva's mind on this podcast. And we got it ON THE RECORD. What did she change his mind on? Linda "The Winna" Bustos brought some compelling conversation to the fact that companies CAN differentiate on 'cookie cutter' websites (e.g. Shopify) in really, really creative ways.

We got into:
- The importance of copy when you're using a generic 'cookie cutter' site (e.g. Shopify templates)
- Letting your brand shine through to help you differentiate (including some hilarious examples of how some brands have accomplished this)
- Cool trends Linda has been seeing in the ecom space

Timestamps:
00:00 Episode Start
2:58 Shopify Is Basically A "Paint By Numbers" Website Generator
7:41 Using Shopify When Your Product Truly Is Unique
9:53 Tips On Standing Out From The Crowd On Product Pages
12:40 Mattresses Are An Industry To Keep An Eye On
18:56 Shiva Glazing Liquid Death (For A Good Reason Though)
21:23 Other Creative Brands Which Have Done Cool Shit
27:44 Make Your Copy Memorable (And Palace Has Amazing Copy)
30:30 How Do Luxury Brands Differentiate?
36:01 Product Finders Can Be A Differentiator
41:03 Do Better Competitor Comparison Charts

Go follow Linda on LinkedIn: https://www.linkedin.com/in/lindabustos/
And check out her website Ecom Ideas: https://ecomideas.com/
Also go follow Shiva on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
Do product/CRO people REALLY know UX well? There may be some Dunning-Kruger happening, and maybe we really don't get why UXers keep complaining about our initiatives. Spoiler: Many times, they have a VERY legitimate point! Who better than the freaking VP of Nielsen Norman Group, "Checkmate" Kate Moran, to come dish all the things we're getting wrong about UX, and how to better work together.

We got into:
- What product teams/CRO can learn from UX, and vice versa
- Why product teams really need to focus on longer-term relationships with the customer (and not chasing short-term gains)
- Amazing (and not so amazing) use cases for AI in UX

Timestamps:
00:00 Episode Start
1:52 How Does UX View Experimentation / CRO?
6:51 What Are The Biggest Problems UX Faces As An Industry?
13:03 What Can CROs/Product Learn from UX (And Vice Versa)
17:22 UX/Product Share This In Common: No One Knows What The Fuck We Do
20:55 AI in UX - DON'T (!!!) Blindly Trust It
26:56 What Is "Google Gullibility"?
30:51 Context Matters in UX
38:05 Shiva Is Kinda Ageist (My Bad lol)

Go follow Kate on LinkedIn: https://www.linkedin.com/in/kate-m-moran/
And check out Nielsen Norman's new live online courses! https://www.nngroup.com/training/live-courses/
Also go follow Shiva on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
How many of y'all feel a bit nervous about going to in-person conferences / networking events? André Morys and I have been there before. Would you believe that "Sensei" André is actually an introvert?! Could've fooled me 👀

I wanted to get feedback from someone who RUNS conferences to figure out what tips and tricks we can utilize to get the most out of in-person events, ESPECIALLY coming from two people who are actually pretty introverted (if you can believe it...)

We got into:
- Tips for introverts to "get out of your own head" at conferences / in-person events
- Feeling imposter syndrome at conferences (and how to break the nasty mental hold it has on you)
- Being curious about everyone and everything is probably the BIGGEST tip you can take away

Timestamps:
00:00 Episode Start
3:32 Rules for Conferences (NOT Tips - obey these always)
9:01 How to Be Your Authentic Self At Conferences
12:54 Tips for Introverts at In-Person Events / Conferences
18:24 Feeling Imposter Syndrome at Conferences
27:05 Don't Overthink - Just Be Curious (When Interacting With Humans lol)
29:32 More Tips for Introverts (And Extroverts - It's Your Job to Help Introverts)
37:40 How Do You Shut People Up Who Just Keep Flexing / Selling
50:00 Pace Yourself - The Value of Conferences Happens AFTER The Talks

Go follow André on LinkedIn: https://www.linkedin.com/in/andre-morys/
And if you're interested in signing up for the GMS conference (using the promo code André shared in the episode, which to remind y'all I get 0 kickbacks or dollars from lol) - go buy a discounted ticket! https://www.growthmindedsuperheroes.com/home
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
Also go follow Shiva on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
What REALLY is the impact of leadership on growing an experimentation program? And is "stable leadership" something which may actually be messing up your ability to grow your program?

This was something I wanted to enlist the thoughts of the "Taco King" himself, Lukas Vermeer, on. We got into some great conversations around building trust, being perceived as a credible person who can help grow an experimentation program, and other important things related to leadership and experimentation.

We got into:
- How Lukas REALLY built anti-fragile experimentation at Booking.com (and how some parts MAY have been influenced by the... Dutch Mafia?!)
- The impact of working from home on building and scaling experimentation (spoiler: it DOES make shit harder)
- "Trust" and "Credibility" are two VERY important things in growing experimentation - and on how you actually GET those, well, Lukas and Shiva have thoughts

Timestamps:
00:00 Episode Start
3:02 The Leadership at Booking.com (and how it WAS stable)
8:57 Which Impacts Experimentation More? Stability or Trust?
14:54 The Impact of Working From Home on Growing an Experimentation Program
24:39 Making Experimentation Anti-Fragile (Along With A Fun Lukas Story About It)
40:20 What CAN CRO Do When Leadership is Unstable
47:29 How to be Credible Without Being Annoying
51:03 "PREACH": Manuel da Costa

Go follow Lukas on LinkedIn: https://www.linkedin.com/in/lukasvermeer/
And go follow Manuel on LinkedIn as well! https://www.linkedin.com/in/manueldacosta/
Also go follow Shiva on LinkedIn: https://www.linkedin.com/in/shiva-manjunath/
If you like this content, go nominate us for a community award at: https://experimentationcultureawards.com/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
"Running a Solo CRO program is easy"- No one ever, in the history of the world Erika Sentz (nicknamed "pancakes" for a very good reason...) and Shiva Manjunath have seen their fair share of Solo CRO programs, and wanted to go into tangible tips and tricks you can take away to action TODAY to help you have less stress and get more done as the only person managing that program. We got into: - The importance of building and maintaining relationships as you scale your program. And how exactly to do this!- Where "delegation" can actually be a core unlock to maintaining your sanity AND improving the quality of experimentation- Communication is maybe the most important thing when you're all alone lol- CRO is like.... making pancakes?! Yes. Yes it is.Timestamps:00:00 Episode Start3:00 The Hardest Thing About Being a Solo CRO8:17 Auditing Your Own Program Is Key11:34 Importance of Expectation Setting with Stakeholders17:16 Erika Keeps Talking About Pancakes21:12 Topic Detour: A/B testing PEOPLE on Websites22:51 Communication as a Solo CRO32:26 Building Relationships is The Biggest Unlock42:09 “PREACH” : Natalie A. ThomasGo follow Erika on LinkedIn: https://www.linkedin.com/in/erikasentz/And go show Natalie some love on her LinkedIn Post: https://tinyurl.com/FromAtoBNatalieIf you like this content, go nominate us for a community award at: https://experimentationcultureawards.com/ Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
Do people have legitimate pushback to A/B testing? I mean, some don't. They just hate being proven wrong. But there are others who DO have legitimate pushback. How do we address this?

Ton "The Don" Wesseling offered a ton of helpful thoughts on how to push back on "logical" (but incorrect) reasons why people don't want to A/B test.

We got into:
- Pushing back on "not all tests win" without sounding like an asshole
- Picking your battles wisely, and investing time in those who WANT to test (while not prioritizing those who DON'T want to test... for your own sanity)
- Legitimizing their concerns. Don't talk down to them like they're wrong, because that doesn't help you, nor does it help them

Timestamps:
00:00 Episode Start
04:08 Pushing Back on "The Test May Not Win"
09:50 Navigating Concerns on Testing Redesigns
16:11 Addressing Pushback When It's One vs. Many People (on A/B testing)
25:00 Someone Created Your CRO Role - Go Figure Out How To Be Helpful
32:35 When "Winners" Don't Affect Your Bottom Line

Go follow Ton on LinkedIn, and go nominate people who you think deserve the Experimentation Culture awards run by Ton and others! (and feel free to nominate us there....)
- https://www.linkedin.com/in/tonwesseling/
- https://experimentationcultureawards.com/
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
Pretty simple to make a hypothesis, right? If/then statement, and boom, done. I already did most of the hard work just by having the framework.

Not quite.

A lot of work (and pre-work) SHOULD go into making a hypothesis, so that it can go from a "meh" but functional hypothesis to a hypothesis that not only drives results, but trust in your experimentation program too! And bonus: Juliana "Run the Jewels" Jackson offers 2 columns / buckets of info you probably never considered which should be available to stakeholders to improve the presentation of the hypothesis and test doc as a whole.

We got into:
- What exactly IS a hypothesis (good AND bad)
- Why your hypotheses actually suck (and how to make them better)
- Going beyond the hypothesis to make sure your test ideas match what stakeholders are looking for

Timestamps:
00:00 Episode Start
07:24 Hypotheses Need to Reflect What the Business Truly Cares About
12:54 Example of a GOOD Hypothesis
20:21 Do Hypotheses Need to Be MORE Tactical? Or More Strategic? (In Their Wording?)
27:18 Juliana's Two "Can't Miss" Buckets of Information You MUST Have for Every Test
34:31 Shit You Need to Know: Haley Carpenter

Go follow Juliana on LinkedIn, and check out her amazing "Standard Deviation Podcast" too!
- https://www.linkedin.com/in/juliana-jackson/
- https://www.linkedin.com/company/standard-deviation-podcast/
Make sure you go follow Haley and check out her post too! https://tinyurl.com/FromAtoBHaley
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
Why are we wasting spaghetti? It's delicious! We shouldn't be throwing it at the wall...

Well, Ryan "Big Brain" Lucht definitely is a big-time lover of playing with food. And he makes a compelling argument for spaghetti testing, which led into conversations about prioritization frameworks, whether we should TRY to "test everything," and other topics.

We got into:
- Spaghetti testing (Should you do it? Or nah?)
- Pure experimentation jobs are hard. And you'll have to fight a LOT. But Ryan and Shiva have tips to help you on this journey
- What we can (and can't) learn from big companies who have mature testing programs

Timestamps:
00:00 Episode Start
02:31 What Is Spaghetti Testing?
09:33 Ryan Argues More "At Bats" (Tests) Need To Be The Priority (And He May Have A Point...)
19:30 There Are Some Major Things We Can Learn From Mature Experimentation Programs
26:10 Ryan Explains Experimentation Roles Can Be Frustrating and Demoralizing (WE KNOW RYAN.... WE KNOW)
33:44 More Exploration of Things We Can Learn from Mature Experimentation Programs
43:51 Shit You Need to Know: David Bland

Go follow Ryan Lucht on LinkedIn:
- https://www.linkedin.com/in/ryanlucht/
And check out some of the cool thought leadership on experimentation hosted by Eppo:
- https://www.geteppo.com/outperform
Make sure you go follow David Bland and check out his post too!
- https://tinyurl.com/FromAtoB-David
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
What does it ACTUALLY mean to be "data driven", and are #CROs adequately data driven? Should we be MORE data driven? Are there some opportunities for us to actually leverage intuition more? Will Shiva keep asking Bhavik "The Sick" loaded questions?

Short answer? Yes. Bhavik "The Sick" Patel and Shiva Manjunath discuss this pretty interesting topic, which may seem controversial, but there is actually a decent amount of truth to it.

We got into:
- What it ACTUALLY means to be data driven. Spoiler: the bar is pretty freaking high
- The legitimate (and not so legitimate) ways to utilize "intuition" in the CRO process
- Your coworkers with domain expertise MAY have a point when they override (some) data. As long as it's not entirely ego driven, we should hear them out!

To be clear - we definitely aren't saying "throw away all data and never use it - trust intuition always forever for everything". There is some middle ground to be had, with some very legitimate reasons (some practical, some not) to introduce 'expertise' into the conversation IN ADDITION to data. There is always some middle ground 🙈

Timestamps:
00:00 Episode Start
02:39 Picking The Wrong Metrics = Not Data Driven
6:25 Being "Data Driven" is a High Bar to Clear
13:09 Do CROs NEED to be... TOO Data Driven?
19:13 When the Results Don't Align With Basic Intuition (e.g. Twyman's Law)
24:22 Post-Production Shiva Clarifies What HARKing Really Means
33:13 A Good CRO Has A Healthy Amount of Skepticism - Even When The Results Make Sense
43:03 Shit You Need to Know: Nils Stotz

Go follow Bhavik on LinkedIn:
- https://www.linkedin.com/in/dodonerd/
Make sure you go follow Nils Stotz and check out his post too! https://tinyurl.com/FromAtoBNils
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
If you squint your eyes really hard, there is something to the idea that CROs are simply product managers in "stats" clothing.... right? What ARE the actual differences between product managers and CROs? And how can CROs leverage these differences? Or how can CROs work more efficiently with product teams?

I enlisted the help of Matt and Moshe (M&M), the experts in product management, to explore more of this (in addition to... MUTINY?! 🏴‍☠️)

We got into:
- Messaging matters. How CROs communicate, and work, with product managers dictates the outcomes. Be nice? Communicate well? Watch the relationship flourish and outcomes turn fantastic
- How can CROs gain trust from product? And how much of that 'lack of trust' is on product to... well, do better?
- Can we actually learn something from sociopaths in how we work with / manage stakeholders? (Yes... yes we can)

Timestamps:
00:00 Episode Start
05:18 CROs are effectively... product managers. Right?!
13:55 How CROs HELP product managers by taking their tasks
18:51 Many product managers simply don't know HOW to A/B test.... (and that's OK)
25:17 How can CROs and PMs work together then...
34:45 The old "tell me how a test will do before we run it" conversation
45:27 Shit You Need to Know: Khalid Saleh

Go follow Matt and Moshe on LinkedIn. And DEFINITELY go check out their "Product for Product" podcast too:
- https://www.linkedin.com/in/mattgreenproduct/
- https://www.linkedin.com/in/mikanovsky/
- https://www.linkedin.com/company/product-for-product-podcast/
Make sure you go follow Khalid Saleh and check out his post too! https://tinyurl.com/FromAtoBKhalid
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
CROs are infallible. We believe data! We trust science over opinions! We are LITERALLY never wrong, right?

Well, sometimes we may be right, but we can still be rude about it. 🫣 And no one wants to work with that one guy who, while he may be right, has to let everyone know how right he is EVERY. DAMN. TIME.

I enlisted the help of Michael Aagaard, nicknamed "Shrek" (for very good reasons...), to explore how some #CROs may actually be problematic. And how their own behavior may actually be the reason why CRO and experimentation isn't being adopted or used in an organization.

We got into:
- Messaging and tone matter. You may be right, but HOW you say it may be the difference between "let's run that test" and "I fucking hate that idea"
- You're in the wrong industry if you are "testing to be right"
- Balancing "confidence" without being "arrogant" as a CRO

Timestamps:
00:00 Episode Start
04:46 Soft Skills Which Indicate You May Be a Bad CRO
13:05 Collaboration is Key - SEEKING Out That Collaboration Is King
18:26 It's Not What You Say - It's How You Say It
26:05 Risk Tolerance IS Different for Everyone - Don't Project YOUR Risk Tolerance Onto Others
33:25 Shit You Need to Know: Jason Haddock (How Layoffs Impact CRO)

Go follow Michael Aagaard on LinkedIn (https://www.linkedin.com/in/michalaagaard/)!
And go check out Jason's post too! (https://tinyurl.com/FromAtoBJason)
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
Are you doing 'strategic #CRO'? What the hell even is strategic CRO? And how do you get there? Well, Talia Wolf had a post previously that triggered this discussion, so I had her on to better flesh out this idea. If you want to learn how to get from tactical to strategic experimentation, this is a can't-miss episode.

Plus - we went into some spicy convo about website carousels.....

We got into:
- What rituals define 'tactical' vs. 'strategic' CRO
- What is better to start with: qualitative or quantitative research?
- Tips on breaking out of the 'tactical' hamster wheel to get you into strategic experimentation

Timestamps:
00:00 Episode Start
07:10 Shiva Triggers Talia (Bad Definitions of CRO Box CRO into Tactics)
11:40 What ARE Tactics vs. Strategy?
18:43 Qualitative Data Should Be First - NOT Quantitative
26:41 Ways to Identify If Your Program is Tactical vs. Strategic
35:04 Shit You Need to Know: Carousels... CAN you actually use them?

Go follow Talia Wolf on LinkedIn (https://www.linkedin.com/in/taliagw/)!
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
In many ways, CRO and UX overlap. But how different are they? And how does UX view CRO? We enlisted the help of Victoria Sakal to go deep into this conversation and understand it from a UXer's perspective. Victoria 'The Warrior' Sakal shared some VERY interesting perspectives which can help CROs and UX collaborate more deeply, among other golden nuggets of information....

We got into:
- How UXers view CROs (rightly, or wrongly...)
- Where UX and CRO actually overlap (spoiler: they share many things in common)
- Ways for UX and CRO to strengthen their collaboration

Timestamps:
00:00 Episode Start
02:08 What IS CRO To UXers? How Do Others in the UX Space View CROs?
08:51 UX Seems to Look An Awful Lot Like What CRO Is....
14:20 UXers Should Embody CRO Principles
17:40 Why Do (Some) UXers Ignore Data?
22:37 How to Talk To Defensive UXers
31:45 There is a Need For A Shared Database of Insights Across UX / CRO
40:21 Shit You Need to Know: Sandip Amlani

Go follow Victoria Sakal on LinkedIn (https://www.linkedin.com/in/victoriasakal/)!
Also make sure you guys are following Sandip Amlani on LinkedIn - link to his post here: https://tinyurl.com/FromAtoB-Sandip
Subscribe to our newsletter for more memes, clips, and awesome content! https://fromatob.beehiiv.com/
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!
Your tests keep getting blocked. You're frustrated, shaking your fist at the screen, calling everyone a doo-doo head. We've all been there. It sucks. But on the heels of Gertrud Vahtra giving a talk at Experiment Nation on how to get alignment on testing by running workshops, I HAD to have her on the podcast to go deeper into this topic.

We got into:
- Are your test approval rates too high? Too low? What does that mean for your experimentation program?
- How to appropriately redirect stakeholders so they don't disrupt tests currently in flight, while also keeping them engaged in the process
- Where being passive aggressive CAN be helpful (and Gertrud DEFINITELY approves of this, and Shiva isn't making it up)

Timestamps:
00:00 Start of Episode
1:38 Shit You Need to Know: Gerda Vogt-Thomas
8:34 Is Lack of Business Alignment on CROs to Fix...?
14:00 How Do You Balance Many Stakeholders Without Having TOO Many "Cooks in the Kitchen"
20:51 Your CRO Program Management Lacks Meta-Data, Which Impacts Transparency and Trust in Your Program
27:53 Does Your "Idea to Experiment Run" Ratio Need Improvement...?
33:01 Establishing Clear "Drivers" Helps Reduce Blocking in Experimentation Approvals

Go follow Gertrud Vahtra on LinkedIn (https://www.linkedin.com/in/gertrudvahtra/)! And check out her Experiment Nation talk too! (out soon)
Also make sure you guys are following Gerda Vogt-Thomas on LinkedIn - link to her (super triggering) post here: https://tinyurl.com/FromAtoB-Gerda
If you have listener questions, submit them at https://tinyurl.com/askfromatob for a chance to be featured too!