Ask Women in Product: How do I improve my analytical skills and interpret data to build a feature?

Photo by Luke Chesser via Unsplash

Answer from Amy Lin

Introduction

How are data used in product management?

  • Ideation: to explore unknown opportunities.
  • Prioritization: to evaluate an opportunity so that its impact can be compared with other opportunities, and to set a clear, measurable goal for an initiative that reflects real impact.
  • Post-launch evaluation: to measure the performance of a feature, or to evaluate results from an experiment to decide on the next course of action.
What kinds of data will you work with?

  • Transactional data: These are data generated organically as your application functions, such as order data or user data; they are the data the application needs in order to function properly. As long as you can gain read access to these databases (or a copy of them), you already have them.
  • Web Traffic data: These can usually be tracked automatically with tools like Google Analytics.
  • Behavioral data: These are data that are not typically tracked by default, like clicks on a specific button or the number of impressions of a certain module. Behavioral data can be tracked using a custom event server or through tools like Amplitude or Google Tag Manager. In my experience, you'll typically need to specify manually what you want to track, though there are tools that claim to do this automatically (a small tracking sketch follows this list).
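
To make the behavioral-data point concrete, here is a minimal sketch of tracking clicks on one specific button as a Google Tag Manager custom event, assuming GTM's standard dataLayer custom-event pattern. The selector and event names ("#hero-cta", "cta_click") are placeholders for illustration, not anything prescribed by the article or by GTM.

```typescript
// Minimal sketch: record clicks on one specific button as a custom GTM event.
// "#hero-cta" and "cta_click" are made-up names; a GTM trigger listening for
// the "cta_click" event can forward it to Google Analytics, Amplitude, etc.
type DataLayerEvent = { event: string; [key: string]: unknown };

const dataLayer: DataLayerEvent[] =
  ((window as any).dataLayer = (window as any).dataLayer || []);

document
  .querySelector<HTMLButtonElement>("#hero-cta")
  ?.addEventListener("click", () => {
    dataLayer.push({
      event: "cta_click",   // custom event name the GTM trigger matches on
      buttonId: "hero-cta", // extra context you may want in your reports
    });
  });
```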

Approach

1. Start with a question to answer

For example, a vague hypothesis about onboarding can be sharpened step by step until it is measurable:

  • “A better onboarding experience will activate more new users.”
  • “A better onboarding experience will turn more new users into active users (where active users are users with ≥10 content clicks/day).”
  • “A better onboarding experience will turn more new users with < 5 content clicks/day into users with ≥ 10 content clicks/day.”
Then interrogate the hypothesis with questions the data can answer:

  • How many new users turn into active ones historically?
  • How confident are you that a new onboarding experience will successfully convert more new users into active users?
  • How many new users, at a minimum, would you need to convert to active users to achieve the goal?
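
As a rough sketch of how the last two questions might be answered with data, the snippet below computes a historical cohort's activation rate against the 10-clicks-per-day threshold and estimates how many extra activations a target rate implies. The data shape, the numbers, and the idea of summarizing each user by a single clicks-per-day figure are all simplifying assumptions.

```typescript
// Minimal sketch: measure historical activation and size the goal.
// The NewUser shape and all numbers below are illustrative assumptions.
interface NewUser {
  id: string;
  contentClicksPerDay: number; // e.g. average clicks/day over the first week
}

const ACTIVE_THRESHOLD = 10; // "active" = at least 10 content clicks/day

// What fraction of a historical new-user cohort became active?
function activationRate(cohort: NewUser[]): number {
  const activated = cohort.filter(
    (u) => u.contentClicksPerDay >= ACTIVE_THRESHOLD
  ).length;
  return cohort.length > 0 ? activated / cohort.length : 0;
}

// How many additional activations does a target rate imply per month?
function extraActivationsNeeded(
  monthlySignups: number,
  baselineRate: number,
  targetRate: number
): number {
  return Math.round((targetRate - baselineRate) * monthlySignups);
}

// e.g. 2,000 signups/month, 18% baseline, 22% target: roughly 80 extra activations.
console.log(extraActivationsNeeded(2000, 0.18, 0.22));
```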

2. Make sure you get the context right

3. Beware of potential external influences

4. Don’t lose sight of the human factor

Troubleshooting / Q&A

1. What if I don’t have the data?

  • Take a step back and look for alternatives: do you have anything that is close enough to what you are looking for? Sometimes behaviors manifest themselves in another form in other data sources available to you. For example, if you want to find the CTR of a specific button but that metric is not tracked, you may still be able to infer it if you know that the page the button leads to can only be reached through that button (assuming direct traffic, i.e. visits from typing the URL, is negligible). In that scenario, the number of pageviews of the destination page should be roughly equal to the number of clicks on the button, which is good enough for a ballpark of how the button performs (see the sketch after this list). When looking for proxies, don't be afraid to get creative and think outside the box!
  • Plan for the future: if you cannot find a viable alternative, can you make an estimate now and add tracking so you can adjust that estimate later? All too often you find yourself wishing that certain numbers were tracked when they are not. What you can do instead is evaluate whether it is worth adding tracking for this component for the long-term good. Also, if your development resources are tight and you don't mind getting your hands dirty, check out Google Tag Manager, which lets you set up tracking events in a web UI after a one-time setup (very similar to installing Google Analytics' JS tracking code) so you can see reports in Google Analytics. It does require some understanding of HTML/CSS and JavaScript, but you'll be thrilled by how much more you get to explore with the data it collects.
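
Here is a minimal sketch of the pageview proxy described in the first bullet, under the stated assumptions that the destination page is reachable only through that button and that direct traffic is negligible. The numbers are made up purely to show the arithmetic.

```typescript
// Minimal sketch: infer a button's CTR from pageview counts when the click
// itself is not tracked. Assumes the destination page can only be reached
// through this button and that direct visits are negligible.
function proxyButtonCtr(
  destinationPageviews: number, // pageviews of the page the button leads to
  sourcePageviews: number       // pageviews of the page the button sits on
): number {
  return destinationPageviews / sourcePageviews;
}

// e.g. 1,200 destination pageviews out of 30,000 source pageviews
// gives a ballpark CTR of about 4%.
const estimatedCtr = proxyButtonCtr(1200, 30000);
console.log(`Estimated CTR ≈ ${(estimatedCtr * 100).toFixed(1)}%`);
```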

2. What do I do when the data doesn’t prove (or disprove) my hypothesis?

  • Differentiate the seemingly indifferent: it is entirely possible that the change makes no difference to users. It is also possible that it does, but the difference does not show up in the main metric you've chosen. It is in situations like this that supportive metrics and other behavioral data can help you see whether users are secretly finding the new treatment more likable. For example, if you believe the new headline copy is much clearer than the original, what behavior should you expect to see? Does the call to action (CTA) below the headline get more clicks because users now know what they can do with your product? Do users traverse more pages per session? Does session time lengthen? (A sketch of this comparison follows this list.) Factoring in the technical effort and debt of keeping or removing the change, you can then decide whether it is worth keeping.
  • Find out what isn’t working and what you want to do about it: your experiment is not working. Do not despair! There is still hope. Carefully examine what went wrong: did the treatment actually prevent users from doing what they were doing? What might have led to an unwanted result? Once you have a few hypotheses about what might not work, can you iterate on them? Can you address those issues in a timely, impactful manner?
  • Avoid the same mistakes: even if you decide not to roll out the feature in the end, you still take away new learnings from the experiment. What did you think would work but turned out to be a dud? What did you learn about your users? How can you and your company avoid making the same mistake down the road? What should you do differently next time? Even if this treatment doesn't end up serving your users, it is not in vain if you can carry the lessons learned into the next one.
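
As a sketch of the supportive-metric comparison in the first bullet above, the snippet below computes the relative lift of a few secondary metrics between control and treatment. The metric names and numbers are illustrative assumptions, and in practice you would also check statistical significance rather than acting on raw lift alone.

```typescript
// Minimal sketch: when the primary metric is flat, look at supportive metrics.
// The metrics chosen here (CTA clicks, pages/session, session length) mirror
// the examples above; the shapes and names are illustrative, not prescriptive.
interface VariantStats {
  ctaClicksPerUser: number;
  pagesPerSession: number;
  avgSessionSeconds: number;
}

// Relative lift of treatment over control, e.g. 0.06 means +6%.
function relativeLift(control: number, treatment: number): number {
  return (treatment - control) / control;
}

function compareSupportiveMetrics(control: VariantStats, treatment: VariantStats) {
  return {
    ctaClicks: relativeLift(control.ctaClicksPerUser, treatment.ctaClicksPerUser),
    pagesPerSession: relativeLift(control.pagesPerSession, treatment.pagesPerSession),
    sessionLength: relativeLift(control.avgSessionSeconds, treatment.avgSessionSeconds),
  };
}

// e.g. a flat primary metric but more CTA clicks and longer sessions
// would suggest the clearer headline is doing something useful.
console.log(
  compareSupportiveMetrics(
    { ctaClicksPerUser: 0.5, pagesPerSession: 4.1, avgSessionSeconds: 210 },
    { ctaClicksPerUser: 0.53, pagesPerSession: 4.2, avgSessionSeconds: 218 }
  )
);
```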

Conclusion

Do you have a question? Ask Women in Product!
