Narmada Jayasankar
Atlassian Head of Product Management | March 27
At every level, the core PM competencies remain the same: driving clarity and alignment through clear communication, delivering measurable outcomes for the business, and influencing without authority. But the ambiguity you are required to clarify, the impact of the outcomes you drive, and the seniority of the stakeholders you influence all increase as you become more senior. If you are a people manager, the size of the PM team and the diversity of PM roles within it increase as you become more senior, and you will need to start thinking about strategies to grow and retain talent at scale. In a nutshell: it's the same core competencies, but you are expected to operate with greater ambiguity, deliver greater impact, and have broader influence commensurate with your seniority.
Shahid Hussain
Google Group Product Manager, Android | May 22
No one can, or should, ever be sure that they have a 100% right product strategy. But you can do a lot to de-risk your approach, and your tactics should vary depending on how much time you have to plan.

* Is your strategy ultimately going to drive the change in behaviour you want? Find the key participants in your strategy - e.g. the customers - and talk, talk, talk to them. You'll learn a ton from the first 5-10 conversations, and suddenly you'll start to hear the same themes and be able to predict what they'll say. Then you can move on.
* Read, and connect with people who are familiar with this situation in your industry or other industries. How did things work out? Is the current market/environment similar enough that you can draw conclusions?
* The more experienced you are, the more confident you can be about relying on product intuition. A phrase I often use is "we've seen this movie before", and it's surprising how many times the same situation gets repeated.
Tanguy Crusson
Atlassian Head of Product, Jira Product Discovery | December 19
You should think of it this way: the solution should be ready to be shipped both when it's the first, shittiest version and when it's the best version of itself. The question is not so much "is the solution ready?" but "ready for whom?" I've witnessed teams get too excited when creating a new product and open the floodgates to let anyone try it too early. That usually doesn't end well... You don't want many people jumping on the solution when it's not ready for them: first impressions won't be good, they won't stick around, and you'll need to work hard to get them to try again.

Here's the approach we take in my teams: for anything we ship, whether that's big new features in Jira Product Discovery or new products (including Jira Product Discovery itself), we work with progressively more customers (10 -> 100 -> 1000) before making it generally available. This lets us test the solution very early with a handful of customers to get feedback and make sure it delivers on its promise for them, and it ensures that everyone else only gets the solution when it's ready for them - thereby increasing retention and minimizing churn. If you want to learn more, you can watch this talk I gave about the process we went through when creating Jira Product Discovery. You can also read more about this topic in the Atlassian Product Discovery handbook, which we wrote to help with things like this.

0->10 customers (preview)

In the first phase of early-stage bets we work with a maximum of 10 customers. We unpack the problem, test solutions, and iterate fast - a process that's best done with a small number of users/customers who feel the problem the most. And we get the solution right for them. It's easier and much more focused to do it this way than to "throw it out there and see what sticks". Users who feel the pain the most will be happy to work with us, and we can chat with them easily on Slack/Zoom/email.
They're happy to work with incomplete solutions (we can take a LOT of shortcuts), and we can avoid piling on untested assumptions. If the solution doesn't work we can throw it away - it's cheap, no harm done. To demonstrate progress to leadership we use the metrics that best represent what we're trying to prove at each phase. For the private preview of Jira Product Discovery we started with: 10 active teams have been using the product for more than 3 months and plan to continue using it when we enter beta.

10->100 customers (alpha)

Then we progressively invite more customers to use the solution. At this stage we usually need to polish rough edges and address more scenarios, based on what we learn from working with customers who have varying needs and who may be less willing to work with a rough prototype. It's also harder to collaborate with every customer 1-on-1, so we need to create better onboarding material, demo videos, etc. We move support to a community forum. Our success metric at this stage when creating Jira Product Discovery was still focused on problem-solution fit, but for more customers: a product-market fit score of 40% or greater with 100 active teams (I highly recommend the product-market fit score survey, and you can read all about it here).

100->1000 customers (beta)

At some point we become ready to share with more users - in our case, for Jira Product Discovery, it was when we reached a 50% PMF score with more than 100 active teams. At that stage we needed to focus on making the solution fully self-service: we couldn't afford to have to talk to every customer at some point in their journey. So we focused on in-app onboarding, improved usability, polished the design, scaled the technical implementation, trained support and sales teams, etc. It's really about making it ready for prime time.
It was also time to change how we measure success by introducing more metrics that represent the health of the funnel as more customers discover, try, and adopt the solution on their own. That's when we started adopting pirate metrics (AARRR), and here's a great read about them. In the early days of that stage we also focused on validating a pricing model, but that's a topic for another post. Once we were done with all that - a high PMF score with 1000 customers, healthy conversion rates and retention, low churn, validated pricing - that's when we decided to make Jira Product Discovery generally available.

These phases work for us - that doesn't mean they will work for you, but I believe this general approach applies in a lot of contexts. Basically: don't ask "when is the solution ready to be shipped?" but "who should we be working with right now, given the solution's state of readiness and validation?"
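The 40% bar mentioned above comes from the standard product-market fit survey: the score is simply the share of respondents who say they would be "very disappointed" if they could no longer use the product. A minimal sketch of that calculation, using hypothetical survey data:

```python
from collections import Counter

def pmf_score(responses):
    """Share of respondents answering 'very_disappointed' --
    the standard product-market fit survey score."""
    counts = Counter(responses)
    return counts["very_disappointed"] / len(responses)

# Hypothetical responses from 10 active teams
responses = (
    ["very_disappointed"] * 5
    + ["somewhat_disappointed"] * 3
    + ["not_disappointed"] * 2
)

score = pmf_score(responses)
print(f"PMF score: {score:.0%}")  # 50%, above the 40% bar
```

In practice you would also track the score per cohort over time, since the thresholds above (40% at alpha, 50% at beta) are gates between phases.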
Reid Butler
Cisco Director of Product Management | December 20
This one is fairly easy: add value. As product managers we need to ensure we are adding value for our organization by understanding the market (and our customers) and guiding the strategy to be successful in that market. It's easy to be a product expert, but we need to focus on being market and strategy experts. In my career, some key examples of adding value are:

1. Know my market better than anybody else. When I am the expert on what our market needs, both short and long term, I add significant value in defining and driving our strategy. My product can't be successful without this. When we are proven right in our strategy definition and market validation, we win.

2. Build and foster relationships. I work hard at establishing relationships around the organization where I am working. These enable me to be effective in cross-team collaborations and make it easier to drive alignment across the organization. My relationships add value to me and my team.

3. Be an expert. When you are viewed as an expert and continually show your expertise in an area the organization needs, it's easy to be seen as somebody who deserves that promotion. Show that your expertise drives direct value for your organization, with clear successes.
Nikita Jagadeesh
Google Product Lead - Google Cloud | January 23
I currently work at the intersection of enterprise security and AI, and it is incredible to see the use cases that have emerged for AI in this space.

* User research: As I mentioned in one of the earlier questions, AI tools can be a fantastic way to understand user, market, and competitive trends. For example, you can analyze online user reviews of your product to understand key functionality and usability gaps.
* Product functionality: Within security SaaS we often use the framework of detect, investigate, and resolve. AI is changing each of these experiences from a product development perspective. For example, within 'detect', AI is enabling us to develop product experiences that help organizations more proactively understand the attacks their orgs are most vulnerable to. Leveraging machine learning and external data sources, we can score attacks to help understand how significant a vulnerability truly is. Within remediation, AI helps develop automated playbooks, based on other similar playbooks, that help users resolve issues more quickly and get external data about how other orgs are resolving the issue.
* AI experiences: In addition to augmenting the security workflow to make it more productive and effective, gen AI is also enabling us to create net-new experiences for prospects and customers. For example, if an organization doesn't have the security skill set to complete one of the tasks across detect/investigate/resolve, what role could AI play in filling the gap? How can AI be leveraged to empower shift-left security in an organization so that developers are encouraged to incorporate security from the get-go in their designs?

There is so much potential for how AI can fundamentally change the product development process, and I am excited to see all the innovation organizations bring to their products over the next two years.
Jamil Valliani
Atlassian Vice President / Head of Product - AI | December 20
I find that there are 3 basic traits a team looks for a product manager to provide in any project:

* Create Clarity - Does the product manager have the ability to disentangle the many signals a team may be sorting through and help everyone get aligned on a plan or point of view?
* Generate Energy - Can the product manager effectively create the momentum the team needs to get a project done? In early career stages, this is often the daily mechanics: running the stand-ups, prioritizing bugs in a timely fashion, quickly and decisively resolving open questions, and so on - all things that help a team build energy and momentum towards delivery.
* Deliver Results - It's important to show that you can put points on the board, both individually and through leading your feature team. Knock out some items that have been on the backlog for too long; help see that stubborn feature that's been stuck in development for too long through to delivery.

If you can show the team multiple examples of doing the above 3 things consistently and repeatably, I expect you'll build trust and influence with your new team.
Victor Dronov
Atlassian Group Product Manager, Trello Enterprise | December 20
Okay, you are not happy with your recent track record - a good opportunity to flex your analysis, storytelling, and "marketer" muscles.

* Goals. These features didn't seem to support your goal - but they likely supported someone else's (even if you were unhappy about it). Demonstrate how you supported those goals, or how you thought outside the scope of your immediate team to support a fellow team.
* Learning. Did these features fail, even though you knew they would from the start? Focus your story on what your team and organization learned by shipping these features, and how you had a chance to apply that learning to do better in what followed. This could include external (customers) and internal (process, best practices) learning.
* Leadership. You weren't happy with what sounds like a top-down request to build these features. Likely your team wasn't thrilled either. Tell a story about how you helped them "disagree and commit", stay motivated, and deliver what just needed to be done.
Aindra Misra
BILL Group Product Manager (Data Platform, DevEx and Cloud Infrastructure) | Formerly Twitter/X | August 15
Your roadmap should have just enough detail at the top level to explain three things:

* What? Summary of the problem, the high-level potential solution, and links to resources (documents, diagrams, etc.)
* Why? The value proposition and how it maps to business goals and priorities
* When? Delivery time. It's great to break the delivery into smaller chunks with clear milestones for each phase.

The rest of the details and granularity should live outside the roadmap, in your execution process/tools.
Yogesh Paliwal
Cisco Director of Product Management | December 6
Many data-driven product management (PM) teams overlook long-term strategic KPIs, such as Customer Lifetime Value (CLV), by focusing on short-term metrics like quarterly margins or one-off transactions. This approach can be detrimental, as retaining customers typically yields higher CLV and reduces churn-related costs.

Another critical KPI often missed is Feature Discoverability and Time to Value. Despite having sophisticated features, users rarely utilize them due to:

* Difficulty finding features: users struggle to locate the features they need.
* Longer time to realize value: understanding and realizing the benefits of these features often takes longer than with competing alternatives.

By prioritizing these long-term strategic KPIs, product teams can enhance adoption rates, accelerate customer value realization, and ultimately drive sustainable growth and customer loyalty.
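As a back-of-the-envelope illustration of the CLV metric discussed above, a common simplification for subscription products is margin-adjusted monthly revenue times expected lifetime, where expected lifetime is 1 / monthly churn. The numbers below are hypothetical; real models typically discount future revenue and segment by cohort:

```python
def customer_lifetime_value(monthly_revenue, gross_margin, monthly_churn):
    """Simple CLV: expected customer lifetime is 1 / churn rate (in months)."""
    expected_lifetime_months = 1 / monthly_churn
    return monthly_revenue * gross_margin * expected_lifetime_months

# Hypothetical customer: $100/month, 70% gross margin, 5% monthly churn
clv = customer_lifetime_value(100, 0.70, 0.05)
print(f"CLV: ${clv:,.0f}")  # $1,400

# Halving churn doubles CLV -- which is why retention tends to matter
# more than the one-off transaction metrics mentioned above
clv_retained = customer_lifetime_value(100, 0.70, 0.025)
```

This is why a team optimizing a quarterly margin number can still be destroying long-term value if the same decisions are nudging churn upward.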
Deepti Pradeep
Adobe Director of Product Management, Growth | February 20
More often than not, an experiment fails. Failed experiments are the biggest sources of learning. Was your hypothesis proven wrong? Did the test design not push far enough? Were there data anomalies? Were there any positive signals in a subgroup? One needs to be careful in slicing the data many ways, as the validity of an A/B test depends on having enough sample to detect the effect size. Cut it too much and you might be cherry-picking the data, losing its meaning for decision making - or worse, you may lose yourself in a thousand cuts. I have found it best to align failed test reads directionally with a broader learning agenda and lean on user testing and user research to come at the hypothesis from a few different angles. In fact, even if the test is a winner, I would still approach the hypothesis in different creative ways until you find paths that are optimal for users and the business. There is never an end to experimentation.
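The warning above about slicing can be made concrete: the same lift that is borderline significant in the full population all but disappears inside a small subgroup, because the standard error grows as the sample shrinks. A sketch using a standard two-proportion z-test with hypothetical numbers:

```python
import math

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Full test: 10,000 users per arm, 10.0% vs 10.8% conversion
z_full = two_proportion_z(1_000, 10_000, 1_080, 10_000)

# The same lift inside a 5% subgroup: only 500 users per arm
z_slice = two_proportion_z(50, 500, 54, 500)

print(round(z_full, 2), round(z_slice, 2))  # ~1.85 vs ~0.41
```

With |z| >= 1.96 as the usual 95% significance bar, the full test is close to significant while the slice is nowhere near it, so a "positive signal in a subgroup" should be treated as a hypothesis for the next test, not a conclusion.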