Install this specific skill from the multi-skill repository:

```bash
npx skills add maxvaega/awesome-skills --skill "app-analytics-strategist"
```
# SKILL.md
name: app-analytics-strategist
description: Expert digital data analytics consultant for designing and implementing data-driven growth strategies for mobile and digital applications. Use this skill when users need help with app analytics strategy, metrics selection, analytics framework implementation, cohort analysis, user segmentation, A/B testing, customer journey mapping, retention optimization, or choosing analytics tools. Applies to product managers, growth teams, and developers building data-driven applications across all platforms and industries seeking to optimize user engagement, retention, and revenue through analytics.
# App Analytics Strategist
## Overview
Expert consultant specializing in data analytics strategies for mobile and digital applications. Provide comprehensive guidance on analytics frameworks, metrics selection, tool implementation, and data-driven growth strategies. Help teams transform from intuition-based to data-informed decision-making through proven methodologies and best practices.
## Core Capabilities
### 1. Analytics Framework Design
Guide the selection and implementation of appropriate analytics approaches based on business maturity and objectives:
Four Analytics Types:
- Descriptive Analytics: Understanding what happened through historical data analysis
- Diagnostic Analytics: Identifying why specific patterns occurred
- Predictive Analytics: Forecasting future behaviors using ML and statistical models
- Prescriptive Analytics: Recommending specific actions based on predictions
When to use each:
- Start with descriptive analytics to establish baselines and understand current state
- Add diagnostic analytics when patterns need explanation
- Implement predictive analytics once sufficient historical data exists (typically 6+ months)
- Deploy prescriptive analytics when the organization can act on automated recommendations
Deliverables to help create:
- Analytics maturity assessment
- Phased implementation roadmap
- Framework selection recommendations
- Tool and platform requirements
### 2. North Star Metric Definition
Help identify and validate the single metric that captures core product value:
Definition Process:
1. Identify Core Value: What fundamental value does the product deliver to users?
2. Find Measurable Proxy: Which metric best represents this value?
3. Validate Leading Indicator: Does this metric predict long-term success?
4. Ensure Actionability: Can the team influence this metric through product decisions?
Industry Examples for Inspiration:
- Spotify: "Time spent listening"
- Airbnb: "Nights booked"
- Netflix: "Hours watched"
- Duolingo: "Daily active learners"
Common Pitfalls to Avoid:
- Choosing vanity metrics disconnected from business value
- Selecting lagging indicators that don't inform daily decisions
- Picking metrics the team cannot influence
- Defining multiple "North Star" metrics that dilute focus
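As a minimal sketch of step 3 above (validating the leading indicator), assuming an illustrative event export with `user_id`, `event`, and `timestamp` columns (the file name and the `signup` / `lesson_completed` event names are hypothetical), bucket users by early core-value activity and compare later retention:

```python
import pandas as pd

# Hypothetical event export: one row per event with user_id, event, timestamp columns.
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

signup_at = events[events["event"] == "signup"].groupby("user_id")["timestamp"].min()
events["signup_at"] = events["user_id"].map(signup_at)
events["days_since_signup"] = (events["timestamp"] - events["signup_at"]).dt.days

# Candidate North Star proxy: core-value actions in the first week
# ("lesson_completed" is a placeholder for your product's core value event).
week1_value = (
    events[(events["event"] == "lesson_completed") & (events["days_since_signup"] < 7)]
    .groupby("user_id")
    .size()
)

# Crude leading-indicator check: any activity between day 30 and day 60.
retained = (
    events[(events["days_since_signup"] >= 30) & (events["days_since_signup"] < 60)]
    .groupby("user_id")
    .size() > 0
)

check = pd.DataFrame({"week1_core_actions": week1_value, "retained_d30_60": retained})
check["week1_core_actions"] = check["week1_core_actions"].fillna(0)
check["retained_d30_60"] = check["retained_d30_60"].fillna(False).astype(int)

buckets = pd.cut(check["week1_core_actions"], bins=[-1, 0, 2, 5, float("inf")])
print(check.groupby(buckets)["retained_d30_60"].mean())  # retention rate by early-value bucket
```

If retention rises steadily with early core-value activity, the candidate metric behaves like a leading indicator; if not, revisit step 2.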
### 3. Cohort Analysis Implementation
Design and implement cohort analysis strategies to understand user behavior patterns over time:
Cohort Types:
Acquisition Cohorts (group by signup date):
- Perfect for tracking retention trends
- Compare marketing campaign effectiveness
- Analyze seasonal patterns
- Measure product improvements across time
Behavioral Cohorts (group by specific actions):
- Identify what drives retention vs churn
- Understand feature impact on engagement
- Optimize onboarding effectiveness
- Measure activation patterns
Implementation Steps:
1. Define cohort criteria clearly and consistently
2. Choose appropriate analysis timeframes (Day 1, 7, 30, 90)
3. Select relevant metrics (retention, revenue, engagement, feature usage)
4. Build comparison framework to identify trends
5. Create actionable insights from patterns
6. Iterate based on findings
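The steps above can be sketched minimally with pandas, assuming a hypothetical event export with `user_id` and `timestamp` columns (file and column names are illustrative):

```python
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])  # hypothetical export

# Acquisition cohort = calendar week of each user's first event.
first_seen = events.groupby("user_id")["timestamp"].transform("min")
events["cohort_week"] = first_seen.dt.to_period("W").dt.start_time
events["weeks_since_signup"] = (events["timestamp"] - events["cohort_week"]).dt.days // 7

# Unique active users per cohort per week offset.
active = (
    events.groupby(["cohort_week", "weeks_since_signup"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)

# Retention = share of each cohort's week-0 users who are active again in week N.
retention = active.div(active[0], axis=0).round(3)
print(retention)  # rows: cohort week, columns: weeks since signup
```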
Critical Questions to Answer:
- When does churn typically occur and why?
- Which acquisition sources bring the most valuable users?
- How do different user groups behave over their lifecycle?
- What activation patterns predict long-term retention?
### 4. User Segmentation Strategy
Design segmentation approaches for personalized experiences at scale:
Segmentation Types:
Demographic Segmentation:
- Age, gender, language, location
- Best for: Localization and basic targeting
Behavioral Segmentation:
- Login frequency, features used, journey stage
- Best for: Experience optimization and personalization
Psychographic Segmentation:
- Interests, values, lifestyle, motivations
- Best for: Messaging and emotional resonance
Technographic Segmentation:
- Device type, OS version, browser
- Best for: Technical optimization and compatibility
Segmentation Best Practices:
- Start with 3-5 key segments, expand as needed
- Ensure segments are mutually exclusive and collectively exhaustive
- Make segments actionable with different strategies per segment
- Update segmentation as the product and user base evolve
- Validate segment differences with statistical testing
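A minimal rule-based behavioral segmentation sketch, assuming a hypothetical per-user summary table; the column names and thresholds are placeholders to calibrate against your own data:

```python
import pandas as pd

# Hypothetical summary: user_id, sessions_last_30d, days_since_last_login.
users = pd.read_csv("user_summary.csv")

def segment(row):
    # Mutually exclusive, collectively exhaustive rules (illustrative thresholds).
    if row["days_since_last_login"] > 30:
        return "dormant"
    if row["sessions_last_30d"] >= 20:
        return "power_user"
    if row["sessions_last_30d"] >= 5:
        return "core_user"
    return "casual_user"

users["segment"] = users.apply(segment, axis=1)
print(users["segment"].value_counts(normalize=True))  # segment mix, to sanity-check sizes
```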
Benefits:
- Increased user activation
- Faster time-to-value
- Optimized in-app communication
- Higher conversion rates
- Better product-market fit
### 5. Product-Led Growth (PLG) Strategy
Design PLG approaches where the product itself drives acquisition, conversion, and expansion:
Core PLG Principles:
Contextual Onboarding:
- Show only what's relevant to accelerate value
- Progressive feature disclosure
- Interactive tutorials and tooltips
- Optimize time-to-first-value
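To make time-to-first-value measurable, a sketch like the following computes the median delay from signup to the first value moment, assuming hypothetical `signup` and `report_created` events in an event log:

```python
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])  # hypothetical export

signup = events[events["event"] == "signup"].groupby("user_id")["timestamp"].min()
# "report_created" stands in for whatever your product's first value moment is.
first_value = events[events["event"] == "report_created"].groupby("user_id")["timestamp"].min()

ttfv = (first_value - signup).dropna()  # users who never reached value are excluded here
print("Median time-to-first-value:", ttfv.median())
print("Share reaching value within 24h:", (ttfv <= pd.Timedelta(hours=24)).mean())
```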
Freemium or Free Trial:
- Lower barriers to entry
- Let users experience value before purchasing
- Build trust through product quality
- Convert based on demonstrated value
Self-Service Experience:
- Enable autonomous exploration
- Reduce sales dependency
- Provide instant product discovery
- Offer in-app help and documentation
Network Effects:
- Increase product value with more users
- Build viral growth mechanisms
- Integrate collaboration features
- Leverage social proof
PLG Success Examples:
- Zoom: Free meetings with usage-based upgrades
- Slack: Team-based growth with workspace expansion
- Duolingo: Free learning with premium features
- Spotify: Freemium model with conversion optimization
### 6. Metrics Selection and Monitoring
Recommend appropriate metrics based on product type, stage, and objectives:
User Engagement Metrics:
- Session duration and frequency
- Feature usage patterns
- DAU/MAU and stickiness ratio
- User journey completion rates
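The DAU/MAU stickiness ratio above can be computed directly from a raw event log; a minimal sketch, assuming hypothetical `user_id` and `timestamp` columns:

```python
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])  # hypothetical export
events["date"] = events["timestamp"].dt.date

# Daily and monthly unique active users.
dau = events.groupby("date")["user_id"].nunique()
mau = events.groupby(events["timestamp"].dt.to_period("M"))["user_id"].nunique()

# Stickiness: average DAU within a month divided by that month's MAU.
avg_dau = dau.groupby(pd.to_datetime(dau.index).to_period("M")).mean()
stickiness = (avg_dau / mau).round(3)
print(stickiness)  # ~0.20 means the average user is active about 6 days per month
```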
Retention Metrics:
- Day 1, 7, 30, 90 retention rates
- Cohort retention curves
- Resurrection rates (returning churned users)
- Long-term retention patterns
2025 Benchmarks:
- Day 7 Retention (iOS): 6.89% average
- Day 30 Retention (iOS): 3.10% average
- Top performers: 2-3x these benchmarks
Churn Metrics:
- Overall churn rate
- Churn by cohort and segment
- Time to churn
- Churn reasons and patterns
In-App Behavior Metrics:
- Click-through rates
- Conversion funnels
- Purchase patterns
- Navigation paths
Performance Metrics:
- Load times and responsiveness
- Crash rate and stability
- Bug reports and severity
- API response times
Metric Selection Framework:
1. Align with business objectives
2. Ensure actionability (can influence through decisions)
3. Balance leading and lagging indicators
4. Limit to 5-7 key metrics to avoid analysis paralysis
5. Define clearly how each metric is calculated
### 7. A/B Testing Program Design
Establish rigorous A/B testing frameworks for data-informed optimization:
Testing Best Practices:
Test One Variable at a Time:
- Isolate changes to identify precise causes
- Example: Test button color separately from button text
- Avoid confounding variables
Statistical Significance:
- Calculate required sample size before testing
- Use 95% confidence level as standard
- Account for multiple comparison problems
- Wait for sufficient data before declaring winners
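A minimal sketch of the pre-test sample-size estimate and a two-sided two-proportion z-test, using the standard formulas with SciPy; the baseline rate, minimum lift, and traffic numbers are illustrative:

```python
from math import sqrt
from scipy.stats import norm

def sample_size_per_variant(p_baseline, min_lift, alpha=0.05, power=0.8):
    """Approximate users needed per variant to detect an absolute lift in a conversion rate."""
    p2 = p_baseline + min_lift
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    p_avg = (p_baseline + p2) / 2
    n = ((z_alpha * sqrt(2 * p_avg * (1 - p_avg))
          + z_beta * sqrt(p_baseline * (1 - p_baseline) + p2 * (1 - p2))) ** 2) / (min_lift ** 2)
    return int(n) + 1

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))
    return z, p_value

# Illustrative numbers: 4% baseline conversion, want to detect a +1 point absolute lift.
print("Needed per variant:", sample_size_per_variant(0.04, 0.01))
print("z, p-value:", two_proportion_z_test(conv_a=400, n_a=10_000, conv_b=470, n_b=10_000))
```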
Clear Hypotheses:
- Format: "Changing X from Y to Z will increase metric M by N%"
- Define primary and secondary metrics
- Set success criteria before testing
- Document expected impact
Continuous Monitoring:
- Track tests in real time for anomalies
- Check for segment-specific effects
- Validate winners with follow-up tests
- Document learnings systematically
Testable Elements:
- Onboarding flows and tutorials
- Push notification content and timing
- Paywall positioning and pricing display
- Feature placement and UI layouts
- Copy and calls-to-action
- Visual design and color schemes
Common Mistakes to Avoid:
- Stopping tests too early
- Testing too many changes simultaneously
- Ignoring statistical significance
- Not accounting for novelty effects
- Failing to validate winning variants
### 8. Customer Journey Mapping
Create comprehensive journey maps to optimize every touchpoint:
Implementation Process:
1. Define User Personas:
- Based on real user research, not assumptions
- Include demographics, goals, motivations, pain points
- Create 3-5 primary personas representing key segments
2. Identify Key Touchpoints:
- Awareness: Ads, social media, word-of-mouth, search
- Consideration: Landing pages, reviews, comparisons
- Acquisition: Download, signup, first launch
- Activation: Onboarding, first value moment, feature discovery
- Retention: Regular usage, habit formation, deepening engagement
- Revenue: Purchases, subscriptions, upgrades
- Referral: Sharing, reviews, recommendations
3. Map Emotions and Friction:
- Where do users feel frustrated or confused?
- Which steps cause the most drop-off?
- What delights users and exceeds expectations?
- Where are improvement opportunities?
4. Visualize the Journey:
- Use swim lanes showing different departments/systems
- Include timeline and typical duration
- Show emotional states throughout journey
- Highlight critical moments and decision points
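A minimal funnel drop-off sketch over a hypothetical event log, counting unique users reaching each journey step; the step names are illustrative, and event ordering within a user is ignored for simplicity:

```python
import pandas as pd

events = pd.read_csv("events.csv", parse_dates=["timestamp"])  # hypothetical export
funnel_steps = ["first_open", "signup", "onboarding_complete", "first_purchase"]  # illustrative

# Users who performed each step at least once (a stricter funnel would also check timestamps).
reached = {step: set(events.loc[events["event"] == step, "user_id"]) for step in funnel_steps}

previous = None
for step in funnel_steps:
    users = reached[step] if previous is None else reached[step] & previous
    rate = len(users) / len(previous) if previous else 1.0
    print(f"{step:22s} users={len(users):6d}  step conversion={rate:.1%}")
    previous = users
```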
Benefits:
- Reduce cart abandonment
- Identify critical drop-off points
- Optimize conversion funnels
- Personalize experiences by journey stage
- Align cross-functional teams
### 9. Predictive Analytics Implementation
Design ML-powered systems for anticipating and influencing user behavior:
Key Applications:
Churn Prediction:
- Identify at-risk users before they leave
- Calculate churn probability scores
- Trigger retention campaigns for high-risk users
- Optimize intervention timing and messaging
Lifetime Value (LTV) Prediction:
- Forecast long-term user value
- Identify most profitable segments
- Optimize acquisition spending by predicted LTV
- Personalize experiences for high-value users
Proactive Personalization:
- Recommend content based on behavioral patterns
- Suggest features likely to interest specific users
- Customize UI based on usage predictions
- Adapt experiences in real-time
Notification Optimization:
- Send notifications at optimal times per user
- Personalize message content based on preferences
- Predict notification fatigue and adjust frequency
- Maximize engagement while minimizing opt-outs
Implementation Considerations:
- Ensure clean, comprehensive data quality
- Choose appropriate algorithms (regression, classification, clustering)
- Create meaningful predictive features through feature engineering
- Validate models on holdout data
- Monitor model performance continuously
- Retrain regularly with new data
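A minimal churn-classification sketch with scikit-learn that follows the considerations above (holdout validation, an interpretable baseline model); the feature table, column names, and risk threshold are hypothetical:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

data = pd.read_csv("features.csv")  # hypothetical: engineered per-user features + churned label
feature_cols = ["sessions_last_30d", "days_since_last_login", "features_used", "support_tickets"]
X, y = data[feature_cols], data["churned"]

# Holdout split so the model is validated on unseen users.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42, stratify=y)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]  # churn probability per user
print("Holdout AUC:", round(roc_auc_score(y_test, scores), 3))

# Users above a risk threshold could be routed to a retention campaign.
at_risk = X_test.assign(churn_risk=scores).query("churn_risk > 0.7")
print("High-risk users in holdout:", len(at_risk))
```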
Expected Impact:
- 20% increase in customer retention with predictive analytics
- 30-50% improvement in retention rates overall
- 25% increase in conversion rates
### 10. Analytics Tool Selection
Recommend appropriate tools based on requirements, budget, and technical capabilities:
Product Analytics Platforms:
Mixpanel:
- Strengths: User journey tracking, funnel analysis, retention reports
- Best for: Product teams needing deep behavioral insights
- Pricing: Freemium with usage-based pricing
Amplitude:
- Strengths: Behavioral analytics, cohort analysis, predictive features
- Best for: Data-driven product teams with complex analysis needs
- Pricing: Free tier available, scales with volume
Firebase (Google):
- Strengths: Free, native Google integration, mobile-first
- Best for: Startups and Google ecosystem users
- Pricing: Free with generous limits
A/B Testing Tools:
Firebase A/B Testing:
- Strengths: Integrated with Google Analytics, easy setup
- Best for: Firebase users, mobile apps
- Pricing: Free
Optimizely:
- Strengths: Full-stack experimentation, enterprise features
- Best for: Large organizations with complex testing needs
- Pricing: Enterprise (custom)
VWO:
- Strengths: All-in-one testing and optimization
- Best for: Teams wanting unified platform
- Pricing: Multiple tiers
Business Intelligence Tools:
Tableau:
- Strengths: Powerful visualization, drag-and-drop interface
- Best for: Creating interactive dashboards and reports
- Pricing: Per-user licensing
Power BI:
- Strengths: Microsoft integration, robust data modeling
- Best for: Organizations in Microsoft ecosystem
- Pricing: Affordable per-user pricing
Looker:
- Strengths: Google Cloud integration, data exploration
- Best for: Teams on Google Cloud Platform
- Pricing: Enterprise (custom)
Tool Selection Framework:
1. Define requirements (events, users, features needed)
2. Consider technical constraints (SDKs, integrations, infrastructure)
3. Evaluate team skills and learning curve
4. Calculate total cost of ownership
5. Test with proof of concept
6. Plan for scalability
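One way to make the framework concrete is a simple weighted scoring matrix; the criteria weights, candidate names, and scores below are placeholders to adapt to your own requirements:

```python
# Weighted decision matrix for tool selection (all numbers are illustrative).
weights = {"feature_fit": 0.3, "integration_effort": 0.2, "team_skills": 0.2, "total_cost": 0.2, "scalability": 0.1}

scores = {  # 1 (poor) to 5 (excellent) per criterion, per candidate tool
    "Tool A": {"feature_fit": 5, "integration_effort": 3, "team_skills": 4, "total_cost": 2, "scalability": 5},
    "Tool B": {"feature_fit": 4, "integration_effort": 4, "team_skills": 3, "total_cost": 4, "scalability": 3},
    "Tool C": {"feature_fit": 3, "integration_effort": 5, "team_skills": 5, "total_cost": 5, "scalability": 2},
}

ranked = sorted(
    ((tool, sum(weights[c] * s for c, s in crit.items())) for tool, crit in scores.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for tool, total in ranked:
    print(f"{tool}: {total:.2f}")
```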
### 11. Retention Strategy Development
Design comprehensive retention programs using proven techniques:
Proven Strategies:
Contextual Onboarding:
- Reduce path to first value
- Show only relevant features initially
- Provide interactive, progressive tutorials
- Include clear success indicators
Behavioral Personalization:
- Adapt experience based on user actions
- Customize content recommendations
- Tailor feature suggestions
- Implement dynamic UI based on preferences
Strategic Push Notifications:
- Re-engage at optimal moments
- Send relevant, personalized messages
- Respect user preferences and frequency
- Test timing and content continuously
Micro-Retention Checkpoints:
- Day 1: First impression and initial value delivery
- Day 3: Habit formation beginning
- Day 7: First-week milestone and pattern establishment
- Day 30: Long-term user transition
Habit Loops and Streaks:
- Encourage daily usage with progress markers
- Reward consistency with achievements
- Visualize progress over time
- Leverage loss aversion: users won't want to break their streak
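A minimal current-streak computation per user, assuming a hypothetical table of daily activity with `user_id` and `date` columns:

```python
import pandas as pd

activity = pd.read_csv("daily_activity.csv", parse_dates=["date"])  # hypothetical export
today = activity["date"].max()  # treat the latest date in the data as "today"

def current_streak(dates):
    """Consecutive days of activity ending today (0 if the user was not active today)."""
    days = set(dates.dt.normalize())
    streak, day = 0, today.normalize()
    while day in days:
        streak += 1
        day -= pd.Timedelta(days=1)
    return streak

streaks = activity.groupby("user_id")["date"].apply(current_streak)
print(streaks.sort_values(ascending=False).head(10))  # longest active streaks
```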
Gamification:
- Leaderboards for competitive users
- Badges and achievements for milestones
- Points systems for engagement
- Challenges and time-limited events
### 12. Data Governance and Privacy Compliance
Ensure analytics practices comply with regulations while maintaining data utility:
GDPR Principles:
1. Specific and Informed Consent: Users must understand data usage clearly
2. Data Minimization: Collect only strictly necessary data
3. Right to Erasure: Allow users to request data deletion
4. Privacy by Design: Integrate privacy from the start
Platform Requirements:
- Opt-in/opt-out options for users
- Automatic masking of sensitive data
- Encryption in transit and at rest
- Complete audit trails
- Data anonymization capabilities
- Compliance with GDPR, CCPA, and other applicable regulations
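A minimal data-minimization sketch: pseudonymize the user identifier with a salted hash and drop fields the analytics event does not need. Field names and salt handling are illustrative, and salted hashing is pseudonymization rather than full anonymization, so it does not by itself guarantee compliance:

```python
import hashlib
import os

# In practice, store the salt in a secrets manager; the fallback here is for local experimentation only.
SALT = os.environ.get("ANALYTICS_SALT", "dev-only-salt")

ALLOWED_FIELDS = {"event", "timestamp", "app_version", "platform"}  # data-minimization allow-list

def pseudonymize_user_id(user_id: str) -> str:
    """One-way salted hash so analytics can count users without storing raw identifiers."""
    return hashlib.sha256((SALT + user_id).encode("utf-8")).hexdigest()

def sanitize_event(raw_event: dict) -> dict:
    event = {k: v for k, v in raw_event.items() if k in ALLOWED_FIELDS}
    event["user_key"] = pseudonymize_user_id(raw_event["user_id"])
    return event

print(sanitize_event({
    "user_id": "user-123",
    "email": "person@example.com",   # dropped: not needed for analytics
    "event": "lesson_completed",
    "timestamp": "2025-01-15T10:30:00Z",
    "app_version": "2.4.1",
    "platform": "ios",
}))
```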
Implementation Checklist:
- [ ] Document what data is collected and why
- [ ] Implement clear consent mechanisms
- [ ] Provide user data access and deletion capabilities
- [ ] Encrypt sensitive data
- [ ] Create privacy policy and terms
- [ ] Train team on privacy best practices
- [ ] Conduct regular privacy audits
- [ ] Establish data retention policies
## Workflow
When assisting users with data analytics strategy:
1. Understand Context:
   - What type of application (mobile, web, both)?
   - Current stage (idea, MVP, growth, scale)?
   - Existing analytics setup (if any)?
   - Team size and technical capabilities?
   - Specific goals or challenges?
2. Assess Current State:
   - What data is currently being collected?
   - Which tools are in use?
   - How are decisions being made today?
   - What metrics are tracked?
   - What's working and what's not?
3. Define Objectives:
   - What business outcomes are most important?
   - What questions need answering?
   - Which user behaviors matter most?
   - What decisions will analytics inform?
4. Recommend Strategy:
   - Select appropriate analytics frameworks
   - Identify North Star Metric
   - Define key metrics to track
   - Recommend segmentation approach
   - Suggest tools and platforms
   - Design implementation roadmap
5. Provide Implementation Guidance:
   - Event tracking plan (see the example after this list)
   - Tool setup instructions
   - Dashboard designs
   - Testing frameworks
   - Team workflows
   - Success criteria
6. Enable Iteration:
   - How to analyze results
   - When to pivot vs persevere
   - Continuous optimization approaches
   - Scaling analytics capabilities
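As an illustration of the event tracking plan mentioned in step 5, a lightweight plan can live in a shared document or even a small data structure; every event and property name here is hypothetical:

```python
# A tracking plan is a shared, documented contract: event name, when it fires, and its properties.
TRACKING_PLAN = [
    {
        "event": "signup_completed",
        "trigger": "User finishes account creation",
        "properties": {"method": "email | google | apple", "referrer": "string"},
        "owner": "growth",
    },
    {
        "event": "onboarding_step_completed",
        "trigger": "User completes a step of the onboarding flow",
        "properties": {"step": "int", "duration_seconds": "float"},
        "owner": "product",
    },
    {
        "event": "subscription_started",
        "trigger": "Payment confirmed for any paid plan",
        "properties": {"plan": "string", "price_usd": "float", "trial": "bool"},
        "owner": "monetization",
    },
]

for row in TRACKING_PLAN:
    print(f'{row["event"]:28s} -> {row["trigger"]} (owner: {row["owner"]})')
```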
## Resources
### references/analytics-guide.md
Comprehensive reference document containing:
- Detailed analytics framework explanations
- In-depth methodology guides
- Industry benchmarks and statistics
- Tool comparisons and recommendations
- Implementation best practices
- Real-world examples and case studies
When to consult: Reference this document when designing analytics strategies, selecting tools, implementing tracking, or optimizing data-driven growth initiatives. It provides the detailed knowledge and examples needed for comprehensive analytics planning.
## Key Success Factors
Emphasize these principles in all analytics strategy work:
- Start with Clear Objectives: Define success before collecting data
- Focus on Actionable Metrics: Track what can be influenced through decisions
- Iterate Based on Data: Continuously test, learn, and improve
- Align Teams Around Metrics: Ensure shared understanding and goals
- Balance Privacy and Insights: Respect users while gathering valuable data
- Invest in Data Quality: Clean data is the foundation
- Democratize Data Access: Enable teams to access and understand data
- Tell Stories with Data: Translate numbers into compelling narratives
## Common Pitfalls to Avoid
Watch for and warn against these common mistakes:
- Tracking too many metrics without focus
- Choosing vanity metrics over actionable ones
- Implementing tools without clear strategy
- Analyzing data without taking action
- Ignoring statistical significance in testing
- Collecting data without user consent
- Building complex systems before validating basics
- Forgetting to document assumptions and methodology
# Supported AI Coding Agents
This skill is compatible with the SKILL.md standard and works with all major AI coding agents.
Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.