I work in analytics, but my focus is not dashboards, models, or tools. My work centers on decision clarity:
- Define the decision before analysis begins
- State constraints and trade-offs explicitly
- Remove metrics that do not change the outcome
- Design analysis to force a choice, not invite debate
I’m especially interested in how analytics fails in real organizations: not because of bad data, but because of unclear questions, misaligned incentives, and over-analysis where judgment is required.
Decision-Centric Analysis (When Dashboards Fail): https://github.com/ifiok-ebong/decision-centric-analytics
The Cost of the Wrong Question (Framing error case study): https://github.com/ifiok-ebong/the-cost-of-the-wrong-question
A few working beliefs:
- Analytics exists to reduce uncertainty around a decision, not to explain everything
- More metrics often increase confusion, not insight
- If a result cannot be explained verbally in two minutes, it is not decision-ready
- The hardest part of analytics is not computation, it is framing
Each project is designed to demonstrate judgment, not technical breadth.
Decision-Centric Analysis: When Dashboards Fail
Core question: Why do well-built dashboards still fail to drive action?
The Cost of the Wrong Question: A Framing Error Case Study
Core question: How do good analysts produce bad outcomes by answering the wrong question?
Signal vs Noise: Identifying Metrics That Matter (planned)
Core question: Which metrics actually change decisions, and which ones decorate them?
What you will not find here:
- Exhaustive dashboards
- KPI catalogs
- Predictive models for their own sake
- Tool demonstrations without decision context
Focus areas:
- Strategic insights and BI decision support
- SaaS retention and revenue analytics
- Strategy and operations analytics
If you want analytics that prioritizes judgment over output, you will likely find these projects relevant.