Is analytics a necessary evil rather than a real value driver?

I’ve been working in analytics and advanced analytics for about 6 years. I started in a large consultancy and later went solo, so I’ve seen both enterprise and smaller product teams up close.

Something keeps bothering me. In most projects, analytics feels like infrastructure that no one is genuinely excited about. People rarely want to invest in it when building a product. It’s treated as something you should have, not something you want to have.

Teams are “happy” to pay for software development, advertising, copywriting, design. Those are seen as directly useful. Analytics (GA4, event tracking, or even more structured setups like CDPs) is often perceived as background noise, necessary to keep the engine running, but not something that meaningfully moves the product forward day to day.

In practice, many teams end up using only a handful of metrics to make decisions, even when a complex analytics stack exists underneath. The rest is there “just in case.”

I’m curious whether others see the same pattern. Is analytics undervalued because its ROI is indirect and delayed? Or is most analytics work simply over-engineered for the actual decisions teams make? At what point does analytics shift from “necessary plumbing” to a real competitive advantage?

Would love to hear perspectives from founders, engineers, and product folks who’ve built and scaled things.

6 points | by tiazm 17 hours ago

2 comments

  • PaulHoule 17 hours ago
    So I did a lot of business development in the 2010s in a space that involved: semantic web, low/no code, schema-driven development, business rules, entity matching, "centaur" systems where people work together with ML systems to do work, etc.

    There was the obvious choice of "analytics oriented" or "LoB oriented" with the complication that "centaur" and anything subjective like "entity matching" needs some analytics no matter what.

    My take now is that you're basically right: overall the spend on LoB is bigger because it's right on the path to delivering value, whereas analytics is secondary... if you're going to get any value out of analytics, you're still going to have to execute in the LoB to realize that value!

    On the other hand, analytics might be an easier sell because the analytics system can be dropped on top of what's there, and the "low code" capabilities could efficiently accelerate the process. Whereas "rip and replace" on the LoB would be a huge commitment: anything missing from the new system is a dealbreaker, and if it has to interface with the old system, the old system is likely to diffuse the benefits of low code (with the caveat that maybe a framework that implements "strangler fig" might break the impasse).

    One thing that was seductive at the time was being saturated with ads and conferences and sponsored blog posts and such about analytics, but you have to realize this: if something is heavily advertised, people want to sell it, not buy it. That is, advertising is a bad smell.

    • tiazm 15 hours ago
      I like the analytics vs LoB oriented framing. In my experience, analytics is necessary to evaluate quality and guide change, but value only materializes once decisions are executed in the LoB systems. That mismatch makes analytics easy to layer on, but hard to own.

      What I keep wondering is whether there will ever be a point where analytics is perceived as first order value from day one, rather than as something you add later once the system already exists.

      • PaulHoule 14 hours ago
        Well, in retrospect, I think that had we realized the system we were thinking about, it might have been what you describe: a "low/no code" system with analytics baked into it at the core, with the intention of always running with closed feedback loops. I think that matters more than ever in the age of AI.

        Send me an email and I can share some slide decks with you from that time period.

  • lbhdc 17 hours ago
    That is interesting. I have worked in the space too, and have seen similar attitudes.

    I think it often comes down to who is responsible for making decisions with that data. If a product or business person is the one driving a feature, and looking for adoption, the engineers likely aren't going to be invested in building out sophisticated metrics. They get the metrics they are responsible for from their cloud provider (resource use/latency/scale).

    I think that problem is compounded by the perception that these integrations are going to tank your product's perf (and may hurt the metrics engineers care about).

    I think all of those dynamics change in really big companies with thousands of engineers. Then you can often end up in a situation where engineers are now required to maximize product metrics, and need visibility into their small slice of the pie.

    So, I think it's largely incentives, which is why I see all of the metrics vendors targeting product and sales people in small/mid-sized companies.

    • tiazm 15 hours ago
      That’s an interesting point. One thing I’ve noticed, though, is that even the people who are directly exposed to incentives (usually product and marketing) tend to focus almost exclusively on the final KPIs they’re measured on, like revenue or conversion rate.

      Because of that, the analytics layer is often seen as something secondary. As long as the top line numbers are moving, there’s little perceived urgency to invest in a structured analytics foundation that explains why those numbers move.

      So even when incentives exist, they’re often too outcome focused. Analytics that helps understand mechanisms, not just results, struggles to justify itself until something breaks or growth stalls.

      • lbhdc 11 hours ago
        I think that is because they are being judged by their outcomes.

        In the space I was in (ads), users were highly mistrustful of the data. They felt everything was kind of fuzzy (e.g. how well are you really measuring unique users and their actions?).

        They would end up using multiple vendors (and we would have to spend a lot of time comparing and contrasting results). They really, really want "apples to apples" comparisons.

        At the end of the day, they were trying to answer: does what I am spending my money on give me the results the business needs? To your point, there is a lot of nuanced data, but their bosses definitely only cared about the top line: did it move the needle?