Vibe Analytics: Vibe Coding’s New Cousin Unlocks Insights
The Excel spreadsheet era asked, “What happened?” The dashboard era asked, “Why did it happen?” The vibe era asks, “What insights emerge if we explore together?” Consider the gains that real-world companies are already making.

Editor’s note: The author used GenAI tools as resources in the process of writing this article.
In February, former Tesla AI director and OpenAI cofounder Andrej Karpathy posted a casual humblebrag on X about his clever weekend coding approach: “There’s a new kind of coding I call ‘vibe coding,’ where you fully give in to the vibes, embrace exponentials, and let LLMs write the code for you.” Karpathy’s hack has, literally and figuratively, prompted a profound and provocative disruption in software design and development. “It’s not really coding,” he observed. “I just see stuff, say stuff, run stuff, and copy-paste stuff, and it mostly works.” It now seems almost certain that Karpathy’s AI-infused “stuff-ification” will have a significant impact on other forms of knowledge work, especially data-driven analysis.
What vibe coding does for (and to) programming, vibe analytics does for (and with) data. Vibing with structured or unstructured data, whether comma-separated values (CSV) files, SQL databases, Excel spreadsheets, or transcribed Zoom calls, can generate empirical insights that neither legacy analytics nor data science is likely to produce. Much like vibe coding transforms programming from syntax to intention, vibe analytics transforms data queries into conversations and dashboards into jam sessions. In these contexts, vibe describes an alignment between human intent and machine interpretation that emerges through improvisational interaction.
The improvisation evokes and enables actionable insight; insight comes from the data effectively and affectively arguing back. In the spirit of statistics pioneers (notably John Tukey and Frederick Mosteller, whose exploratory data analysis approach encouraged data to speak before formal models imposed meaning), vibe analytics pursues value via dynamic interaction in which data engages, responds, and sometimes resists. It’s not just pattern recognition; it’s pattern negotiation.
So where the Excel spreadsheet era asked, “What happened?” and the dashboard era asked, “Why did it happen?” the vibe era — beginning with code but inevitably expanding to data — asks, “What emerges if we explore together?” For executives, this means faster time-to-insight and the democratization of analytical powers: The need to wait days or weeks for data teams to translate business questions into technical queries vanishes. When leaders engage directly with messy/structured/unstructured data through improvised multimodal exploration to surface unexpected patterns and accelerate decision-making, they are innovating out loud.
As I witnessed firsthand, the value of vibe analytics materializes quickly when organizations seeking quick-win/high-impact insights run “promptathons” and other proof-of-concept initiatives using large language models and generative AI tools. A promptathon is like a hackathon, but participants use prompts instead of code to competitively and collaboratively innovate.
Earlier this year, I worked with business leaders in vibe analytics sessions in classrooms and boardrooms held during MIT Sloan Executive Education classes and the MIT Industrial Liaison Program (ILP). In the former, business leaders explored how generative and predictive AI capabilities might transform their KPIs from static, retrospective metrics to intelligent software agents that could “vibe-analyze” data sets from spreadsheets and SQL databases. These leaders wanted to make their KPIs smarter. In the ILP, companies sent innovation teams to workshop vibe-analytics techniques as part of their efforts to drive more value using technology. These teams, whose members ranged from startup execs to Fortune 100 financial planning and analysis office professionals, explored and embraced vibe approaches that radically altered their internal insight economics.
Vibe Analytics in Action: Three Companies’ Experiences
How does vibe analytics play out, and what gets produced? Let’s explore three examples. First, one of Southeast Asia’s largest telecommunications companies vibe-analyzed its way to a fundamental rethinking of its multibillion-dollar B2B product portfolio. Account executives, marketing leads, financial analysts, and product managers together prompted generative AI tools with company budgets, analysts’ reports, product road maps, and service-level agreement (SLA) commitments to identify mispriced offers, reveal previously unidentified opportunities to bundle and/or upsell service packages, and negotiate better SLAs.
In less than 90 minutes of LLM-mediated interactions with anonymized and nonproprietary data, the group preliminarily identified and segmented the company’s undervalued/underperforming and costly SLAs, realized that the team needed to formally and more frequently analyze customer log data, and found that it needed to tune LLMs to help account teams recommend custom product bundles to retain and upsell business clients. The most important vibe-analytics-generated insight? A novel SLA scoring and ranking system that shows the entire company what kinds of contracts and commitments correlate with higher margins, higher risks, and unexpectedly higher support costs. The session’s executive owner — a technologist who oversees the company’s product portfolio — observed that more financially relevant data-driven insights were surfaced in that mediated 90-minute session than the organization typically generates in 90 days.
Second, a cybersecurity software-as-a-service unicorn company offered its Excel spreadsheet of 30,000-plus “freemium” customers as a sample data set to frame and focus a revenue-minded analytics promptathon. That spreadsheet was dropped directly into an LLM context window with no cleanup and only minor metadata tweaks. After a few brief but pointed stress-test prompts, the responses were forwarded to a revenue operations executive for review. His reply: As a first iteration, the initial insights were OK, but what he really wanted were insights that would address particular pain points and problems that the freemium customers created for his team. I essentially turned that email directly into a follow-up prompt and forwarded the resulting vibe analysis back. His reaction: “These are really interesting; we hadn’t thought about [it] that way.” To be clear, his incisive initial response led to actionable insights that he and his team hadn’t yet considered.
In the third case, a team at a financial services company quickly transformed hundreds of customer support contact center transcripts into an interrogatory/improvisational playground. Small teams of support people, technologists, marketers, HR employees, and compliance representatives — but no data scientists or employees from business intelligence (BI) — vibed with the data using an LLM-based template. One team decided to identify circumstances where members of the support staff were less happy than the customers; another sought to identify which transcripts were likely to be the best customer churn predictors; and a third group sought to revise and rewrite the scripts agents used to alternately mollify and upsell customers.
The most important collective takeaway: Legacy customer segmentation analytics had failed to identify — let alone capture — the variety of personas surfaced by these improvised and iterated-upon sentiment analytics. The team commented that their company simply didn’t know its customers well enough; the company’s legacy net promoter score, derived from customer satisfaction scoring systems and surveys, didn’t vibe. Iterative improvisation enabled innovative insights.
In all three examples, the key takeaway of the group work was not a set of best-practice templates or the recognition that legacy analytics practices often need improvisational boosts; it was that attendees emphatically wanted new ways to engage with data and with each other. Collaborating with colleagues mattered as much as collaborating with data. That was a surprise to their managers up the chain: As much as people wanted insights and answers, they also looked to vibe analytics for alignment and coordination.
What Is Vibe Analytics?
Legacy BI and business analytics have been constrained by technical silos and divides, and the work has historically happened in this order:
• Define the question.
• Structure the query.
• Execute models.
• Visualize the results.
• Interpret the dashboard.
This classic pipeline segregates the question-askers from the people with the statistical/technical/data science skills and training to answer them. Even when the two sides meet regularly, translation problems occur and accrue when people lack a common analytics vocabulary. All too often, business leaders are made to wait while the data scientists translate perceived intent into SQL, R, or Python — a translation layer that invariably blurs details and limits just-in-time exploration and interactions.
Vibe analytics, in stark contrast, collapses this chain as ruthlessly as vibe coding collapsed the software development chain of events:
• Express the intent.
• Observe the results.
• Refine the prompt.
• Discover patterns.
• Evolve your understanding.
When executives can ask, “What’s happening with our conversion rates?” and immediately explore potential causes through improvisational dialogue, vibe analytics fundamentally alters not only how answers are obtained but also how knowledge itself is generated. This is virtually identical in spirit and in kind to how vibe coding transforms software creation.
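To make that loop concrete, here is a minimal sketch of a single vibe analytics exchange in code. It assumes the OpenAI Python SDK and a hypothetical conversions.csv export; the file name, model choice, and prompts are illustrative, and any chat-capable LLM client could stand in.

```python
# Minimal sketch of the express -> observe -> refine loop.
# Assumes the OpenAI Python SDK and a hypothetical conversions.csv export;
# the prompts, file name, and model choice are illustrative.
import csv
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("conversions.csv", newline="") as f:
    sample_rows = list(csv.DictReader(f))[:200]  # small sample keeps the context manageable

# The running message history is what lets each refinement build on the last answer.
messages = [
    {"role": "system", "content": "You are an exploratory data analyst. Be concrete and cite columns."},
    {"role": "user", "content": f"Here is a sample of our conversion data:\n{sample_rows}"},
]

prompts = [
    "What's happening with our conversion rates?",                       # express the intent
    "Which customer segments drive the biggest week-over-week swings?",  # refine the prompt
    "Which of these patterns might just be a data artifact?",            # test what was discovered
]

for prompt in prompts:
    messages.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = reply.choices[0].message.content
    print(answer)                                               # observe the results
    messages.append({"role": "assistant", "content": answer})   # evolve the shared context
```

The point of the sketch is the shape of the interaction rather than the tooling: intent goes in as plain language, the response is inspected, and the next prompt is written in reaction to what came back.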
But describing vibe analytics as a radical enhancement of legacy BI practice represents a category error on par with calling vibe coding an upgrade to traditional development. Both represent a fundamental, disruptive shift — from “knowledge as artifact” to “knowledge as basis for improvisation.”
Five Example Applications of Vibe Analytics
Vibe analytics as an organizing and operating principle creates a wealth of pragmatic opportunities for companies to revisit and rethink how their data sets can be mashed up, modified, and blended to invite innovation. Consider these vibe analytics approaches as improvisational instruments for actionable insights.
Turn KPIs Into Conversational Partners
• Transform static metrics into dynamic dialogues that reveal hidden assumptions underlying the numbers.
• Ask about your churn rate: “What customer behaviors am I missing that predict departure six months out?”
• Challenge your customer acquisition cost: “Which acquisition channels create customers who stay the longest and spend the most?”
• Treat metrics as “thinking agents” that can help explain their own fluctuations and suggest improvements and interventions (see the sketch after this list).
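A minimal sketch of that “thinking agent” pattern might look like the following; the churn figures, helper name, and model choice are hypothetical, and it assumes the OpenAI Python SDK.

```python
# Sketch: wrap a static KPI in enough context for an LLM to "talk back."
# The churn figures, helper name, and model choice are hypothetical.
from openai import OpenAI

client = OpenAI()

def converse_with_kpi(name: str, monthly_values: dict[str, float], question: str) -> str:
    prompt = (
        f"KPI: {name}\n"
        f"Monthly values: {monthly_values}\n"
        "Act as this metric. Explain your own fluctuations, name the assumptions "
        f"baked into how you are measured, and answer: {question}"
    )
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content

churn = {"Jan": 0.031, "Feb": 0.029, "Mar": 0.041, "Apr": 0.044}
print(converse_with_kpi(
    "Monthly churn rate", churn,
    "What customer behaviors am I missing that predict departure six months out?",
))
```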
Treat Transcripts as Living Customer Testimony
• Surface emotional volatility patterns: sudden shifts from frustration to satisfaction that signal relationship turning points.
• Create sentiment heat maps across customer segments to identify which groups and personas experience the steepest emotional drops (see the sketch after this list).
• Discover “negative space” topics: what customers systematically avoid mentioning but that you should address (such as competitor comparisons or pricing concerns).
• Evaluate not just explicit complaints but emotional undertones, using conversational prompts to elicit what customers might really mean beneath their stated concerns.
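As a sketch of the sentiment heat map idea, the snippet below scores each transcript with an LLM and pivots the labels by segment. It assumes a hypothetical transcripts.csv with segment and transcript columns, plus pandas and the OpenAI Python SDK; a production version would batch the calls and validate the labels.

```python
# Sketch: score transcript sentiment per customer segment and pivot into a heat map.
# Assumes a hypothetical transcripts.csv with "segment" and "transcript" columns.
import pandas as pd
from openai import OpenAI

client = OpenAI()

def score_sentiment(transcript: str) -> str:
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": "Label the customer's overall sentiment in this support transcript "
                       "as one of: negative, neutral, positive. Reply with the label only.\n\n"
                       + transcript,
        }],
    )
    return reply.choices[0].message.content.strip().lower()

df = pd.read_csv("transcripts.csv")               # columns: segment, transcript
df["sentiment"] = df["transcript"].map(score_sentiment)

# Rows are segments, columns are sentiment labels, values are shares of transcripts.
heat_map = pd.crosstab(df["segment"], df["sentiment"], normalize="index")
print(heat_map.round(2))
```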
Synthesize Disconnected Data Stories
• Ask messy multidepartmental CSV files, “What’s our most underdiscussed leading indicator of growth or decline?”
• Let LLMs blend disparate data threads from sales, support, marketing, and operations into coherent narratives that reveal cross-functional patterns.
• Identify pattern resonance, meaning how insights from one data set amplify or contradict findings from another, to avoid overemphasizing isolated pattern recognition within single data sources.
Debug Business Assumptions in Real Time
• Prompt: “What assumptions are we making by framing this issue this way? What alternate framings might evoke different insights?”
• Challenge: “What historical biases infect this metric, and how might they blind us to emerging realities and threats?”
• Iterate on both data insights and the mental models that generate them. Treat assumptions as testable hypotheses rather than reasonable givens.
Improvise With Data Personas
• Ask your forecast data, “If I were a skeptical board member reviewing this, what three questions should I ask first?”
• Role-play scenarios: “Speaking as our most price-sensitive customer segment, what would push you to switch to a competitor?” (A template sketch follows this list.)
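One lightweight way to make persona role-play repeatable is a prompt template keyed by persona. In the sketch below, the persona descriptions and helper name are invented placeholders, and the resulting string would be sent through whatever LLM client the team already uses.

```python
# Sketch: parameterize role-play prompts by data persona. Persona text is illustrative.
PERSONAS = {
    "skeptical_board_member": "a skeptical board member reviewing this forecast before approving budget",
    "price_sensitive_customer": "our most price-sensitive customer segment, already comparing competitors",
}

def persona_prompt(persona_key: str, data_summary: str, ask: str) -> str:
    return (
        f"Speak strictly as {PERSONAS[persona_key]}.\n"
        f"Here is the data you are reacting to:\n{data_summary}\n"
        f"{ask}"
    )

print(persona_prompt(
    "skeptical_board_member",
    "Q3 forecast: +18% revenue on flat headcount.",
    "What three questions should I ask first?",
))
```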
Data Quality: The Required Foundation
A big caution: Organizations expecting to reliably improvise insights must bluntly revisit and reassess their data integrity. There is no GIVO (garbage in, vibes out) loophole. Questionable data quality doesn’t just limit the effectiveness of traditional analytics; it can steer improvisational explorations toward conclusions that sound plausible but mislead. When LLMs synthesize patterns from inconsistent, incomplete, or biased data sets, they can amplify errors in ways that feel convincingly coherent.
Ironically, this LLM-fabricated coherence can prove to be helpful in identifying the root causes of flawed data sets. At one promptathon, for example, a team kept generating customer insights from data sets that everyone agreed were “crazy” and made no sense. Yet here was the LLM insisting that the insights were accurate and data-driven. This prompted the team to investigate the data sets they used for their vibed analyses. Unsurprisingly, they found that one of the customer data sets was rubbish but still relevant to a particular use case. This discovery heightened the company’s consciousness about quality data and metadata.
That means that companies with data quality issues should see vibe analytics as more aspirational provocateur than precision interrogator. But those provocations will likely strengthen the company’s formal data analytics processes. Mature data environments — where governance, lineage, and quality controls have been established — offer ideal conditions for vibe analytics to flourish. Improvisational approaches can actually help identify data quality issues that traditional queries might miss, turning exploration into a form of data discovery and validation. And if your data environment is not mature, aspiring vibe analysts need to narrow their focus, throttle back their ambitions, and be transparent about how reliable (or unreliable) their proffered insights may be.
Leaders can also consider the middle-ground approach, which involves selective and/or targeted applications: using vibe analytics only on data sets with moderate to high levels of confidence in quality while simultaneously working to improve data quality across the broader organization. This approach allows teams to build analytical intuition and capability while addressing foundational data challenges — a win-win.
Now, the second caution: Vibe analytics must also operate within demonstrably robust security and governance frameworks that protect your company’s sensitive data while enabling exploration. Not all data sets are suitable for improvisational exploration. Personally identifiable customer information, financial details, and proprietary information require strict access controls and likely shouldn’t be used in vibe analytics sessions without anonymization, aggregation, or synthetic data generation. Similarly, LLM interactions with enterprise data should proceed in secure, controlled environments. This could mean private cloud deployments, on-premises solutions, or carefully vetted third-party platforms with appropriate data-processing agreements and security certifications.
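As a minimal illustration of the pre-session pseudonymization this implies, consider the sketch below. It assumes pandas and a hypothetical freemium_customers.csv with name, email, and account_id columns; real deployments should run through the organization’s approved privacy tooling and review.

```python
# Sketch: pseudonymize obvious PII columns before a vibe analytics session.
# The file and column names are hypothetical; salted hashing keeps records
# linkable across data sets without exposing identities, but it is not a
# substitute for a formal anonymization or privacy review.
import hashlib
import pandas as pd

def pseudonymize(df: pd.DataFrame, pii_columns: list[str], salt: str) -> pd.DataFrame:
    safe = df.copy()
    for col in pii_columns:
        safe[col] = safe[col].astype(str).map(
            lambda value: hashlib.sha256((salt + value).encode()).hexdigest()[:12]
        )
    return safe

customers = pd.read_csv("freemium_customers.csv")
session_ready = pseudonymize(customers, ["name", "email", "account_id"], salt="rotate-this-salt")
session_ready.to_csv("session_ready.csv", index=False)
```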
While vibe analytics embraces improvisation, organizations need mechanisms to track what questions were asked, what data was accessed, and what insights were generated. This creates accountability without stifling exploration by your innovators. The goal should never be to bureaucratize improvisation but to ensure that there are safe spaces where teams can explore freely while respecting the organization’s security responsibilities and compliance requirements.
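One lightweight way to get that traceability without slowing anyone down is to route every prompt through a small logging wrapper, as in this sketch; the names and file paths are hypothetical, and it assumes the OpenAI Python SDK.

```python
# Sketch: record what was asked, which data sets were touched, and what came back,
# so improvisational sessions leave an audit trail. Names and paths are hypothetical.
import json
import time
from openai import OpenAI

client = OpenAI()
AUDIT_LOG = "vibe_session_audit.jsonl"

def audited_prompt(prompt: str, data_sets: list[str], session_id: str, model: str = "gpt-4o") -> str:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    insight = reply.choices[0].message.content
    with open(AUDIT_LOG, "a") as log:
        log.write(json.dumps({
            "timestamp": time.time(),
            "session": session_id,
            "data_sets": data_sets,   # what data was accessed
            "prompt": prompt,         # what question was asked
            "insight": insight,       # what insight was generated
        }) + "\n")
    return insight
```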
Redefining Analytical Rigor, Not Abandoning It
Culturally, the vibe analytics approach can be controversial, and leaders should expect pushback from some traditional data experts. One predictable line of complaint is that vibes are rigor-free and the purported analytics are dangerously informal. These criticisms are largely unfounded; the term vibe misleads. Rigor isn’t abandoned in vibe analytics; it’s reframed and redefined. In my observation, these misperceptions stem from how vibe analytics presents insights: in clean forms that mask the sophisticated computational synthesis behind them. Skeptics may mistake clarity of output for simplicity of process, which is like judging quantum equations by their elegant final form rather than the complexity they resolve.
The true power of vibe analytics emerges from iterative dialogue: structured improvisation that amplifies, rather than replaces, critical thinking. Indeed, vibe analytics interactions stimulate critical thinking and argument, and they encourage more intentional improvisation.
Vibe analytics invites rigor more through provocation than procedure. Instead of following predetermined analytical pathways, practitioners need to systematically question and challenge assumptions, explore alternative framings, and test insights against counterexamples and counterfactuals. This approach can help teams credibly surface blind spots, confounding variables, and tacit assumptions that legacy analytics approaches often miss precisely because those approaches stick to predefined hypotheses.
This newer model, with its freedom of responsive dialogue, requires appropriate frameworks to prevent analytical sloppiness. As one senior financial BI analyst at the Asian telecom company remarked after their session, “Many of these insights are very good … but we are still going to have to [formally] assess them.” Of course, that follow-up reality check is less a bug than a feature: It’s a natural workflow where improvisational discovery feeds into rigorous validation.
In fairness, parallels between vibe coding and vibe analytics extend to their limitations. Vibe coding faces legitimate concerns about cybersecurity vulnerabilities, maintenance challenges as code bases grow, and limitations in handling extreme complexity. Similarly, vibe analytics must address questions of reproducibility, bias amplification, and governance. That said, these challenges aren’t fatal flaws — they’re design parameters. Teams that improvise to create collaborative value can also find approaches to optimization within those constraints.
Resistance to vibe analytics in your organization may follow the same pattern as resistance to vibe coding, with critics often using identical language: “This isn’t real analysis” echoes “This isn’t real programming.” “This is irresponsible statistics” mirrors “This is irresponsible development.” These similarities aren’t coincidental: Established experts defend knowledge-as-artifact approaches against knowledge-as-emergence alternatives. They understandably (no pun intended!) see vibes as subverting the coherence and consistency of their disciplinary training — much like classically trained orchestral musicians initially viewed their jazz counterparts.
Thinking Bigger
Yes, improvisational exploration can get lost in digressions and irrelevancies; yes, data-driven insights decoupled from contextual relevance can mislead people as easily as they illuminate. But successful implementations can balance freedom with framework, improvisation with intention, and dialogue with direction. Vibe analytics creates spaces for talented observers, thinkers, and doers to test themselves and their hunches.
In the future, the vibe approach that started with coding and progressed to analytics has broader potential to transform organizations:
Vibe Coding → Vibe Analytics → Vibe Research → Vibe Strategy → Vibe Design
This expansion would follow a compellingly clear logic: Domains where human intent can productively guide machine pattern recognition through improvisational dialogue are prime candidates for transformation. Imagine not just more business domains embracing improvisational approaches, but the cumulative power of connecting these improvisational systems — where code improvisation feeds directly into analytical improvisation, which connects to strategic improvisation, creating new forms of organizational intelligence.