
The Board Presentation That Kills AI Projects (And What to Show Instead)


Your AI proposal gets board approval; the budget request for substantial investment passes with a unanimous vote.

Board members shake hands enthusiastically and congratulate you on the forward-thinking initiative.

Leadership expresses confidence that this will transform operations and position the organization competitively. The approval feels like success, like the hard part is over.

Six months later, the same board is questioning the investment with growing skepticism. They scrutinize every expense report and ask pointedly why promised results haven’t materialized.

They wonder aloud whether the whole AI initiative was a mistake. The project manager feels blindsided because the technology is working as designed. The vendor delivered what they promised and the implementation followed best practices, so why does the board consider this a failure?

The project didn’t fail during implementation. It failed during that board presentation six months ago, when expectations were set that could never be met, risks were hidden that would inevitably surface, and approval was secured through presentations that doomed the project before a single line of code was written.

Understanding what kills AI projects in boardrooms is the first step toward presenting proposals that lead to actual success rather than approved failures.

The Five Presentations That Kill AI Projects

Technology Demo
Credit: Freepik

The technology demo showcases AI capabilities without connecting them to business value. These presentations feature slides demonstrating what AI can do.

Natural language processing can analyze customer sentiment. Machine learning algorithms predict equipment failures before they happen.

Computer vision automates quality inspection with superhuman accuracy. The capabilities are impressive, the demonstrations are slick, and the board nods appreciatively at the technological sophistication.

However, this presentation kills projects because boards don’t fund technology for its own sake. They fund business outcomes. Showing what AI can do is academically interesting, but showing the specific business problem worth solving is what gets meaningful support.

When approval comes based on technological wonder, the board expects transformational results that match the impressive capabilities.

When you deliver working technology that doesn’t actually solve a critical business problem, the disconnect between expectation and reality destroys support. The board approved innovation; you delivered functionality. Those aren’t the same thing.

Unrealistic ROI Projection
Credit: Freepik

These presentations feature charts showing dramatic returns on investment, often in the range of multiple hundreds of percent within the first year.

The numbers come from vendor success stories where similar AI implementations paid for themselves in months.

Graphs illustrate massive cost reductions and efficiency gains that make the investment appear obviously worthwhile. The financial case seems bulletproof.

This presentation kills projects because vendor ROI calculations assume infrastructure realities that don’t exist in African business environments.

Vendor numbers presume stable electrical power without generator costs. They assume reliable internet without expensive backup connectivity. They price services in dollars without accounting for currency fluctuation against the naira, for instance.

The math that worked for companies operating in the United States or Europe doesn’t hold when you factor in the infrastructure tax that Nigerian operations pay.

The board approves based on vendor projections. Actual costs run two or three times higher than presented, and the ROI evaporates. Board members lose trust in your judgment and question all future estimates with heightened skepticism.
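The erosion described above is easy to see in arithmetic. The sketch below uses entirely hypothetical figures (the benefit and quoted cost are placeholders, not numbers from this article) to show how a vendor-deck ROI collapses when actual costs run two to three times the quote:

```python
# Illustrative sketch: why vendor ROI projections collapse when real costs
# run two to three times higher than presented. All figures are hypothetical.

def roi_percent(annual_benefit, total_cost):
    """First-year ROI: (benefit - cost) / cost, expressed as a percentage."""
    return (annual_benefit - total_cost) / total_cost * 100

annual_benefit = 300_000    # savings the vendor case study promises (placeholder)
quoted_cost = 100_000       # cost as presented to the board (placeholder)

for multiplier in (1.0, 2.0, 3.0):
    actual_cost = quoted_cost * multiplier
    print(f"Cost at {multiplier:.0f}x quote: ROI = {roi_percent(annual_benefit, actual_cost):.0f}%")
```

With these placeholder numbers, a 200% projected return falls to 50% at double the quoted cost and to zero at triple, which is exactly the trajectory a board experiences as "the ROI evaporated."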

Vague Timeline

 

The vague timeline promises AI implementation without specific milestones or decision points. These presentations include Gantt charts showing implementation phases spread across quarters.

They reference general stages like planning, development, testing, and deployment. They commit to completing the project within defined timeframes. What they don’t include are concrete milestones with measurable outcomes or clear criteria for evaluating success at each phase.

This presentation kills projects because boards cannot evaluate progress without objective milestones.

When you report that you’re “making good progress” six months into implementation, that statement is meaningless without shared definitions of what progress looks like.

Board members ask whether you’re on track, and you can’t answer definitively because “on track” was never defined with specificity.

The ambiguity gets interpreted as hiding problems. Trust erodes as board members suspect you’re concealing difficulties rather than acknowledging you simply never established clear progress markers.

Everything Everywhere Proposal
Credit: Freepik

The fourth killer presentation is the everything everywhere proposal that attempts to transform the entire organization simultaneously.

These presentations propose deploying AI across customer service, operations, finance, and human resources all at once.

They promise organization-wide rollout where every department benefits from AI capabilities. They commit to total transformation within ambitious timeframes, often a year or less.

This presentation kills projects because boards recognize unfocused ambition when they see it.

Attempting everything creates too many dependencies where progress in one area requires coordination with three others.

It creates too many potential points of failure where problems anywhere threaten progress everywhere.

It provides no clear priority for when resources become constrained, which they inevitably do. When you try to do everything, you deliver nothing particularly well.

Board members watch months pass with no completed wins, just reports of “progress” across multiple fronts that never seem to reach completion.

Support erodes as the lack of concrete achievements makes the whole initiative feel like it’s spinning wheels.

Vendor Brochure
Credit: Vistaprint

These presentations include slides that look suspiciously similar to vendor decks. They feature glowing case studies from other companies and list impressive features and capabilities.

They present the solution as obviously superior to alternatives. What they conspicuously lack is honest discussion of risks, acknowledgment of challenges, or realistic assessment of what could go wrong.

This presentation kills projects because board members wonder whether you’ve actually thought this through independently or simply bought vendor promises wholesale.

The absence of risk discussion signals either naivety about implementation complexity or dishonesty about known challenges.

Neither interpretation inspires confidence. When inevitable problems surface during implementation, as they always do, board members question why these weren’t discussed during the approval presentation.

Your credibility suffers permanent damage. Future proposals face heightened skepticism because you’ve established a pattern of presenting optimistic cases that don’t match reality.

The Five Elements of Project-Saving Presentations

The first element is quantifying the business problem with specific numbers and competitive context.

Credit: Freepik

Don’t present vague aspirations about improving customer service with AI. Instead, present concrete data about the problem’s cost.

Frame it something like this: the organization loses 47 customers monthly, representing substantial annual revenue loss, specifically because response times average 48 hours.

Customer surveys and market research show that 4-hour response times have become standard market expectations. Competitors meeting this standard are actively taking market share. The customer feedback is clear and consistent about this being a decision factor.

This approach works because the board immediately understands the business case in terms they care about.
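The quantification above is simple arithmetic a board can check. The sketch below reuses the 47-customers-per-month figure from the example; the revenue per customer is a hypothetical placeholder you would replace with your own data:

```python
# Sketch of the problem-quantification arithmetic. The churn figure mirrors
# the example above; revenue per customer is a hypothetical placeholder.

customers_lost_per_month = 47
avg_annual_revenue_per_customer = 1_500   # hypothetical, in your currency

customers_lost_per_year = customers_lost_per_month * 12
annual_revenue_at_risk = customers_lost_per_year * avg_annual_revenue_per_customer

print(f"Customers lost per year: {customers_lost_per_year}")
print(f"Annual revenue at risk:  {annual_revenue_at_risk:,}")
```

Presenting the loss this way lets board members audit every input, which is precisely what builds confidence in the business case.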

The second element is presenting realistic costs that explicitly account for Nigerian infrastructure realities.

Credit: Freepik

Break down the complete budget to show base implementation costs; infrastructure requirements like generators, UPS systems, and backup connectivity; currency fluctuation buffers for dollar-denominated services; anticipated vendor fees and support upgrades beyond initial quotes; and comprehensive change management and training programs. Present the total realistic budget rather than just the optimistic base number.

This approach works because the board approves the actual number you’ll need rather than an artificially low number that makes the project look more attractive than it is. When final costs align with what you presented, you’re on budget and credible.

If you present optimistic numbers and actual spending runs significantly higher, you’ve destroyed trust regardless of project success.
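The complete-budget breakdown described above can be laid out as a simple table that totals to the number you actually request. Every line-item amount below is a hypothetical placeholder illustrating the structure, not a real quote:

```python
# Sketch of a realistic budget breakdown. All amounts are hypothetical
# placeholders; the point is presenting the full total, not the base number.

budget = {
    "base implementation":                      100_000,
    "generators, UPS, backup connectivity":      45_000,
    "currency-fluctuation buffer":               25_000,
    "vendor fees/support beyond initial quote":  20_000,
    "change management and training":            30_000,
}

total = sum(budget.values())
for item, amount in budget.items():
    print(f"{item:<45} {amount:>10,}")
print(f"{'TOTAL realistic budget':<45} {total:>10,}")
```

Here the realistic total is more than double the base implementation line, which is why presenting only the base number sets the project up to look "over budget" later.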

The third element is establishing 90-day milestones with explicit kill criteria that demonstrate you’ll stop if things aren’t working.

Credit: Salesgravy

Structure the first month around deploying a pilot in a specific department with a clear success metric like reducing response time from 48 hours to under six hours for the pilot group.

Include kill criteria stating that if response times don’t improve by at least 50 percent, you’ll pause to diagnose issues before proceeding further.

Structure the second month around scaling the pilot to the full team with success criteria around maintaining performance under full volume.

Include kill criteria specifying that if the system can’t handle production loads or quality deteriorates, you won’t expand to additional departments.

Structure the third month around documenting ROI and presenting the Phase 2 proposal, with success criteria demonstrating measurable cost reduction and customer retention improvement.

Include kill criteria making clear that if ROI doesn’t materialize as projected, you’ll stop rather than scale a solution that isn’t performing.

This approach works because board members can evaluate objective progress every 30 days rather than waiting months for vague status updates. The kill criteria demonstrate you’re not asking for a blank check to pursue AI regardless of results.
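The 90-day structure above becomes most credible when each milestone and kill criterion is written down as a checkable rule rather than prose. A minimal sketch, with thresholds mirroring the example and the data structure itself being an assumption of this illustration:

```python
# Minimal sketch of the 90-day plan as checkable milestones, so each monthly
# review compares measured results against pre-agreed kill criteria.

milestones = [
    {"day": 30, "goal": "Pilot: response time under 6 hours",
     "kill_if": lambda m: m["improvement_pct"] < 50},
    {"day": 60, "goal": "Full team: hold performance under production load",
     "kill_if": lambda m: not m["handles_production_load"]},
    {"day": 90, "goal": "Documented ROI and Phase 2 proposal",
     "kill_if": lambda m: not m["roi_materialized"]},
]

def evaluate(milestone, measured):
    """Return 'pause' if the pre-agreed kill criterion fires, else 'proceed'."""
    return "pause" if milestone["kill_if"](measured) else "proceed"

# Example 30-day review: a 55% improvement clears the 50% threshold.
print(evaluate(milestones[0], {"improvement_pct": 55}))   # proceed
print(evaluate(milestones[0], {"improvement_pct": 30}))   # pause
```

The value is not the code but the discipline: the decision rule is agreed before the review, so "on track" has a shared, objective meaning.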

The fourth element is providing honest readiness assessment that shows you understand success requires more than buying software.

Credit: Freepik

Present current data quality status showing what percentage of relevant data is structured versus unstructured, what percentage has complete fields versus missing critical information, and what cleanup work is required before AI implementation.

Acknowledge the gap between current state and what AI requires, then specify the timeline for closing that gap. Present current team capability honestly, noting where staff lack necessary experience with AI tools or relevant technical skills.

Specify the training programs and external support needed to build capability, including realistic timelines. Present current infrastructure limitations around power reliability, internet connectivity, and computing capacity.

This approach works because it demonstrates you’ve thought beyond the technology to organizational readiness. Board members see that you understand AI success depends on foundations being solid before implementation begins.

The fifth element is explicitly discussing what could go wrong along with mitigation plans for each major risk.

Credit: Freepik

Identify adoption resistance as a likely risk where customer service teams might refuse to use the system, work around it, or sabotage through non-compliance.

Present mitigation plans including involving the team in pilot selection, providing comprehensive training, tying performance bonuses to adoption metrics, and building in feedback loops to adapt based on user input.

Identify data quality issues as a risk where AI might perform poorly because underlying data has problems that weren’t fully assessed during planning. Present mitigation including running thorough data quality audits before final commitment, building data cleanup explicitly into project timelines, and accepting longer implementation if data work requires it.

Identify inadequate vendor support as a risk where vendors are responsive during sales but disappear after contracts are signed. Present mitigation including contractual requirements for active support over defined periods, service level agreements with financial penalties for non-performance, and internal team training for basic maintenance.

This approach works because it demonstrates thinking that extends beyond optimism to realistic planning for adversity. Boards respect proposals that anticipate problems rather than presenting everything as certain to go smoothly.

Board Presentations Set Project Trajectory

AI projects that die in boardrooms rarely die from explicit rejection where boards vote down proposals and deny funding. They die from approvals based on unrealistic expectations, hidden risks, and vague timelines that make eventual failure inevitable.

When reality inevitably diverges from what was presented during board approval, board members lose confidence in project leadership.

The gap between promise and performance undermines support regardless of whether the implementation is technically successful.

Projects get questioned, scrutinized, and eventually killed because they’re being measured against impossible standards that were set during presentations designed to secure approval rather than establish realistic expectations.

The presentation that saves AI projects takes the harder path of setting honest expectations from the start.

It doesn’t oversell capabilities or promise transformation beyond what’s achievable. It doesn’t hide risks or present optimistic scenarios as certain outcomes. It doesn’t ask for everything upfront but instead earns trust through phased commitments with clear milestones.

Get the board presentation right, and implementation challenges become manageable because everyone understood they were possible.

Get the presentation wrong, and even successful implementation looks like failure when measured against expectations that were never realistic.

 
