About The Author

Donald Hook is the founder of Full On Consulting, a technology and management consulting firm helping companies successfully leverage technology and deliver their initiatives.
He is a former Chief Technology Officer (CTO) and Partner for a $14B IT services firm with 50,000+ employees globally. He has run SI evaluations from both sides of the table — as the client choosing a vendor and as a senior leader inside a major SI. He knows exactly how these firms position themselves, what the demos conceal, and what questions expose real capability before you sign.
For more information about Donald Hook, please visit his LinkedIn profile. He can be reached at dhook@fullonconsulting.com.
Published: March 2026 | Donald D. Hook
The vendor’s demo was genuinely impressive. Custom-built for the prospect, it showed exactly the workflows the client needed — branded with the client’s logo, populated with the client’s data, flowing through processes that looked indistinguishable from a finished system. The sales team knew the executives by name. They had done their homework. The proposal was polished, the timeline aggressive but credible, the price competitive. The references called back and said all the right things.
The contract was signed.
Two and a half years later, the platform had not been delivered. The vendor had underbid the project by nearly $1 million — and rather than renegotiate transparently, they quietly shifted their A-team to other accounts and left the client’s implementation running on whoever was available. When an independent technical review was finally conducted, the platform had no authorization layer. It was functionally unsecured. Core integrations had been built incorrectly. Platform best practices had been ignored entirely.
We were brought in after the vendor walked away. We replaced the implementation partner, remediated the security vulnerabilities, conducted CEO-level negotiations with the vendor and the hosting provider, and delivered the platform. The client had lost nearly $300,000 and two years of market opportunity. The CEO had been under board pressure the entire time.
This is a story about a failed implementation. But it is equally a story about a failed vendor selection. Because the warning signs were there — before a single line of code was written. The right evaluation process would have surfaced them.
Why the System Integrator Selection Process Is Broken
The way most organizations select system integrators is fundamentally misaligned with what they actually need to know. Evaluations are weighted toward presentation quality, proposal polish, and demo impressiveness — capabilities that large SIs have invested heavily in developing precisely because they have little correlation with delivery capability.
A global 2023 study found that 65% of enterprise software implementations fail to deliver expected business outcomes. The Standish Group Chaos Report consistently shows that only 31% of IT projects complete successfully — on time, on budget, with intended features. The majority of these failures share a common thread: the organization selected a vendor based on sales capability and then discovered, too late, that delivery capability was a different matter entirely.
- 65% of enterprise software implementations fail to deliver expected outcomes
- 31% of IT projects succeed on time, on budget, with intended features (Standish)
- 3–5× average final cost of a failed ERP implementation versus a successful one
- 72% of organizations report that the vendor they selected did not meet their expectations
The vendors know this. Large system integrators invest millions in their sales and pursuits organizations — the teams that win engagements — because winning the deal is the strategic objective. Delivery is a separate department, staffed differently, prioritized differently, and measured differently. The people who sold you the project are typically not the people who will deliver it.
Understanding this dynamic is the first and most important step in any SI selection. You are not evaluating a sales team. You are evaluating a delivery organization. They are not the same thing.
7 Red Flags That Expose the Wrong SI Before You Sign
These warning signs are consistently present in problematic SI relationships. None of them requires a formal evaluation process to surface — they are visible during the sales cycle if you know what to look for.
1. The demo does not include your actual complexity
Pre-built demos are designed to impress, not to validate. If the demo scenario does not include your specific technical environment — your actual ERP version, your integration points, your data volume, your edge cases — it tells you nothing about whether the vendor can actually deliver your project. Ask to see a live demonstration in a sandbox that mirrors your technical landscape. If they cannot or will not do this before signing, treat it as a significant warning.
2. The proposal team and the delivery team are different people
This is the most common and most consequential disconnect in SI engagements. The senior partners who walked you through the proposal will not be on your project. Ask, by name, who will be assigned as the project manager, the solution architect, and the lead developer on your engagement. Request their CVs. If the answer is vague — "we will assign the right resources" — that is a serious warning. A vendor that has already won your project should be able to tell you exactly who will deliver it.
3. The timeline is aggressive relative to the complexity
Vendors under competitive pricing pressure frequently compress timelines to make their proposals look more attractive — and then discover mid-engagement that the timeline was never realistic. Ask the vendor to walk you through the detailed logic behind their timeline. How did they arrive at the number of sprints, testing phases, and cutover activities? What assumptions are built in? What happens to the timeline if those assumptions prove incorrect? A vendor with genuine delivery experience can answer these questions in detail. One that cannot should concern you.
4. References are curated, not representative
Vendor-provided references are always the client relationships that ended well. They tell you what the vendor is capable of under favorable conditions. What you need to know is how the vendor behaves under unfavorable conditions — when the project is behind schedule, when scope has expanded, when resources are stretched. Ask the vendor to connect you with a client whose project experienced significant challenges. If they cannot provide one, ask why. Then go find your own references through your network, LinkedIn, and Gartner Peer Insights.
5. The SOW does not define what "done" means
Vague deliverable definitions are one of the most reliable predictors of a troubled engagement. If the SOW says things like "configure the system," "complete development," or "support go-live" without specific, measurable acceptance criteria, you do not have a contract — you have an invoice schedule. Every deliverable should have a defined definition of done, clear acceptance criteria, and a named owner. If the vendor resists this level of specificity during contracting, they will resist accountability during delivery.
6. The price is significantly lower than other bids
A materially lower price is not a competitive advantage — it is a risk indicator. Either the vendor has misunderstood your requirements (scope problem), underestimated the complexity of delivery (competency problem), or made the calculation that they will recover the margin through change orders once you are locked in (commercial problem). In any of these cases, the low bid will not stay low. Ask the low bidder to walk through the specific line items that explain the cost difference. Their answer will tell you which scenario you are in.
7. They cannot name a client similar to you in size and complexity
Relevant reference-able experience is not optional for a major IT implementation. If your project involves a specific ERP platform, industry vertical, data migration complexity, or integration landscape, the vendor should be able to name two or three completed engagements that are genuinely comparable. "We have done hundreds of implementations" is not an answer. Ask for specifics: same platform version, similar data volume, comparable number of integrations, similar organizational complexity. If the comparable engagements do not exist, the learning curve will happen on your project, on your timeline, at your cost.
How to Build an RFP That Reveals Actual Delivery Capability
Most IT RFPs are built by procurement teams using templates that were designed to evaluate commodity purchases. They ask the wrong questions — or they ask the right questions in a way that allows any vendor to write a convincing answer regardless of actual capability. The result is a process that filters for proposal-writing skill rather than implementation competency.
A well-designed SI evaluation RFP does something different: it creates situations where a vendor that lacks genuine capability cannot easily fake it. Here is how to build one.
| Weak RFP Question | Strong RFP Question |
|---|---|
| "Describe your experience with [platform]" | "Provide three completed engagements on [platform version X] with similar integration complexity to ours. Include the name of the client project manager we may contact, the timeline, and the actual go-live date versus original plan." |
| "Describe your implementation methodology" | "Walk us through the specific activities and sequencing in your testing phase, including how UAT, performance testing, and cutover rehearsal are scheduled and who owns each." |
| "Describe your team structure" | "Name the individuals who will serve as PM, solution architect, and technical lead on our engagement. Provide CVs. Contractually commit to their availability as a condition of award." |
| "How do you handle scope changes?" | "Provide a sample change order from a prior engagement (redacted) that shows how scope was evaluated, priced, and documented. Walk us through your change control governance process." |
| "Describe your quality assurance approach" | "What are your exit criteria for each phase? Who approves phase completion? What has caused you to delay a phase transition on a prior project, and how was it resolved?" |
| "Tell us about a challenging project" | "Describe a project where you were behind schedule at the midpoint. What specific actions did you take? What was the final outcome versus original baseline? We will verify this with the client reference." |
The objective of each strong question is the same: require specificity that cannot be faked. Generic answers to specific questions are themselves a data point. A vendor with real delivery experience in your domain can answer these questions in concrete detail. One that cannot — or that deflects into generalities — is telling you something important about the depth of their actual experience.
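Once the answers come back, comparing them consistently matters as much as asking them. One common approach is a weighted scoring matrix; the sketch below is a minimal, hypothetical illustration in Python — the criteria names, weights, and vendor ratings are invented for the example, not a recommended standard.

```python
# Weighted proposal scoring: each criterion carries a weight reflecting its
# contribution to delivery risk; each vendor is rated 1-5 per criterion.
# Criteria and weights are illustrative placeholders, not a standard.
CRITERIA = {
    "named_delivery_team": 0.25,     # individuals identified, CVs provided
    "comparable_engagements": 0.20,  # verifiable, similar scale and platform
    "timeline_logic": 0.20,          # sprint/testing/cutover rationale
    "sow_specificity": 0.20,         # acceptance criteria, definitions of done
    "change_control": 0.15,          # documented, priced, governed
}

def score_vendor(ratings: dict[str, int]) -> float:
    """Return the weighted score (1.0-5.0) for one vendor's 1-5 ratings."""
    assert set(ratings) == set(CRITERIA), "rate every criterion"
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Hypothetical ratings: Vendor B demos well but cannot name its delivery team.
vendor_a = {"named_delivery_team": 4, "comparable_engagements": 5,
            "timeline_logic": 3, "sow_specificity": 4, "change_control": 4}
vendor_b = {"named_delivery_team": 2, "comparable_engagements": 3,
            "timeline_logic": 5, "sow_specificity": 2, "change_control": 3}

print(f"Vendor A: {score_vendor(vendor_a):.2f}")  # → 4.00
print(f"Vendor B: {score_vendor(vendor_b):.2f}")  # → 2.95
```

The value of the exercise is less the final number than the discipline: agreeing on weights before proposals arrive prevents the most polished presentation from quietly rewriting the evaluation criteria.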
Reference Checks That Actually Produce Honest Answers
The standard vendor reference check is nearly useless. You call a number the vendor gave you. The person on the other end says positive things. You move on. This process is designed to confirm the decision you have already largely made — not to surface genuine risk.
Effective reference checks require a different approach. They should be structured to create conditions where a reference who had a genuinely poor experience can share it — and where a positive reference is forced to be specific enough to be meaningful.
Go beyond vendor-provided references
Search LinkedIn for people who list the vendor as a prior implementation partner. Look at Gartner Peer Insights, G2, or Clutch for reviews. Ask your industry peers whether anyone has used this vendor. The client who had a bad experience will not appear on the vendor's reference list — but they are out there.
Ask to speak with the delivery team, not the account team
The account executive and the delivery project manager have different stories. Ask to speak with the person who actually managed day-to-day delivery on the reference engagement — not the partner who sold it or the executive sponsor who received status reports.
Ask the specific hard questions
"Was the project on time and on budget?" "If not, what happened?" "Were the people who delivered the project the same ones who were on the proposal?" "If you were doing this again, would you hire this vendor?" "What would you tell us that they would not want you to say?"
Verify the timeline claims independently
If a reference engagement is notable enough to appear in the vendor's proposal, you can often verify the timeline from public records — press releases, LinkedIn posts, conference presentations. A project the vendor says they delivered in 12 months that LinkedIn evidence places at 24 months tells you something important about how they describe their own track record.
The Finalist Stage: What to Require Before You Sign
Once you have narrowed to two or three finalists, the standard approach is an oral presentation followed by contract negotiation. This is where most organizations lose significant leverage — and miss their last opportunity to surface critical risks before the contract is signed.
Before you award the contract, require the following from every finalist:
A technical prototype in your actual environment
Not a demo in their sandbox — a working proof-of-concept built in a development environment that mirrors your actual technical landscape. This does not need to be large. It needs to be real. How quickly they can stand it up, and what problems they encounter, will tell you more than any presentation.
A detailed project plan, not a roadmap
Require the finalists to produce a project plan with specific tasks, dependencies, named resource assignments, realistic durations, and a logic-driven timeline. Vague swim-lane roadmaps are marketing documents. A real project plan represents real planning. The quality of the plan reveals the quality of the delivery organization.
Named resources committed contractually
Any key delivery resource named during the evaluation — project manager, solution architect, technical lead — should be contractually committed to your engagement for a defined minimum period. Without this, the people you evaluated will be assigned elsewhere and you will receive whoever is available.
A milestone-based payment structure tied to deliverable acceptance
Payment schedules that front-load fees in favor of the vendor create perverse incentives. A payment structure tied to the delivery and client acceptance of defined milestones aligns incentives and gives you commercial leverage at every phase. Any vendor that resists this structure is telling you how they expect the engagement to go.
Clear definitions of done for every deliverable
Every deliverable in the SOW — every phase, every module, every integration — should have specific, measurable acceptance criteria that both parties sign off on. "System configured" is not a deliverable definition. "CRM configured per the approved business requirements document, with all 14 integrations tested and passing acceptance criteria defined in Appendix B" is a deliverable definition.
An escalation and governance framework
Define, in the contract, what happens when things go wrong: who escalates to whom at what threshold, what triggers a formal project health review, and what happens if the vendor misses a milestone by more than a defined number of days. This is not pessimism — it is the governance framework that protects both parties and gives the relationship a path forward when problems arise.
The One Decision That Substantially Reduces Your Risk: Bring in an Independent Advisor
The most reliable way to substantially reduce the risk of a failed SI selection is to have someone on your side of the table who knows how the process works from the inside — someone who has been in the room as the vendor, knows how proposals are assembled, understands where the numbers come from, and recognizes the warning signs before the contract is signed.
An independent client-side IT advisor does something that an internal team — even a strong one — typically cannot do on their own: they bring experience from dozens of similar evaluations, pattern-match against known failure modes, and ask the questions that a first-time buyer would not know to ask.
This is not the same as an IT negotiation advisory firm that specializes in commercial leverage and contract terms. Commercial expertise is valuable — but it is upstream of execution. Knowing that you negotiated a strong SOW matters much less if the vendor you selected cannot actually deliver. An advisor with genuine delivery experience can evaluate both the commercial structure and the execution credibility of the vendor you are considering.
What an Independent Client-Side IT Advisor Does During an SI Selection
- ✓Builds and manages the RFP process — structuring questions that reveal actual delivery capability rather than sales capability
- ✓Scores proposals against a consistent, objective evaluation framework — removing subjectivity and vendor relationship bias from the decision
- ✓Leads technical deep-dives and architect-to-architect conversations that surface capability gaps vendor sales teams will not voluntarily disclose
- ✓Conducts independent reference checks — including outreach to clients not on the vendor's approved list
- ✓Reviews and strengthens the SOW before it is signed — ensuring deliverable definitions, payment terms, acceptance criteria, and governance provisions protect the client
- ✓Validates the project plan during finalist evaluation — assessing whether the timeline and resource plan are credible before contract execution
- ✓Provides a recommendation that is aligned entirely with the client's interests — with no commercial relationship with any vendor in the evaluation
For organizations undertaking major ERP implementations, digital transformation programs, or any engagement with a system integrator where the contract value exceeds $500K, the cost of an independent evaluation is typically recovered many times over in avoided risk, negotiated savings, and better vendor selection outcomes. For a $2M implementation, a 10% improvement in execution quality or a 15% reduction in unnecessary scope represents a return that substantially exceeds the advisory fee.
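The back-of-envelope math in that example can be made concrete. A quick sketch of the arithmetic, using the article's $2M figure — note that the advisory fee below is a hypothetical placeholder for illustration, not a quoted rate:

```python
# Illustrative ROI arithmetic for independent advisory on a $2M implementation.
contract_value = 2_000_000
advisory_fee = 100_000  # hypothetical placeholder: 5% of contract value

execution_improvement = 0.10 * contract_value  # 10% better execution quality
scope_reduction = 0.15 * contract_value        # 15% less unnecessary scope
savings = execution_improvement + scope_reduction

print(f"advisory fee:     ${advisory_fee:,.0f}")
print(f"combined benefit: ${savings:,.0f}")
print(f"return multiple:  {savings / advisory_fee:.1f}x")
```

Under these assumptions, either benefit alone covers the fee several times over; the point is not the exact multiple but that the fee is small relative to the value at risk.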
For organizations that have already signed a contract and are beginning to see warning signs, the advisor capability that would have prevented the problem is the same capability that can intervene before it becomes catastrophic. An independent project health check is often the fastest way to understand exactly where you stand and what you need to do about it.
How Full On Consulting Reduces Your SI Selection Risk
Full On Consulting’s senior consultants have been on both sides of this table. Our founder was a Partner and CTO at a $14B IT services firm with 50,000+ employees globally — one of the organizations that builds the polished demos, assembles the pursuit teams, and trains the proposal writers. He knows precisely how the sales process is designed to work, where the gaps between proposal and delivery exist, and which questions force vendors to show you what is actually behind the curtain.
That experience is what we bring to every SI selection engagement. We are not a commercial advisory firm optimizing contract terms. We are a delivery-side advisor helping you choose the right vendor and structure the right engagement — because we have delivered these programs ourselves, recovered them when they went wrong, and built the governance frameworks that prevent failure in the first place.
Our SI selection advisory work includes:
RFP Design & Management
We build the RFP from scratch — structured to surface real delivery capability, not proposal polish. We manage the evaluation process end-to-end and protect your team's bandwidth.
Proposal Scoring & Analysis
We evaluate proposals against an objective scoring framework and translate what vendors are actually telling you — including what they are deliberately not saying.
Technical Deep-Dives
We conduct architecture reviews and technical interviews with proposed delivery resources — conversations that go well beyond what a business stakeholder can effectively evaluate.
Independent Reference Checks
We go beyond vendor-provided references to surface honest assessments from clients who experienced the full arc of a delivery engagement — not just the successful ones.
SOW Review & Strengthening
We review the statement of work before it is signed and identify the vague language, missing acceptance criteria, and governance gaps that create risk once the engagement begins.
Ongoing Vendor Accountability
Our PMaaS and IT Project Management services keep a senior Full On consultant engaged as your client-side PM throughout delivery — so accountability doesn't end at contract signature.
The eCommerce engagement described at the opening of this article would have ended differently with independent advisor involvement. The vendor’s underbid — and what it implied about how they planned to manage the engagement — would have been identifiable before contract signature. The absence of authorization-layer standards in their technical approach would have surfaced in a pre-signature architecture review. The reference check process would have found clients with similar stories.
That is not hindsight. That is the evaluation process working as it should.
Get Independent SI Advisory
Before you sign with an implementation partner, talk to an advisor who has been on both sides of the table.
Schedule a Free Consultation
SI Selection Checklist
- ☐Named delivery resources committed contractually
- ☐Technical prototype in your actual environment
- ☐Detailed project plan with dependencies
- ☐Milestone-based payment tied to acceptance
- ☐Defined acceptance criteria for every deliverable
- ☐Independent reference checks completed
- ☐SOW reviewed by an independent advisor
- ☐Escalation and governance framework defined
- ☐Change order process documented and agreed
- ☐Exit provisions defined in the contract

