
Sample Deliverables Timeline for a 12-Week MVP: 2026 Guide
A sample deliverables timeline for a 12-week MVP: a week-by-week checklist covering discovery, design, development, QA, and launch. Use it to plan your build and vet agency proposals.
TL;DR
A 12-week MVP timeline typically breaks into six phases: discovery and scoping (weeks 1-2), design and prototyping (weeks 3-4), core development (weeks 5-8), feature completion and integrations (weeks 9-10), testing and QA (week 11), and launch with handoff (week 12). The key difference between a useful timeline and a vague one is that each phase maps to concrete deliverables you can review, approve, and own. This article provides the actual week-by-week deliverables table that most guides skip, plus the context you need to evaluate whether a proposed timeline is realistic.
What Is an MVP Deliverables Timeline?
An MVP deliverables timeline is a structured schedule that maps specific, named outputs to each week or phase of a minimum viable product build. It answers the question every founder eventually asks: “What exactly am I getting, and when?”
This is different from a project plan, which tracks tasks and team assignments. It’s also different from a product roadmap, which shows strategic direction over months or years. A deliverables timeline is focused entirely on what you, as the person paying for the build, physically receive at each milestone: a signed scope document, a clickable prototype, a staging environment, a production deployment.
The distinction matters because 37% of projects fail when stakeholders never agree on what success looks like. A deliverables timeline forces that agreement upfront. It gives both sides, founder and development team, a shared definition of “done” at every checkpoint.
Most founders searching for a sample deliverables timeline for a 12-week MVP have already been quoted a 12-week window by an agency or freelancer, or they’re building an internal plan and want to know if their assumptions are reasonable. Either way, the goal is the same: a concrete reference you can hold people accountable to.
The 12-Week MVP Deliverables Timeline
This is the core artifact. Below is a phase-by-phase breakdown with named deliverables for each stage of a 12-week MVP build. This sample assumes a medium-complexity product (think SaaS app, two-sided marketplace, or mobile-first platform with third-party integrations), which multiple sources place in the 8-to-12-week range for experienced teams.
A quick note: this is a reference template. Your actual timeline will shift based on scope, team size, and technical complexity. But this gives you the structure to compare against.
Phase 1: Discovery and Scoping (Weeks 1-2)
This phase defines the problem, the user, and the boundaries of the build. Skipping it is one of the most reliable ways to blow a timeline. One agency that has shipped over 40 MVPs put it bluntly: “Building a full product in stealth for 12 to 18 months is the most expensive way to find out you are wrong.” Discovery prevents that.
Deliverables you should receive:
| Deliverable | Description | Who Reviews |
|---|---|---|
| Problem statement | One-page document defining the core problem and target user | Founder + product lead |
| User personas (2-3) | Profiles of primary user types with goals, pain points, and behaviors | Founder |
| Prioritized feature backlog | Features ranked using MoSCoW (Must-Have, Should-Have, Could-Have, Won’t-Have) | Founder + dev lead |
| Technical architecture sketch | High-level diagram showing frontend, backend, database, and third-party services | CTO or technical advisor |
| Success metrics | 3-5 measurable criteria that define MVP success (e.g., 100 signups, <2s load time) | Founder |
| Signed scope document | Formal agreement on what’s in and out of scope, with acceptance criteria | Both parties |
MoSCoW prioritization is worth calling out specifically. It prevents the most common timeline killer: scope creep. Research shows that 62% of projects experience budget overruns from uncontrolled scope expansion, and scope creep can cost up to four times the initially expected development cost. Drawing a hard line between “must have” and “nice to have” in week one is cheap insurance.
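To make the ranking concrete, a MoSCoW-prioritized backlog can be represented as plain data and sorted so must-haves surface first. This is an illustrative sketch; the feature names and priorities below are hypothetical, not taken from any real project.

```python
# Minimal sketch of a MoSCoW-prioritized feature backlog.
# Feature names and priorities are illustrative examples only.
MOSCOW_ORDER = {"must": 0, "should": 1, "could": 2, "wont": 3}

backlog = [
    ("Email notifications", "could"),
    ("User signup and login", "must"),
    ("Admin dashboard", "wont"),
    ("Stripe checkout", "must"),
    ("CSV export", "should"),
]

# Sort so must-haves surface first; sorted() is stable, so ties keep
# their original order. Anything ranked "wont" is out of MVP scope.
ranked = sorted(backlog, key=lambda f: MOSCOW_ORDER[f[1]])
in_scope = [name for name, prio in ranked if prio != "wont"]
print(in_scope)
```

The point of encoding it this way is that the "won't have" line is explicit and machine-checkable: a feature either appears in `in_scope` or it doesn't, which is exactly the hard line that prevents scope creep.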
Phase 2: Design and Prototyping (Weeks 3-4)
Design is where abstract requirements become something you can see and click. The goal is a prototype realistic enough to test with users and align the development team, but not so polished that it eats into build time.
Deliverables you should receive:
| Deliverable | Description | Who Reviews |
|---|---|---|
| Wireframes | Low-fidelity layouts for all core screens | Founder + designer |
| Clickable Figma prototype | Interactive prototype simulating the main user flows | Founder + test users |
| User flow diagrams | Visual maps of how users move through signup, core action, and checkout/output | Product lead |
| UI style guide / design tokens | Colors, typography, spacing, component library foundations | Dev team + designer |
The clickable prototype is arguably the most valuable deliverable in the entire 12-week MVP timeline. It’s the cheapest way to validate whether your user experience makes sense before you’ve written a single line of production code. If your agency or dev team skips this step and jumps straight to code, that’s a red flag.
If your MVP is a marketplace, platforms like Sharetribe can compress both the design and development phases significantly, since the core transaction flows already exist.
Phase 3: Core Development, Sprints 1-2 (Weeks 5-8)
This is where the product takes shape. Using agile methodology, the team builds in two-week sprints, each ending with a working demo you can review. The first sprint focuses on infrastructure and the primary user flow. The second sprint builds out secondary features and the admin interface.
Deliverables you should receive:
| Deliverable | Description | Who Reviews |
|---|---|---|
| Staging environment access | A live URL where you can test the product as it’s built | Founder + QA |
| Authentication system | User registration, login, password reset, session management | Founder |
| Database schema | Documented data structure (ERD or equivalent) | Technical advisor |
| Core feature module(s) | The 1-2 features that define your MVP’s primary value | Founder + test users |
| Sprint 1 demo (end of Week 6) | Live walkthrough of working features with Q&A | Full team |
| Sprint 2 demo (end of Week 8) | Updated demo showing additional features and refinements | Full team |
Sprint demos matter more than most founders realize. They’re your checkpoint for catching misalignment early. If you’re not seeing a working demo every two weeks, you’re operating blind.
For a real-world example of milestone-based delivery, Bloom (YC W21) had its paper-trading feature delivered in roughly two months using this kind of phased approach.
Phase 4: Feature Completion and Integrations, Sprint 3 (Weeks 9-10)
By this point, the core product works. This phase adds the integrations and secondary features that make it complete enough to launch: payment processing, email notifications, third-party APIs, and edge case handling.
Deliverables you should receive:
| Deliverable | Description | Who Reviews |
|---|---|---|
| Payment integration | Stripe, PayPal, or relevant processor connected and tested | Founder + QA |
| Third-party API integrations | Email (SendGrid/Postmark), analytics (Mixpanel/Amplitude), etc. | Dev lead |
| Edge case handling | Error states, empty states, input validation, rate limiting | QA |
| Sprint 3 demo (end of Week 10) | Feature-complete walkthrough | Full team |
| Updated feature backlog | Remaining items moved to post-launch roadmap with priority | Founder + product lead |
This is where scope discipline gets tested hardest. Every founder sees the near-finished product and thinks “just one more feature.” Resist it. The Standish Group found that 45% of features in typical software are never used, and another 19% are rarely used, meaning 64% of what gets built is waste. Your MVP should include only what’s needed to test your core hypothesis.
Phase 5: Testing and QA (Week 11)
A dedicated testing phase is not optional. It’s where you find the bugs that would embarrass you on launch day.
Deliverables you should receive:
| Deliverable | Description | Who Reviews |
|---|---|---|
| QA test report | Documented test cases with pass/fail status | Dev lead + founder |
| Bug fix log | List of identified bugs, severity, and resolution status | Dev lead |
| Performance benchmarks | Load time, API response times, database query performance | Technical advisor |
| Cross-device/browser verification | Testing across Chrome, Safari, Firefox, iOS, Android | QA |
| Security review | Authentication flow audit, input sanitization, HTTPS verification | Dev lead |
Functional testing checks if features work. Usability testing checks if they’re easy to use. Performance testing checks if things hold up under load. Security testing looks for vulnerabilities. All four should appear in the QA report.
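The first two categories can be sketched in a few lines. The `search` function and the 2-second latency budget below are stand-ins for illustration, not taken from any real QA report.

```python
import time

def search(items, query):
    """Stand-in for the feature under test; illustrative only."""
    return [i for i in items if query.lower() in i.lower()]

# Functional test: does the feature do what the user story says?
assert search(["Alpha", "beta", "Gamma"], "alph") == ["Alpha"]

# Performance test: does it stay inside the stated latency budget
# (e.g. the "<2s" success metric from the discovery phase)?
items = [f"record-{n}" for n in range(50_000)]
start = time.perf_counter()
search(items, "record-49999")
elapsed = time.perf_counter() - start
assert elapsed < 2.0, f"search took {elapsed:.3f}s, budget is 2s"

print("functional and performance checks passed")
```

A QA report is just a structured record of checks like these, with pass/fail status per case; usability and security testing require human review and tooling beyond simple assertions.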
Phase 6: Launch and Handoff (Week 12)
The final week covers production deployment and the transfer of everything you need to own and operate your product independently. This is where many agencies fall short, and where you should be most demanding.
Deliverables you should receive:
| Deliverable | Description | Who Reviews |
|---|---|---|
| Production deployment | Live application on production infrastructure | Full team |
| Source code transfer | Full repository access (GitHub/GitLab) with commit history | Founder + CTO |
| Infrastructure credentials | AWS/Vercel/GCP admin access, database credentials | Founder + technical advisor |
| Third-party account access | Stripe, email service, analytics, all API keys with descriptions | Founder |
| Documentation package | README, setup instructions, environment variables, deployment guide | Dev team |
| Database schema / ERD | Updated data model documentation | Technical advisor |
| API documentation | Endpoint reference with request/response examples | Dev team |
| Monitoring and alerting setup | Uptime checks, error tracking (Sentry or equivalent), log aggregation | Dev lead |
Practitioner CTO Vadim Kravcenko offers pointed advice on handoffs: “Plan for at least a month of overlap; three is safer. We once tried to compress it into two weeks and paid with a full day of downtime, database migrations mis-sequenced, monitoring alarms screaming.” His recommendation is to expect “real, living docs,” not afterthought documentation.
This handoff checklist draws from Webscension’s MVP handoff guide, which is one of the better practitioner resources available on the topic.
What Makes a 12-Week Timeline Realistic (vs. Fantasy)
Not every 12-week MVP deliverables timeline is achievable. Here’s how to tell if yours is grounded in reality.
Complexity Determines Everything
The 12-week window fits medium-complexity MVPs: SaaS products with user accounts, a core workflow, payment processing, and a few integrations. Simple landing-page-plus-waitlist products can ship in 4-6 weeks. Complex products involving AI, regulatory compliance, or native mobile apps on multiple platforms often need 16-24+ weeks.
Your tech stack choice directly affects how fast you can ship. Choosing proven frameworks (React, Node, Python, PostgreSQL) over bleeding-edge tools reduces debugging time and makes hiring easier if you need to scale the team later.
The Numbers That Should Worry You
Several statistics frame why timeline discipline matters:
- 42% of startups fail because they build something nobody wants. Discovery in weeks 1-2 exists to prevent this.
- 70% of startups scale prematurely, building features and hiring people before validating demand.
- 62% of projects experience budget overruns from scope creep.
- 64% of software features are rarely or never used.
These aren’t abstract warnings. They directly argue for a disciplined, phased timeline over an open-ended build.
Red Flags in a Proposed Timeline
Watch for these warning signs:
- No discovery phase. Jumping straight to design or code means nobody agreed on what “done” looks like.
- No sprint demos. If you won’t see working software until week 10, you’re taking a massive bet.
- No acceptance criteria. Every user story should define what “complete” means before development starts.
- Vague phase ranges. A proposal that says “development: weeks 4-12” without sprint milestones is a plan to miss deadlines.
Two Models for a 12-Week MVP: Lean Startup vs. Agency Dev
There’s an important distinction that most sample deliverables timelines for a 12-week MVP gloss over. Two valid approaches exist, and they allocate time very differently.
The Lean Startup Model
The 12-week framework from Neo Innovation (now Pivotal Labs) spends weeks 1-4 on customer interviews and problem validation before writing any code. Teams interview roughly 20 prospective customers, run concierge experiments, and validate demand before touching a keyboard. Development happens in weeks 5-12.
This model makes sense when the business idea itself is unproven. If you’re not sure people will pay for what you’re building, spending four weeks talking to potential customers is the highest-ROI use of your timeline.
The Agency Development Model
The timeline in this article follows the agency model: two weeks of discovery, then ten weeks of design, development, testing, and launch. This model assumes you’ve already validated the core idea (through customer conversations, a waitlist, a manual version of the service) and need to build the product.
Y Combinator’s David Mack captured the philosophy behind fast shipping: “Our MVP gestation time kept shrinking, from three months, to one week, to one day, down to a few hours.” The point isn’t that every MVP should ship in hours. It’s that the fastest path to learning is getting something in front of users.
Choose the lean model if you’re testing a hypothesis. Choose the agency model if you’re building a product you already know people want.
Deliverables Founders Often Miss
Beyond the obvious outputs, several deliverables tend to fall through the cracks. These are worth adding to your checklist:
Architecture Decision Records (ADRs). Short documents explaining why the team chose specific technologies or approaches. Without these, a future developer inherits code with zero context about the tradeoffs that shaped it. Teams with strong process discipline include ADRs as standard practice, along with weekly demos and clean handoffs.
Acceptance criteria per user story. Every feature should have a written definition of “done” before development starts. Without it, “the search feature works” could mean anything.
Environment variable documentation. A list of every configuration variable, what it does, and where it’s used. Without this, deploying to a new environment becomes guesswork.
Deployment runbook. Step-by-step instructions for deploying the application, rolling back a bad deploy, and restarting services. This is different from the README.
Post-launch monitoring and alerting. Uptime monitoring, error tracking, and performance alerts should be configured before launch, not after your first outage.
Warranty terms and bug-fix SLA. What happens when something breaks after handoff? The contract should specify a warranty period and response time for defects within the original scope.
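Environment-variable documentation in particular lends itself to a small startup-time check, so a misconfigured deploy fails loudly instead of mysteriously. This is an illustrative Python sketch; the variable names and descriptions are hypothetical examples, not a required list.

```python
import os
import sys

# Hypothetical required variables for illustration; your app's
# actual list (and descriptions) will differ.
REQUIRED_VARS = {
    "DATABASE_URL": "PostgreSQL connection string used by the app",
    "STRIPE_SECRET_KEY": "Server-side Stripe API key for checkout",
    "SENDGRID_API_KEY": "Transactional email sending",
}

def check_env(env=os.environ):
    """Return the names of required variables missing from the environment."""
    return [name for name in REQUIRED_VARS if name not in env]

missing = check_env()
for name in missing:
    # Print the documented purpose alongside the missing name,
    # so whoever is deploying knows what to supply.
    print(f"missing {name}: {REQUIRED_VARS[name]}", file=sys.stderr)
```

Keeping the descriptions in code alongside the names means the documentation can't silently drift out of date the way a separate wiki page can.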
How to Use This Timeline to Evaluate Agency Proposals
If you’re comparing proposals from agencies or freelancers, this sample deliverables timeline for a 12-week MVP gives you a baseline to evaluate against. Here’s how:
Match proposed milestones against this reference. If an agency’s proposal doesn’t mention sprint demos, a testing phase, or a handoff checklist, ask why. These aren’t extras. They’re standard for a well-run build.
Ask what’s included at each milestone before signing. “Week 6: sprint demo” is better than “Weeks 4-10: development.” Named deliverables with review points protect both sides.
Look for milestone-based invoicing. Payment should be tied to deliverables, not calendar dates. You pay when the staging environment is live and demoed, not simply because it’s week 6.
Confirm code ownership and handoff process upfront. You should own 100% of the intellectual property. Repository access, infrastructure credentials, and third-party accounts should all transfer to you at project end. If a proposal is vague about this, push for specifics before signing.
Check for a warranty period. Bugs happen. A warranty (six months is a strong benchmark) that covers defects within the original scope gives you a safety net after launch.
If you want to see what milestone-based delivery looks like across 60+ shipped products, comparing real case studies against this timeline can calibrate your expectations.
Mapping Deliverables to Payment Milestones
Here’s a sample payment structure tied to the deliverables timeline:
| Milestone | Trigger | Typical % |
|---|---|---|
| Project kickoff | Signed scope document | 15-20% |
| Design complete | Approved Figma prototype | 15-20% |
| Sprint 1 demo | Working staging environment with core feature | 20% |
| Sprint 3 demo | Feature-complete product | 20% |
| Launch and handoff | Production deployment + documentation | 20-25% |
This structure aligns incentives. Nobody gets paid for time, only for delivered, reviewed outputs.
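The arithmetic behind the table is simple enough to sanity-check in a few lines. The percentages below pick one point from each range in the sample table; the $50K total budget is purely illustrative.

```python
# Sketch of milestone-based invoicing tied to deliverables, not dates.
# Percentages pick one value from each range in the sample table;
# the total budget is a hypothetical figure for illustration.
TOTAL_BUDGET = 50_000  # USD

milestones = [
    ("Project kickoff",    "Signed scope document",              0.20),
    ("Design complete",    "Approved Figma prototype",           0.15),
    ("Sprint 1 demo",      "Working staging env + core feature", 0.20),
    ("Sprint 3 demo",      "Feature-complete product",           0.20),
    ("Launch and handoff", "Production deploy + documentation",  0.25),
]

# The percentages must cover the full budget, no more, no less.
assert abs(sum(pct for *_, pct in milestones) - 1.0) < 1e-9

for name, trigger, pct in milestones:
    print(f"{name}: ${TOTAL_BUDGET * pct:,.0f} on delivery of '{trigger}'")
```

Whatever split you negotiate, the invariant is the same: each payment has a named deliverable as its trigger, and the percentages sum to 100%.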
Common Pitfalls That Blow a 12-Week Timeline
Having seen what a healthy 12-week MVP deliverables timeline looks like, here are the most common ways it goes wrong.
Scope creep mid-sprint. Adding features after a sprint has started forces context-switching and breaks estimates. Every addition should go through a formal change request process, with the timeline and cost impact stated clearly before approval.
Skipping discovery. Jumping straight to code feels faster. It isn’t. An engineer on Hacker News shared a common experience: “I had a few experiences building MVPs but it always took between 45 and 90 days to build it.” That 45-to-90-day range assumes a solid discovery phase. Without one, timelines stretch unpredictably.
No single decision-maker. If every design decision requires a committee vote, sprints stall. Appoint one person (usually the founder) as the final decision-maker for product questions.
Treating the MVP as a full product. The purpose of an MVP is to learn, not to launch a polished platform. If you’re adding admin dashboards, analytics integrations, and email marketing automation in the first build, you’re past MVP territory.
No testing buffer. Some agencies compress testing into the final two days. That’s not enough. A full week of dedicated QA (week 11 in this timeline) is the minimum for catching critical bugs before real users hit your product.
Slow feedback loops. If it takes you three days to review a sprint demo and provide feedback, that delay ripples through every subsequent sprint. Block time on your calendar. Treat review sessions as non-negotiable.
What Happens After Week 12
Launching the MVP is not the finish line. It’s the starting line. Once you’re live, the work shifts to gathering user feedback, measuring against your success metrics, and deciding what to build (or cut) next.
For a practical guide on the post-launch phase, this step-by-step plan for going from MVP to market covers the transition from first users to product-market fit.
Planning a growth retainer with your development team (even at reduced hours) gives you coverage for bug fixes, urgent feature requests, and the monitoring that keeps your product running smoothly while you focus on customers.
Get a Custom Deliverables Timeline for Your MVP
Every product is different. The sample deliverables timeline above gives you the framework, but the specifics depend on your scope, your market, and your constraints. If you’re planning a 12-week build and want a timeline tailored to your product, get a free estimate and 30-minute consultation to see what a realistic plan looks like for your specific situation.
FAQ
How long does it really take to build an MVP?
For custom-coded products of medium complexity (SaaS, marketplaces, mobile apps with integrations), 8-12 weeks is the realistic range with an experienced team. Simple products with minimal features can ship in 4-6 weeks. Complex products involving AI, native mobile apps, or regulatory requirements often take 16-24+ weeks. Practitioners on Hacker News consistently report 45-90 days as the norm for custom-code MVPs.
What should a 12-week MVP cost?
Costs vary widely based on complexity and team location. Simple no-code MVPs run $5K-$15K. Custom-coded MVPs with integrations typically fall between $30K-$60K. Complex builds (AI, fintech, healthcare) can reach $60K-$150K+. The key is tying payments to deliverables, not hours or calendar dates.
What’s the difference between a deliverables timeline and a project plan?
A project plan tracks tasks, assignments, and dependencies for the development team. A deliverables timeline tracks what the client receives and when. The deliverables timeline is the founder-facing artifact. You shouldn’t need to read Jira tickets to know if your project is on track.
Can I build an MVP in less than 12 weeks?
Yes, depending on scope. Y Combinator encourages founders to ship as fast as possible, sometimes in days. If your MVP is a landing page with a waitlist, a concierge service, or a no-code prototype, you don’t need 12 weeks. The 12-week sample deliverables timeline applies to custom-built software products that require design, development, and testing.
What happens if the scope changes during the build?
Scope changes should go through a formal change request process. The development team documents the requested change, estimates the impact on timeline and cost, and you approve or decline before work begins. Without this process, scope creep becomes inevitable, and research suggests it can cost up to four times the original estimate.
Should I expect source code ownership after the MVP is delivered?
Absolutely. You should own 100% of the intellectual property, including the full source code repository, all infrastructure credentials, third-party account access, and documentation. Any agency that retains ownership of your code is a dealbreaker. Confirm this in writing before the project starts.
How do I know if my agency’s proposed timeline is realistic?
Compare it against the sample deliverables timeline for a 12-week MVP in this article. Look for named deliverables at each phase, sprint demos every two weeks, a dedicated testing phase, and a detailed handoff checklist. If the proposal is vague about what you’ll receive and when, ask for specifics. Vagueness at the proposal stage usually means surprises during the build.
What’s the biggest risk to a 12-week MVP timeline?
Scope creep, followed closely by slow decision-making. Both are founder-side risks, not just agency-side. Having a single decision-maker, a frozen feature set after discovery, and a commitment to fast feedback loops does more for timeline discipline than any project management tool.
Need Developers?
Whether you're validating an idea, scaling an existing product, or need senior engineering support, we help companies build ideas into apps their customers will love (without the engineering headaches). US leadership with American and Turkish delivery teams you can trust.















