The rise of type-safe TypeScript applications
TypeScript was once the thing you'd add later. Today, it's where frameworks compete. How did we get here, and what does first-class type safety actually look like in practice?
A few years ago, the conversation around JavaScript frameworks was almost entirely about features. Does it have a good router? What about server-side rendering? How does it handle state? TypeScript support existed somewhere on the checklist, but it was rarely a dealbreaker. If a library shipped @types/* definitions (even community-maintained ones), that was usually good enough.
Things have changed. The features war is largely settled, type safety has moved from optional extra to core design principle, and the tools leading the way deserve a closer look.
When TypeScript was a checkbox

Think back to the mid-2010s. TypeScript was gaining traction, but the ecosystem was always playing catch-up: type definitions lived in DefinitelyTyped, maintained by the community rather than by library authors, and they were often incomplete or a release behind. Using TypeScript with most popular libraries meant accepting that the type layer was never quite accurate.
The framework conversation was elsewhere entirely. Teams picked React for its component model, Angular for its opinionated structure, Vue for its gentler learning curve. These were meaningful differences. Choosing the wrong one had real consequences for productivity and hiring. TypeScript was something you should add, but the framework’s feature set was the real decision driver.
In that context, it was also harder to explain what TypeScript was actually for. “Catch errors at compile time” sounds great in theory, but when half your dependencies lack accurate types, you’re still writing as any in enough places that the safety net has visible holes.
The great convergence
Something has shifted over the past few years. Look at the modern JavaScript landscape and you’ll see that the frameworks have largely caught up with each other. Server-side rendering, static generation, file-based routing, incremental static regeneration, edge deployments: the feature gap between React, Vue, Svelte, Solid, and Astro has narrowed dramatically. Performance differences, once a genuine differentiator, are now marginal for the vast majority of applications (unless a RAM shortage changes things in the near future 😄).
This isn’t a bad thing. It’s a sign of a maturing ecosystem. The problems the first generation of frameworks set out to solve have, largely, been solved. But it does raise a new question: if the features are essentially equivalent, how do we choose, and why?
The answer is increasingly developer experience. And in the JavaScript world, DX is inseparable from TypeScript.
Developer experience is the next battleground

“Developer experience” is a term that can mean everything and nothing at the same time. So let’s be specific about what it means here: autocomplete that reflects your actual data shapes, compile-time errors that catch bugs before they reach production, the ability to refactor with confidence, and APIs that are self-documenting because the types are the documentation.
This matters more than it ever has, and the reason is worth examining. The rise of AI coding assistants has fundamentally changed the feedback loop of writing software. These tools are really useful, but they are significantly more useful when the codebase has strong types. Types act as boundaries: they tell the model what’s possible and what isn’t, they make the output easy to check, and they catch the small mistakes that LLMs regularly make when guessing the structure from context. In a typed codebase, a wrong suggestion from an AI assistant fails loudly at compile time. In an untyped one, it fails silently at runtime.
Good TypeScript isn’t just a quality-of-life improvement anymore. It’s becoming a productivity multiplier, especially for teams incorporating AI into their workflow.
The tools that got it right
A new generation of libraries has made type safety their core design principle, not something layered on top after the fact. Four of them illustrate the shift particularly well.
Zod is a TypeScript-first schema validation library. What makes it interesting isn’t just that it validates data at runtime. class-validator and joi have done that for years. It’s that a Zod schema is also a type definition. You define the shape once; Zod infers the TypeScript type automatically.
import { z } from "zod"

const UserSchema = z.object({
  id: z.number(),
  email: z.string().email(),
  role: z.enum(["admin", "user"]),
})

type User = z.infer<typeof UserSchema>
// { id: number; email: string; role: "admin" | "user" }

const parsed = UserSchema.parse(req.body)
// validated at runtime, fully typed at compile time. No duplication.
The z.infer<typeof UserSchema> pattern gets rid of a common problem: no more keeping a runtime validator and a TypeScript interface in sync. The schema is the type. This sounds like a small thing until you’ve dealt with a validator and an interface that have slowly drifted apart over six months of feature work.
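To make the mechanism concrete, here is a minimal hand-rolled sketch of the same idea in plain TypeScript, with no Zod dependency. The names (`Guard`, `Infer`, `parse`, `userSchema`) are invented for illustration; this is the general technique, not Zod's actual implementation:

```typescript
// A runtime type-guard doubles as a carrier of its static type.
type Guard<T> = (value: unknown) => value is T;

const isNumber: Guard<number> = (v): v is number => typeof v === "number";
const isString: Guard<string> = (v): v is string => typeof v === "string";

type Schema = Record<string, Guard<unknown>>;

// Map each guard back to the type it checks for: the schema *is* the type.
type Infer<S extends Schema> = {
  [K in keyof S]: S[K] extends Guard<infer T> ? T : never;
};

const userSchema = {
  id: isNumber,
  email: isString,
};

type User = Infer<typeof userSchema>; // { id: number; email: string }

// Runtime validation and the compile-time type come from one definition.
function parse<S extends Schema>(schema: S, input: unknown): Infer<S> {
  if (typeof input !== "object" || input === null) {
    throw new Error("expected an object");
  }
  const obj = input as Record<string, unknown>;
  for (const key in schema) {
    if (!schema[key](obj[key])) throw new Error(`invalid field: ${key}`);
  }
  return obj as Infer<S>;
}

const user = parse(userSchema, { id: 1, email: "[email protected]" });
console.log(user.id, user.email);
```

The load-bearing piece is the `infer T` in `Infer`: because each validator function carries the type it narrows to, the object of validators determines the object type, so the two can never drift apart.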
Prisma takes the same idea and applies it to database access. The Prisma schema defines your data model; from it, Prisma generates a fully typed client. Every query is checked at compile time against the actual shape of your database.
const user = await prisma.user.findFirst({
  where: { email: "[email protected]" },
  select: { id: true, name: true, posts: { select: { title: true } } },
})
// user: { id: number; name: string; posts: { title: string }[] } | null
Try selecting a field that doesn’t exist in the schema and you get a compile error, not a runtime exception. Try passing an invalid filter type and TypeScript stops you before the query ever runs. The database schema becomes a contract that flows through your entire application.
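The typing trick behind a `select` like this can be sketched in a few lines of plain TypeScript. The following is an illustrative toy (the `Select`, `Selected`, and `selectFields` names are invented here), not Prisma's actual generated client:

```typescript
type User = { id: number; name: string; email: string };

// The selection object may only mention fields that really exist on T ...
type Select<T> = Partial<Record<keyof T, true>>;

// ... and the result type contains exactly the fields that were selected.
type Selected<T, S extends Select<T>> = {
  [K in keyof S & keyof T]: T[K];
};

function selectFields<T extends object, const S extends Select<T>>(
  row: T,
  select: S,
): Selected<T, S> {
  const out = {} as Selected<T, S>;
  for (const key in select) {
    if (select[key]) (out as any)[key] = (row as any)[key];
  }
  return out;
}

const row: User = { id: 1, name: "Ada", email: "[email protected]" };
const picked = selectFields(row, { id: true, name: true });
// picked: { id: number; name: string }

// selectFields(row, { nickname: true })  // compile error: unknown field
console.log(picked);
```

The constraint `S extends Select<T>` rejects unknown field names at compile time, and the mapped type `Selected<T, S>` narrows the result so that unselected fields are not even visible on the returned object's type.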
tRPC addresses the boundary that has traditionally been the weakest point in full-stack TypeScript applications: the API layer. REST APIs require either maintaining an OpenAPI spec and running codegen, or accepting that the client and server types are never actually verified against each other. tRPC removes that friction entirely.
// server
const appRouter = router({
  user: procedure
    .input(z.object({ id: z.number() }))
    .query(({ input }) => prisma.user.findUnique({ where: { id: input.id } })),
})

// client: no codegen, no OpenAPI spec, no manual type imports
const user = await trpc.user.query({ id: 1 })
// typed as the return value of the query resolver
The types flow from the server router definition to the client automatically. Rename a field on the server and the client breaks at compile time. Change the input schema and any caller passing the wrong shape fails before the code runs. The API contract is enforced by TypeScript itself.
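The core trick can be sketched without the library: the client derives its shape from the *type* of the server router, so only a type crosses the boundary, never runtime code. This toy version calls the router in-process and synchronously (real tRPC issues async HTTP requests); `Client` and `createClient` are names invented for the sketch:

```typescript
// Server side: plain functions keyed by procedure name.
const appRouter = {
  user: {
    byId: (input: { id: number }) => ({ id: input.id, name: "Ada" }),
  },
};
type AppRouter = typeof appRouter;

// Client side: a caller whose type mirrors the router, procedure by procedure.
type Client<R> = {
  [K in keyof R]: R[K] extends (input: infer I) => infer O
    ? (input: I) => O
    : Client<R[K]>;
};

function createClient<R extends object>(router: R): Client<R> {
  // Real tRPC builds a proxy that sends HTTP requests; here we call directly.
  return router as unknown as Client<R>;
}

const client = createClient(appRouter);
const user = client.user.byId({ id: 1 });
// user: { id: number; name: string }. Rename `name` on the server and
// any client code reading `user.name` fails to compile.
console.log(user.name);
```

Because `Client<AppRouter>` is computed from the server's own types, there is nothing to regenerate and nothing to drift: the compiler re-derives the contract on every build.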
TanStack Router brings the same philosophy to client-side routing, an area where type safety has historically been an afterthought. Routes are defined in a way that makes the router aware of the expected params, search parameters, and loader data for every route. Wrong params become compile errors rather than runtime surprises.
export const Route = createFileRoute("/users/$userId")({
  loader: ({ params }) => fetchUser(params.userId),
  component: function UserPage() {
    const user = Route.useLoaderData()
    // fully typed as the return value of fetchUser
    return <h1>{user.name}</h1>
  },
})
What the TanStack suite demonstrates collectively (Router, Query, Table, Form) is a coherent vision of what a type-safe frontend looks like end-to-end: data fetching, routing, UI state, and form validation all wired through types.
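The general technique that makes `$userId` typed is TypeScript's template literal types: the param names can be computed from the path string itself. Here is a standalone sketch of that idea (the `ExtractParams` and `defineRoute` names are invented for illustration; this is not TanStack Router's actual implementation):

```typescript
// Recursively pull every `$param` segment out of a path string type.
type ExtractParams<Path extends string> =
  Path extends `${string}$${infer Param}/${infer Rest}`
    ? { [K in Param | keyof ExtractParams<Rest>]: string }
    : Path extends `${string}$${infer Param}`
      ? { [K in Param]: string }
      : {};

type UserRouteParams = ExtractParams<"/users/$userId">;
// { userId: string }
type PostRouteParams = ExtractParams<"/users/$userId/posts/$postId">;
// { userId: string; postId: string }

// A loader declared against a path can only read params that exist in it.
function defineRoute<Path extends string, R>(
  path: Path,
  loader: (params: ExtractParams<Path>) => R,
) {
  return { path, loader };
}

const route = defineRoute("/users/$userId", (params) => params.userId);
// defineRoute("/users/$userId", (p) => p.postId)  // compile error

console.log(route.loader({ userId: "42" }));
```

Because the param type is derived from the path literal, renaming the segment in one place immediately flags every loader and component that still uses the old name.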
The runtimes are catching up

The shift isn’t limited to libraries and frameworks. The JavaScript runtimes themselves have started executing .ts files directly. Deno shipped native TypeScript support at launch in 2020; Bun did the same when it arrived in 2023. Both treat it as a core feature rather than an add-on. Node.js followed: the --experimental-strip-types flag landed in v22.6.0 (August 2024), and by v23.6.0 (January 2025) it was unflagged — node file.ts just works, with no build step required. The approach is deliberate type erasure rather than full compilation: annotations are stripped, enums and namespaces are out of scope, and tsconfig.json is ignored.
At the spec level, the TC39 Type Annotations proposal goes one step further, suggesting that JavaScript engines themselves should accept type annotation syntax and treat it as comments. It’s Stage 1 and has been quiet since 2023, so it’s more of a signal about direction than a near-term deliverable. But the fact that it’s on the table at all reflects how far the consensus has shifted.
The most significant signal yet may be what’s happening to the compiler itself. In March 2025, Anders Hejlsberg announced a native port of the TypeScript compiler written in Go, targeting a 10x reduction in build times and shipping as TypeScript 7.0. A preview is already available as @typescript/native-preview. Rewriting a compiler of this complexity signals that performance is no longer a polish item — it’s core to the value proposition, especially as AI tooling demands tighter latency across ever-larger codebases.
Where this is heading
The pattern across these tools is consistent: define the shape once, let types propagate everywhere. Schema to runtime validation (Zod). Database model to query client (Prisma). Server router to client call (tRPC). Route definition to component props (TanStack). The individual tools are compelling; assembled together, they form something more significant: an end-to-end type graph running from the database to the UI.
This trend will keep going. Frameworks and libraries that treat TypeScript as a first-class citizen from the start will gain adoption; those that add types on top of an untyped core will keep running into the same problems. The distance between “TypeScript works” and “TypeScript is the design” is exactly where developer experience is won or lost.
A note of caution is worth adding here, though. There’s a real risk, as with any trend gaining momentum, of adopting these tools because they’re current rather than because you understand what they’re solving. Wrapping everything in Zod schemas and tRPC routers without a clear sense of why won’t automatically produce a better codebase. The goal is end-to-end confidence: knowing that a change in one part of the code will surface its effects in the other parts before you deploy. That’s what’s actually worth going after.
Conclusion
TypeScript has moved past the “nice to have” phase. With frameworks converging on features and AI assistants becoming a standard part of the development workflow, the quality of a project’s type coverage has become something that actually sets projects apart. Tools like Zod, Prisma, tRPC, and TanStack aren’t just good libraries: they show what TypeScript-first software can look like when done well. If you’re starting something new today, type safety shouldn’t be the last thing you think about. It should be one of the criteria you choose by.