
Five AWS Lambda Anti-Patterns TypeScript Developers Bring From Monoliths

DI containers, monolithic SDKs, god-handlers, top-level secret fetches, and heavy ORMs - what they cost on cold start, and the functional shape that replaces them.

Problem

Teams move from NestJS, Spring, or .NET monoliths to AWS Lambda and bring patterns that work in long-running services. Bundles bloat, cold starts grow, and the platform's economics fight back. Lambda is a function, not a microservice - monolith OO/DI patterns inflate bundles and tank cold starts.

Five habits cause most of the damage: DI containers, non-modular AWS SDK imports, god-handler service classes, synchronous secret fetches at module top, and heavy ORMs. Each section below names the symptom, explains the cost, and gives the functional alternative.

The clearest tension is dependency injection. The NestJS serverless docs present @nestjs/platform-fastify plus serverless-http as a supported pattern, while the AWS Lambda best-practices documentation warns against frameworks that inflate the deployment package and recommends lean handlers. Both sources are authoritative, but the costs are not symmetric: a DI container in a long-running service amortises its construction cost across millions of requests, while in Lambda every cold container pays that cost from scratch.

DI Containers (NestJS, tsyringe, InversifyJS)

What it looks like

A handler decorated with @Injectable, a module that wires repositories, services, and providers, and reflect-metadata imported at the top of every entry point. The handler resolves a controller through the container before the first line of business logic runs.

Why it hurts

reflect-metadata ships to every cold start. Decorator metadata is generated at startup. The container resolves the entire graph eagerly even when the handler only needs one dependency. Bundle size grows by the cost of every provider in the module, not just the one this handler uses. esbuild cannot tree-shake constructors that the container will call.

Functional alternative

Export a plain async function. Construct dependencies lazily inside a module-scoped variable, initialised on first invocation, reused across warm invocations.

ts
// good: lazy singleton, no container
import { S3Client } from "@aws-sdk/client-s3";
import type { S3Event } from "aws-lambda";

let s3: S3Client | undefined;
const getS3 = () => (s3 ??= new S3Client({}));

export const handler = async (event: S3Event) => {
  const client = getS3();
  // use client
};

The ??= operator initialises on first call and reuses the singleton across warm invocations. Tests pass dependencies as function arguments, which is simpler than mocking a container.
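The testing claim can be made concrete with a small sketch. `makeHandler` and the `ObjectStore` interface are hypothetical names, not part of any library: the point is that dependencies arrive as plain function arguments, so a test passes a stub object instead of configuring a container.

```typescript
// Hypothetical shape: the handler is built from its dependencies, so
// production passes a real S3-backed implementation and tests pass a stub.
interface ObjectStore {
  contentLength(bucket: string, key: string): Promise<number>;
}

export const makeHandler =
  (store: ObjectStore) =>
  async (event: { bucket: string; key: string }): Promise<number> =>
    store.contentLength(event.bucket, event.key);

// In a test: a plain object stands in, nothing to mock or rewire.
const stub: ObjectStore = {
  contentLength: async () => 42,
};
export const testHandler = makeHandler(stub);
```

The factory costs one extra line over a bare handler and removes the only argument usually made for the container.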

Non-Modular AWS SDK Imports

What it looks like

import * as AWS from 'aws-sdk' - the v2 monolithic SDK, which entered maintenance mode in September 2024 and reached end of support on September 8, 2025. Current Node.js Lambda runtimes bundle the modular @aws-sdk/* v3 packages instead.

Why it hurts

The v2 SDK is one large bundle covering every AWS service. esbuild can tree-shake some of it, but the public surface and shared internals keep the floor high. v3 is split into per-service packages with middleware imported separately. The difference shows up in both bundle size and INIT duration.

Functional alternative

Import only the client and commands you use. Stay on @aws-sdk/client-* packages. Bundle with esbuild, mark the runtime-bundled SDK as external, and let tree-shaking do the rest.

ts
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

let s3: S3Client | undefined;
const getS3 = () => (s3 ??= new S3Client({}));

export const handler = async (event: { bucket: string; key: string }) => {
  const client = getS3();
  const result = await client.send(
    new GetObjectCommand({ Bucket: event.bucket, Key: event.key }),
  );
  return result.ContentLength;
};

The build command marks the runtime-provided SDK as external so it is not duplicated in the bundle:

bash
esbuild src/handler.ts \
  --bundle \
  --platform=node \
  --target=node24 \
  --external:@aws-sdk/* \
  --minify \
  --outfile=dist/handler.js

Verify with each runtime release that @aws-sdk/* is still bundled in the runtime; AWS publishes the included versions per runtime release.

God-Handler Service Classes

What it looks like

The handler delegates to OrderService, which uses OrderRepository, which depends on Logger, MetricsClient, and a DatabasePool. All five are instantiated up front because that is how the monolith does it. The handler file imports a single class and calls one method on it.

Why it hurts

Every dependency in the graph runs its constructor on cold start. Memory holds them across the function lifetime even when the handler exits early. Tree-shaking cannot eliminate constructors the runtime will call. Worse, when one of those classes pulls in a heavy transitive dependency, the bundle inherits it.

Functional alternative

One handler does one thing. Cross-cutting concerns - logging, tracing, validation - move to middleware via Lambda Powertools or Middy. Shared logic moves to pure functions imported per handler, not classes hung off a graph the container has to walk.

ts
import { Logger } from "@aws-lambda-powertools/logger";
import middy from "@middy/core";
import jsonBodyParser from "@middy/http-json-body-parser";

const logger = new Logger({ serviceName: "orders" });

const baseHandler = async (event: { body: { id: string } }) => {
  logger.info("processing order", { id: event.body.id });
  // exactly one thing
  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};

export const handler = middy(baseHandler).use(jsonBodyParser());

Middy composes middleware as plain functions; Powertools utilities are tree-shakeable when imported by name. Either path keeps the handler small.
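"Middleware as plain functions" is worth seeing without the library. This is a conceptual sketch, not Middy's actual implementation: a middleware is just a function that wraps a handler and returns a handler, so composition is ordinary function application.

```typescript
// Conceptual sketch (not Middy's real internals): middleware wraps a
// handler function and returns a new handler, no classes or container.
type Handler<E, R> = (event: E) => Promise<R>;
type Middleware = <E, R>(next: Handler<E, R>) => Handler<E, R>;

const withTiming: Middleware = (next) => async (event) => {
  const start = Date.now();
  try {
    return await next(event);
  } finally {
    // Runs after the wrapped handler resolves or throws.
    console.log(`handled in ${Date.now() - start} ms`);
  }
};

const base: Handler<{ id: string }, string> = async (e) => `order ${e.id}`;
export const handler = withTiming(base);
```

Everything stays tree-shakeable because nothing is resolved reflectively; the bundler sees plain imports and calls.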

Synchronous Secret/SSM Fetch at Module Top

What it looks like

ts
// bad: blocks every cold start
import { SSMClient, GetParameterCommand } from "@aws-sdk/client-ssm";
import { Pool } from "pg";

const ssm = new SSMClient({});
const secrets = await ssm.send(
  new GetParameterCommand({ Name: "/db/url", WithDecryption: true }),
);
const db = new Pool({ connectionString: secrets.Parameter!.Value });

export const handler = async () => {
  /* ... */
};

Why it hurts

Top-level await runs during the INIT phase, which is part of cold start. Every cold container pays the round-trip to SSM or Secrets Manager before the handler starts. If the network is slow or the parameter store is rate-limited, billed duration grows. INIT failures do not retry the same way handler failures do.

Functional alternative

Lazy-init on first invocation. Cache the result in a module-scoped variable. For hot paths, the AWS Parameters and Secrets Lambda Extension provides in-memory caching with TTL behind a localhost HTTP endpoint.

ts
import { SSMClient, GetParameterCommand } from "@aws-sdk/client-ssm";
import { Pool } from "pg";

let pool: Pool | undefined;

const getPool = async () => {
  if (pool) return pool;
  const ssm = new SSMClient({});
  const result = await ssm.send(
    new GetParameterCommand({ Name: "/db/url", WithDecryption: true }),
  );
  pool = new Pool({ connectionString: result.Parameter!.Value });
  return pool;
};

export const handler = async () => {
  const db = await getPool();
  // use db
};

The first invocation pays the round-trip; warm invocations skip it. With the Parameters and Secrets extension layer attached, the lookup hits a local cache and the round-trip moves out of your handler entirely.
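The extension path is a plain HTTP call, no SDK client at all. A hedged sketch: port 2773 is the extension's documented default (overridable via PARAMETERS_SECRETS_EXTENSION_HTTP_PORT), and the layer must actually be attached to the function for this endpoint to exist.

```typescript
// Read a parameter through the AWS Parameters and Secrets Lambda
// Extension's localhost endpoint instead of calling SSM directly.
const extensionUrl = (name: string, port = 2773): string =>
  `http://localhost:${port}/systemsmanager/parameters/get` +
  `?name=${encodeURIComponent(name)}&withDecryption=true`;

export const getParameter = async (name: string): Promise<string> => {
  const res = await fetch(extensionUrl(name), {
    // The extension authenticates callers via the session token header.
    headers: {
      "X-Aws-Parameters-Secrets-Token": process.env.AWS_SESSION_TOKEN ?? "",
    },
  });
  if (!res.ok) throw new Error(`extension returned ${res.status}`);
  const body = (await res.json()) as { Parameter: { Value: string } };
  return body.Parameter.Value;
};
```

The response shape mirrors the SSM GetParameter response, so swapping between the direct call and the extension is mechanical.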

Heavy ORMs (Prisma, TypeORM, Mongoose)

What it looks like

Prisma client imported at module top. prisma generate baked into the build. The query engine binary lands in the deployment package. TypeORM with metadata reflection has the same shape as a DI container - eager graph build at startup. Mongoose calls mongoose.connect(...) plus mongoose.model('Order', schema) at module load, compiling every schema in the bundle whether the handler uses it or not.

Why it hurts

Prisma's query engine is a separate binary that the client links against. Bundle size grows by megabytes that the runtime's @aws-sdk/* external trick cannot undo. Cold start grows because the engine initialises before the first query. TypeORM's reflection step blocks INIT for the same reason reflect-metadata does in DI containers. Mongoose ships the MongoDB driver and BSON parser bundled with its model registry; schema compilation and plugin registration run during module load on every cold start, and a top-level mongoose.connect() adds the same INIT-phase blocking that anti-pattern four warns about.

Functional alternative

For DynamoDB-shaped work, the v3 SDK's DynamoDBDocumentClient is enough. For SQL, a thin query builder like Kysely ships small and lazy-loads. For MongoDB, the official mongodb driver alone is a few hundred kilobytes; pair it with a small validator like Zod for the schema layer Mongoose otherwise provides, and lazy-init the connection inside a module-scoped variable rather than at module top. Reserve full ORMs and ODMs for long-running services where the engine cost amortises.

ts
import { Kysely, PostgresDialect } from "kysely";
import { Pool } from "pg";

interface Database {
  orders: { id: string; total: number };
}

let kysely: Kysely<Database> | undefined;

const getDb = () => {
  return (kysely ??= new Kysely<Database>({
    dialect: new PostgresDialect({
      pool: new Pool({ connectionString: process.env.DB_URL }),
    }),
  }));
};

export const handler = async (event: { id: string }) => {
  const db = getDb();
  return db
    .selectFrom("orders")
    .selectAll()
    .where("id", "=", event.id)
    .executeTakeFirst();
};

Hand-written SQL or a small builder is acceptable when each function touches one to three tables. When the surface grows beyond that, the design signal is to ask whether this work belongs in Lambda at all.
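The MongoDB path has the same shape. This is a hedged sketch with the driver abstracted away: `OrdersCollection` stands in for the mongodb package's Collection type, `parseOrder` stands in for a Zod schema's `.parse()`, and in real code `connect` would wrap `new MongoClient(process.env.MONGO_URL!).connect()`.

```typescript
// Hedged sketch: thin validation layer plus lazy connection, replacing
// Mongoose's schema compilation and top-level connect().
type Order = { id: string; total: number };

// Stands in for a Zod schema's .parse(): reject documents that do not
// match the Order shape instead of trusting the collection blindly.
const parseOrder = (doc: unknown): Order => {
  const d = doc as Partial<Order> | null;
  if (!d || typeof d.id !== "string" || typeof d.total !== "number") {
    throw new Error("document does not match Order shape");
  }
  return { id: d.id, total: d.total };
};

// Minimal interface over the mongodb driver's collection API.
interface OrdersCollection {
  findOne(filter: { id: string }): Promise<unknown>;
}

// Lazy singleton: first invocation connects, warm invocations reuse it.
let orders: OrdersCollection | undefined;

export const makeHandler =
  (connect: () => Promise<OrdersCollection>) =>
  async (event: { id: string }): Promise<Order | undefined> => {
    orders ??= await connect();
    const doc = await orders.findOne({ id: event.id });
    return doc == null ? undefined : parseOrder(doc);
  };
```

Validation runs per document fetched, not per schema at module load, which is the part of Mongoose's cost that mattered on cold start.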

When the Default Holds and When to Override

Middy or Powertools is an addition to the default, not a replacement for it. A handler with two or three tightly related operations on the same resource can share a small router; eager splitting just moves the cold-start surface elsewhere. The one override worth naming is a small set of CRUD endpoints with shared validation - a lambdalith with Powertools is reasonable there. The default is still the lean function.

How to Measure

Two numbers tell you whether the lean shape is working: bundle size after esbuild, and INIT duration from CloudWatch.

Bundle size shows up in the esbuild output line. Run the build and read the byte count:

bash
pnpm exec esbuild src/handler.ts \
  --bundle \
  --platform=node \
  --target=node24 \
  --external:@aws-sdk/* \
  --minify \
  --outfile=dist/handler.js

Aim under 1 MB for a single-purpose handler. A tree-shaken v3 client plus typical business logic lands well below that.
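Budgets that are only read by humans drift; a few lines in CI hold the line. A hedged sketch - the 1 MB limit and the dist/handler.js path are this article's assumptions, not a standard:

```typescript
// Tiny CI gate on bundle size: fail the build when the artifact
// crosses the budget instead of noticing in CloudWatch later.
import { statSync } from "node:fs";

export const overBudget = (sizeBytes: number, limitBytes = 1_000_000): boolean =>
  sizeBytes > limitBytes;

export const checkBundle = (
  path = "dist/handler.js",
  limitBytes = 1_000_000,
): void => {
  const { size } = statSync(path);
  if (overBudget(size, limitBytes)) {
    throw new Error(`${path} is ${size} bytes, over the ${limitBytes}-byte budget`);
  }
};
```

Run it as a post-build step so a dependency that drags in megabytes fails the pipeline, not the p99.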

INIT duration shows up in the REPORT line that CloudWatch emits at the end of every cold invocation:

REPORT RequestId: 8f5e... Duration: 12.34 ms Billed Duration: 13 ms Memory Size: 512 MB Max Memory Used: 67 MB Init Duration: 187.42 ms

Aim under 200 ms for Node.js. p99 cold start is reported separately in Lambda Insights and X-Ray; track it as a distinct percentile from warm p99. Public studies (Lumigo, Datadog, AWS Compute Blog) consistently show that bundle size and module-top work dominate Node.js cold start, in that order.

Closing

Lambda rewards lean. Treat each handler as a single typed function with lazy dependencies and modular imports. Reach for middleware (Powertools, Middy) when cross-cutting concerns appear; reach for a query builder when SQL becomes unavoidable. Reserve DI containers and heavy ORMs for the long-running services they were designed for.

The boundary worth naming: this advice fits the function-per-route default. A small CRUD lambdalith with Powertools is a reasonable override; a service that needs a full domain model and a graph of repositories is the signal that you wanted ECS or App Runner, not Lambda.
