Lambda execution: Why use now-bridge with local http-server instead of just calling the render function? #231


Closed
adroste opened this issue Nov 1, 2021 · 7 comments
Labels
question Question about usage of the library

Comments

@adroste

adroste commented Nov 1, 2021

First of all, I'm just here because tf-next build enables me to actually use my next-app in a manually configured environment using CloudFormation/SAM. Good job 👍 .

I want to understand why you decided to implement the launcher and bridge (the runtime of the Lambda proxy) in a way that creates a local HTTP server and calls it from inside the code. To me this just adds massive, unnecessary overhead.
The only thing that is needed is the routing logic from the generated now__launcher.js.
The render function can be called directly with mocked req/res objects:

const mod = require('./.next/serverless/pages/index.js');

// Call the page's render function directly with minimal req/res mocks.
mod.render(
    // mocked IncomingMessage
    { headers: {}, url: 'http://127.0.0.1/', method: 'GET' },
    // mocked ServerResponse — just enough surface for Next.js to render
    {
        setHeader: (name, value) => { console.log('next:setHeader', name, value); },
        getHeader: (name) => { console.log('next:getHeader', name); return undefined; },
        hasHeader: (name) => { console.log('next:hasHeader', name); return false; },
        end: console.dir,
    }
);
@ofhouse
Member

ofhouse commented Nov 1, 2021

Hi,
the launcher & bridge are something we simply copied from the Vercel repo back when their deployment process for Next.js was still open source.
Vercel chose this approach for their platform to achieve maximum compatibility with traditional Node.js server environments.

While there are existing mocking solutions for request & response (e.g. node-mocks-http), the objects that Node.js creates for a request are in practice quite complex and difficult to mock.
I would guess that starting a Node.js server and making a request against it is simply good enough in terms of performance compared to using a mocking solution.

Since AWS Lambda freezes the process between invocations, the server is also created and started only on a Lambda cold start.
So every subsequent request to the same Lambda performs like a request to an already-running Node.js server.

The render function that is currently packed into the launcher is also already deprecated as of Next.js 12.
Future deployments to AWS Lambda will use a full Next.js server instead of an abstraction built for serverless environments.

@ofhouse ofhouse added the question Question about usage of the library label Nov 1, 2021
@adroste
Author

adroste commented Nov 1, 2021

"full Next.js server" means that you will use the Next.js custom server api?

It also seems that @dphang from the serverless-next.js repo is working on a simple compat layer for normal lambdas right now:
https://github.com/serverless-nextjs/serverless-next.js/tree/master/packages/libs/lambda (last commit 30 mins ago)

@ofhouse
Member

ofhouse commented Nov 1, 2021

Not exactly the Next.js custom server but the core of it (called next-server), yes.

I have an ongoing PR that implements it: #89
The main part of it is a new launcher that uses the next-server package: https://github.com/milliHQ/terraform-aws-next-js/blob/63d6547c85e5d8e381b4072c63607d22c11ae4fc/packages/runtime/src/server-launcher.ts

Yes, sls-next is currently in the process of moving its Lambdas from Lambda@Edge to regional Lambdas to follow an architecture similar to ours (and Vercel's).
However, it seems they are trying to implement the routing logic in API Gateway, while we use Lambda@Edge for it.

It's kind of crazy that each community Next.js serverless project creates their own builder (this project, sls-next, @netlify, Amplify, Flightcontrol.dev) given that we all deploy to AWS Lambda eventually 😅

@adroste
Author

adroste commented Nov 1, 2021

I have an ongoing PR that implements it: #89

👍 , any roadmap/eta?

It's kind of crazy that each community Next.js serverless project creates their own builder

You're absolutely right. Kind of a bummer that vercel doesn't provide the full build tools for us. Leaves a bittersweet taste given the fact that Next.js is promoted as "open-source". Nevertheless, I also understand that it's their business model to kind of force people to use their hosting platform.

@dphang

dphang commented Nov 1, 2021

It also seems that @dphang from the serverless-next.js repo is working on a simple compat layer for normal lambdas right now: https://github.com/serverless-nextjs/serverless-next.js/tree/master/packages/libs/lambda (last commit 30 mins ago)

Hey all, just chiming in since I was mentioned - definitely an interesting project here that I didn't know about yet.

Yes, that is right, the build/packaging logic is actually mostly done (thanks in large part to Jan Varho for the initial work) but I'm currently working on testing deployment logic using CDK for Terraform and ironing bugs out.

It's kind of crazy that each community Next.js serverless project creates their own builder (this project, sls-next, @netlify, Amplify, Flightcontrol.dev) given that we all deploy to AWS Lambda eventually 😅

Yeah, that is definitely a lot of duplication; I wish we could collaborate more across projects. For serverless-next.js, I guess the original author coupled the logic very tightly to Lambda@Edge to optimize for serverless platforms instead of reusing parts of next-server. I took over and am trying to make the core routing logic more generic so it can support multiple platforms in the future (Lambda@Edge, Lambda, Azure, GCP, etc.) while still optimizing for serverless platforms (cold starts especially).

Unfortunately it's also hard to keep up with Next.js parity since we are emulating the routing/handling logic nearly from scratch (except parts like reusing image optimizer and some build tools) and it's not the cleanest code, especially since I'm the only one working on it now haha. I've been working on cleaning things up and making it easier to update the core logic; it's just been a slow process. I am also thinking of moving to CDK instead of Serverless Components, since keeping up with the latter is hard: it's fragile and less configurable.

In fact https://github.com/serverless-stack/serverless-stack does use our @sls-next/lambda-at-edge builder but with their own CDK constructs...

One thing I've also been focused on is simplifying the architecture and improving perf: one Lambda for routing, getting data from S3, etc. Previously, for Lambda@Edge to handle fallbacks and add headers, it needed an origin response handler, which was CloudFront-specific and made development more complex. These Lambdas actually just wrap generic handlers with a compat layer and platform-specific logic to retrieve files (so we can reuse them for Azure, GCP, etc.). So it can be either:

  • CloudFront -> Lambda@Edge origin request handler -> S3/SQS or SSR rendering
  • CloudFront -> APIGateway V2 -> Regional Lambda -> S3/SQS or SSR rendering
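A compat layer like the one described could be sketched as a small adapter that converts a platform-specific event into a generic request object the shared handler understands. The CloudFront origin-request event shape below follows the documented AWS format; the generic request shape and the function name `fromCloudFrontEvent` are illustrative assumptions, not the actual sls-next code.

```javascript
// Hypothetical compat layer: convert a CloudFront origin-request event
// into a platform-agnostic request object for a generic handler.
function fromCloudFrontEvent(event) {
  const cf = event.Records[0].cf.request;

  // CloudFront represents headers as { name: [{ key, value }, ...] };
  // flatten them into a simple name -> value map.
  const headers = {};
  for (const [name, values] of Object.entries(cf.headers)) {
    headers[name] = values.map((v) => v.value).join(', ');
  }

  return {
    method: cf.method,
    path: cf.uri,
    query: cf.querystring,
    headers,
  };
}
```

A sibling adapter for API Gateway V2 (or Azure, GCP, ...) would produce the same generic shape, so the routing/rendering core stays platform-agnostic.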

You're absolutely right. Kind of a bummer that vercel doesn't provide the full build tools for us. Leaves a bittersweet taste given the fact that Next.js is promoted as "open-source". Nevertheless, I also understand that it's their business model to kind of force people to use their hosting platform.

Yeah, I definitely understand, since their business model is hosting on Vercel. But considering Next.js is open source, I believe we should also have a bunch of open-source self-hosted options (AWS, etc.), hence why I continued maintaining serverless-next.js after the original maintainer left...

@ofhouse
Member

ofhouse commented Nov 2, 2021

👍 , any roadmap/eta?

Plan is to ship it together with ISR mode in the 0.11 milestone by end of November.

Hey all, just chiming in since I was mentioned - definitely an interesting project here that I didn't know about yet.

Always welcome! Nice to hear your thoughts on this topic, thanks for sharing! 👍

I think Vercel has learned some lessons from the Gatsby story: they invested a lot of time (of their then small team) improving the open source Markdown parsing / MDX projects, and then Gatsby came in, put a GraphQL API on top of MDX, and scored their first big VC money from it.
So their open source engagement there ultimately created the first real competitor to Next.js, which gave them a hard time.

However, since they pulled the deployment code from the public repo, they have always worked towards removing special Vercel code from Next.js (which resulted in the deprecation of the serverless target mode in v12).

That said, it's still hard to keep up to date with the latest Next.js features, since most changes to internal Next.js behavior are not public (or are hidden in patch releases), so reverse engineering new features takes a lot of time.

Unfortunately it's also hard to keep up with Next.js parity since we are emulating the routing/handling logic nearly from scratch (except parts like reusing image optimizer and some build tools)

Those might be areas where working together would make sense:

  • For image optimization we developed a small wrapper around Next.js that separates the image optimizer from the rest of Next.js.
    There is already a request (https://github.com/milliHQ/terraform-aws-next-js-image-optimization/issues/58) to make the wrapper available as Express middleware, so we are currently thinking about publishing it as a dedicated npm module with additional adapters (for Lambda, Express, etc.).
  • Our router (called proxy) is also an unopinionated module that simply mimics the routing of Vercel's edge network and could be reused in multiple projects.
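As a rough illustration of what that kind of edge routing does: a route table of `{ src, dest }` entries is scanned top to bottom, and the first regex match wins, with capture groups substituted into the destination. This is a sketch only, loosely modeled on Vercel's routes config format, not the actual proxy module; the route table in the usage example is made up.

```javascript
// First-match routing over Vercel-style route entries: each entry has a
// `src` regex and a `dest`; capture groups from the match are substituted
// into `dest` as $1, $2, ...
function matchRoute(routes, path) {
  for (const route of routes) {
    const match = new RegExp(route.src).exec(path);
    if (match) {
      return route.dest.replace(/\$(\d+)/g, (_, i) => match[Number(i)] || '');
    }
  }
  return null; // no route matched
}

// Hypothetical route table:
const routes = [
  { src: '^/blog/(.*)$', dest: '/__NEXT_PAGE_LAMBDA_0?page=/blog/$1' },
  { src: '^/$', dest: '/index' },
];
```

Because the matcher only depends on the route table and a path string, it can run unchanged in Lambda@Edge, a regional Lambda, or any other host.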

@ofhouse
Member

ofhouse commented Apr 15, 2022

I'm going to close this issue because it has been inactive for 30 days ⏳. This helps to find and focus on the active issues.

@ofhouse ofhouse closed this as completed Apr 15, 2022