Hey @vercel fans (and @rauchg), just noticed that I can now deploy serverless Python functions on Vercel that exceed 50 MB. I could have sworn that as recently as a couple of weeks ago, I got an error when I exceeded 50 MB. Did I miss something? Is this now officially supported? If so, this made my week.
Did a ChatGPT deep research query, and evidently as of mid-March 2025, Vercel officially supports bundles up to the 250 MB (uncompressed) limit and has lifted the 50 MB (compressed) limit. My guess is that they're uploading the bundle to S3 to get the deploy into AWS Lambda, since Lambda requires packages larger than 50 MB zipped to come from S3.
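If you want to sanity-check a function bundle against those two limits yourself before deploying, here's a rough sketch (my own helper, not anything from Vercel's tooling) that measures a directory's uncompressed size and its zipped size:

```python
import os
import tempfile
import zipfile

# AWS Lambda's documented package limits
COMPRESSED_LIMIT = 50 * 1024 * 1024      # 50 MB zipped (direct-upload limit)
UNCOMPRESSED_LIMIT = 250 * 1024 * 1024   # 250 MB unzipped

def bundle_sizes(path):
    """Return (uncompressed, compressed) byte sizes for a bundle directory."""
    uncompressed = 0
    with tempfile.TemporaryDirectory() as tmpdir:
        zip_path = os.path.join(tmpdir, "bundle.zip")
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for root, _, files in os.walk(path):
                for name in files:
                    full = os.path.join(root, name)
                    uncompressed += os.path.getsize(full)
                    # store files relative to the bundle root
                    zf.write(full, os.path.relpath(full, path))
        compressed = os.path.getsize(zip_path)
    return uncompressed, compressed

def check_bundle(path):
    u, c = bundle_sizes(path)
    print(f"uncompressed: {u / 2**20:.1f} MB (limit 250 MB)")
    print(f"compressed:   {c / 2**20:.1f} MB (old limit 50 MB)")
    return u <= UNCOMPRESSED_LIMIT
```

A bundle that fails the 50 MB compressed check but passes the 250 MB uncompressed one is exactly the case that used to error out and now seems to deploy fine.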