openwhisk-dev mailing list archives

From James Thomas <jthomas...@gmail.com>
Subject Re: Supporting user-configurable warm action containers?
Date Thu, 31 May 2018 14:29:24 GMT
From speaking to external developers about this, people seem happy to pay
for this feature.

On 31 May 2018 at 12:34, Nick Mitchell <moosevan@gmail.com> wrote:

> for nodejs at least: the cost of a few requires of common packages can
> easily get you up to the 150-200ms range (e.g. request is a big hitter; and
> this is all on top of the cost of starting a container!). perhaps, for
> nodejs at least, there are only a few options, ultimately: user pays more
> for idle resources; provider pays more for idle stem cells; or users take a
> very hard line on the modules they import.
>
> switching to other (compiled) runtimes might help, e.g. with the recent
> work on precompiled go and swift actions? we'd still be left with the
> container start times, but at least this is something we can control, e.g.
> by requiring users to pay more for access to a larger prewarmed pool?
>
> nick
>
>
> On Thu, May 31, 2018 at 7:22 AM, James Thomas <jthomas.uk@gmail.com>
> wrote:
>
> > One of the most frequent complaints[1][2][3] I hear from developers using
> > serverless platforms is coping with cold-start latency when dealing with
> > sudden bursts of traffic.
> >
> > Developers often ask for a feature where they can set the number of warm
> > containers kept in the cache for a function. This would allow them to keep
> > a higher number of warm containers for applications with bursty traffic
> > and/or increase the cached number prior to an anticipated burst of traffic
> > arriving. Managed platforms could expose this as a chargeable feature.
> >
> > Is this something we could support on OpenWhisk? Ignoring the complexity
> > and feasibility of any solution, from a developer POV I can imagine having
> > an action annotation `max-warm` which would set the maximum number of warm
> > containers allowed in the cache.
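As a concrete sketch of how this proposal might look to a developer: the `wsk` CLI already supports attaching arbitrary annotations to actions, so a hypothetical `max-warm` annotation (the name is the proposal above, not an existing OpenWhisk feature) could be set like any other:

```shell
# Hypothetical usage: 'max-warm' is the proposed annotation, not a real
# OpenWhisk feature today; --annotation itself is the standard wsk flag
# for attaching key/value metadata to an action.
wsk action update myAction --annotation max-warm 5
```

The platform's container pool would then read this annotation when deciding how many warm containers to retain for the action.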
> >
> > Tyson is currently working on concurrent activation processing, which is
> > one approach to reducing cold-start delays[4]. However, there are some
> > downsides to concurrent activations, like no runtime isolation for request
> > processing, which might make this feature inappropriate for some users.
> >
> > [1] https://www.reddit.com/r/aws/comments/6w1hip/how_many_successive_lambda_invocations_will_use_a/
> > [2] https://twitter.com/search?f=tweets&vertical=default&q=%20%23AWSWishlist%20warm&src=typd
> > [3] https://theburningmonk.com/2018/01/im-afraid-youre-thinking-about-aws-lambda-cold-starts-all-wrong/
> > [4] https://github.com/apache/incubator-openwhisk/pull/2795
> >
> > --
> > Regards,
> > James Thomas
> >
>



-- 
Regards,
James Thomas
