For every request, IronFunctions would spin up a new container to handle the job, which, depending on the container and the task, could add a few hundred milliseconds of overhead.

So why not reuse containers when possible? Well, that's exactly what Hot Functions do.

Hot Functions improve IronFunctions' throughput by 8x (depending on the duration of the task).

Hot Functions reside in long-lived containers, each dedicated to one type of task. Incoming workloads are fed to the container's standard input, and results are read from its standard output. In addition, permanent network connections are reused.

Here is what a hot function looks like. Currently, IronFunctions implements an HTTP-like protocol to operate hot containers, but instead of communicating over a TCP/IP port, it uses standard input/output.

So to test this baby, we deployed it on 1 GB Digital Ocean instances (which is not much) and used Honeycomb to track and plot the performance.


Simple function printing "Hello World" called for 10s (MAX CONCURRENCY = 1).

Hot Functions have 162x higher throughput.


Complex function pulling an image and computing its md5 checksum, called for 10s (MAX CONCURRENCY = 1).
Hot Functions have 1.39x higher throughput.


By combining Hot Functions with concurrency we saw even better results:

Complex function pulling an image and computing its md5 checksum, called for 10s (MAX CONCURRENCY = 7).
Hot Functions have 7.84x higher throughput.


So there you have it, pure awesomeness by the Iron.io team in the making.

Also, a big thank you to the good people at Honeycomb for their awesome product, which allowed us to benchmark and plot (all the screenshots in this article are from Honeycomb). It's a great, fast new tool for debugging complex systems, combining the speed and simplicity of time-series metrics with the raw accuracy and context of log aggregators.

Since it supports answering arbitrary, ad-hoc questions about those systems in real time, it was an awesome, flexible, and powerful way for us to test IronFunctions!