
Announcing Zilla and our $4.1M Seed round!

We've launched our open source event-driven API Gateway and closed a seed round! Check out our raison d'être.
Leonid Lukyanov
Team Aklivity

I/O. No matter how sophisticated or complex an app, program, or service may be, at the end of the day it does only one of two things (sometimes both): take data as input or produce data as output. The speed and reliability with which it does this ultimately define Digital success.

To enable the seamless delivery of immersive, responsive and scalable experiences at the edge, I’m excited to announce our $4.1M seed round and the launch of our open source, event-driven API gateway, Zilla. Although Aklivity is less than a year old and Zilla has just launched, today’s milestones build on more than a decade of work that my co-founder John and I have put into the real-time networking space.

At Aklivity, we’re grateful for the belief in us and our vision, and are thrilled to have Flo Leibert and Philipp Seifert from 468 Capital, Arash Afrakhteh from Pear VC, Catherine Lu from Alumni Ventures and industry experts including Gus Robertson (NGINX ex-CEO) and Mitch Wainer (Digital Ocean co-founder) in our corner for the journey ahead!

While there’s so much I’d like to tell you about where we’re going, I thought I’d first take a few steps back and share where we’re coming from.

Our raison d'être

In the current age of Digital Transformation, consumer and business expectations are pushing speed and reliability requirements like never before. When a banking app is opened and a deposit is made — the balance is expected to be updated immediately. When a shipping carrier’s truck leaves a dock — the receiver needs to be notified the moment the wheels start turning. When a malicious agent tries to access a system — access must be denied before any inputs are even made. The list of responsive and immersive use cases goes on.  

Fundamentally, behind the exponentially growing scale and urgency of modern initiatives is a seemingly obvious yet profound fact: software, either directly or indirectly, interfaces with the physical, real world. And the real world operates in real time. But here’s the caveat: traditional software architectures (REST-based) and the network protocol (HTTP) that ties them together were designed around batched data and a synchronous communication pattern.

These drawbacks have long been recognized, and approaches to add “real-timeliness” to deployments in the form of messaging systems and streaming protocols aren’t new; JMS 1.0 was released in 1998 and the concept of an “event-driven architecture (EDA)” was coined at around the same time. However, due to the complexity of creating and managing real-time services, and the “slower” demands of Web 1.0 and early Web 2.0, working with streaming data was either reserved for avant-garde tech companies located within a driving radius of the 94043 Zip Code, or siloed away as part of a larger RESTful deployment.

Today though, the ability to process data in near real-time is increasingly regarded not as a technical achievement, but as a necessity across industries. With this, event streaming is forming the core of modern enterprise backends.

While event streaming has unlocked real-time data processing inside the data center, it has also created new challenges beyond it, especially when it comes to externally facing APIs.

Unlike RESTful deployments where all services, whether frontend or backend, rely on a single type of API, inside EDAs there is a proliferation of APIs and protocols: SSE, MQTT, AMQP, gRPC, Kafka, WebSocket, etc. Not only are these APIs largely incompatible with existing API management tooling and security practices, which matured around REST APIs and batched data, but actually creating and deploying them is a cumbersome and disjointed effort.

Inside event-driven architectures, services are decoupled, and instead of directly communicating with each other, they must do so through a standalone streaming data layer (most often Apache Kafka). This data layer has its own API, which is not designed for the web. Consequently, for external clients—mobile phones, browsers, partner systems, IoT devices—to reach this layer, a stack of intermediary brokers, web servers and integration frameworks is needed. These middlemen end up disjoining the data path to some of the most valuable consumers and producers of data, which run on the edge.
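
To make the pain concrete, here’s a minimal sketch of what one of those middlemen typically looks like: a small Node.js bridge, written here with the kafkajs and ws libraries, that consumes a Kafka topic and fans its records out to WebSocket clients. The topic, broker, and port names are purely illustrative, and in practice every protocol and every team tends to end up maintaining its own variation of this glue.

```typescript
// Illustrative only: one of the "middlemen" described above — a small Node.js
// bridge that consumes a Kafka topic and relays records to WebSocket clients.
// Topic, broker, and port names are hypothetical.
import { Kafka } from "kafkajs";
import { WebSocketServer, WebSocket } from "ws";

const kafka = new Kafka({ clientId: "edge-bridge", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "edge-bridge-group" });

// WebSocket endpoint that browsers, devices, or partner systems connect to.
const wss = new WebSocketServer({ port: 8080 });

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: false });

  await consumer.run({
    // Relay every Kafka record to every connected client.
    eachMessage: async ({ message }) => {
      const payload = message.value?.toString() ?? "";
      for (const client of wss.clients) {
        if (client.readyState === WebSocket.OPEN) {
          client.send(payload);
        }
      }
    },
  });
}

run().catch((err) => {
  console.error("bridge failed", err);
  process.exit(1);
});
```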

We founded Aklivity to help unify event-driven architectures and extend them to the outside world. Our goal is to enable teams to deploy native APIs directly on top of event-streams and make developing streaming-based apps and services easy and familiar. To achieve this, we’ve built an API gateway designed from the ground up for streaming, and have called it Zilla!

Zilla natively supports a wide range of networking and messaging protocols, including Kafka. It has a deep understanding of protocol structures and semantics, enabling it to translate between various protocol combinations. Zilla can directly expose Kafka event-streams over protocols and APIs such as SSE, HTTP, MQTT, AMQP, gRPC and WebSocket. But Zilla is more than just a protocol translator: it extends key API management capabilities of existing REST gateways, including authentication, monitoring and cataloging, to streaming APIs.
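
For example, once a Kafka topic is mapped onto an SSE endpoint by a gateway like Zilla, a browser needs nothing beyond the standard EventSource API; there is no intermediary bridge to write or operate. The sketch below is illustrative: the endpoint URL is hypothetical, and the actual path depends on how the gateway is configured.

```typescript
// Illustrative only: consuming a Kafka topic exposed as a Server-Sent Events
// stream by a gateway such as Zilla. The URL is hypothetical — the real path
// depends on your gateway configuration.
const events = new EventSource("https://api.example.com/streams/orders");

events.onmessage = (event: MessageEvent<string>) => {
  // Each SSE message carries one event from the underlying stream.
  const order = JSON.parse(event.data);
  console.log("new order event:", order);
};

events.onerror = () => {
  // EventSource reconnects automatically; log for visibility.
  console.warn("stream interrupted, retrying...");
};
```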

What’s next

Our new funding will go towards growing the team across a number of different roles to drive innovation and expand Zilla’s open source community. If you’re intrigued by our mission and would like to help shape the future of API connectivity, we’d love to meet you!

If you’re a developer, check out Zilla on GitHub and follow our Getting Started guide to build a stream-based Todo app powered by Zilla and Kafka streams.
