
Serverless

Which Cloud Database Platform to Choose for Your Applications

If your application needs persistent data, you are going to need a database. That is easier said than done, because managing a database is a challenge: not only do you need to set up, maintain, scale, and patch it, but you also need to define strong backup policies and handle sharding and replication. Long story short, managing a database is time-consuming and requires a dedicated, skilled team, which is why many people opt for managed databases, sometimes referred to as DBaaS or Database as a Service.

Making Flink Serverless, With Queries for Less Than a Penny

Imagine easily enriching data streams and building stream processing applications in the cloud, without worrying about capacity planning, infrastructure and runtime upgrades, or performance monitoring. That's where our serverless Apache Flink® service comes in, as announced at this year’s Current | The Next Generation of Kafka Summit.
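To make stream enrichment a bit more concrete, here is a minimal sketch using the open-source PyFlink Table API; the table names, schemas, and connector settings below are hypothetical and not taken from the announcement, and a serverless Flink service would let you run the same kind of SQL without provisioning any of this yourself.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# Local streaming TableEnvironment; a serverless offering manages this for you.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical source table of raw click events read from Kafka.
t_env.execute_sql("""
    CREATE TABLE clicks (
        user_id STRING,
        url     STRING,
        ts      TIMESTAMP(3)
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'clicks',
        'properties.bootstrap.servers' = 'localhost:9092',
        'format' = 'json',
        'scan.startup.mode' = 'earliest-offset'
    )
""")

# Hypothetical lookup table of user profiles used to enrich the stream.
t_env.execute_sql("""
    CREATE TABLE users (
        user_id STRING,
        country STRING
    ) WITH (
        'connector' = 'jdbc',
        'url' = 'jdbc:postgresql://localhost:5432/app',
        'table-name' = 'users'
    )
""")

# Enrich each click with the user's country and print the results.
t_env.execute_sql("""
    SELECT c.user_id, c.url, u.country
    FROM clicks AS c
    LEFT JOIN users AS u ON c.user_id = u.user_id
""").print()
```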

Serverless Postgres Public Preview

We're excited to announce the Koyeb Serverless Postgres public preview - a fully managed, fault-tolerant, and scalable serverless Postgres Database Service. What do all modern applications have in common? They all have APIs, workers, and databases. Deploying APIs and workers with Koyeb has long been possible. Starting today, you can spin up databases too! Using Koyeb Serverless Postgres, you can easily start a resilient Database Service alongside your apps in a few seconds.
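Because the service exposes standard Postgres, existing clients and drivers should work unchanged. As a minimal sketch, assuming the connection string lives in a DATABASE_URL environment variable (an illustrative name, not part of the announcement), an app could talk to the database with psycopg2 like this:

```python
import os
import psycopg2

# Hypothetical env var holding the connection string issued by the managed service.
conn = psycopg2.connect(os.environ["DATABASE_URL"])

with conn, conn.cursor() as cur:
    # Create a tiny table and record a visit; `with conn` commits on success.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS visits (
            id   serial PRIMARY KEY,
            path text NOT NULL,
            at   timestamptz NOT NULL DEFAULT now()
        )
    """)
    cur.execute("INSERT INTO visits (path) VALUES (%s)", ("/",))
    cur.execute("SELECT count(*) FROM visits")
    print("total visits:", cur.fetchone()[0])

conn.close()
```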

We raised $7M to Simplify App Deployment with our Global Serverless Platform

We are thrilled to share that we’ve raised $7M in seed funding! At Koyeb, we simplify app deployment with our global serverless platform. We provide an easy way to deploy full-stack applications and databases in production, everywhere, in minutes. We’re focused on allowing developers and businesses to seamlessly build, run, and scale any application globally, with no code rewrite or infrastructure management.

Sustaining free compute in a hostile environment

One year ago, Heroku sunset its free tier. Today, we want to reaffirm our commitment to maintaining our free tier, dive into why offering free compute is complicated (we are looking at you, crypto miners), and take the time to explain how we intend to sustain it and why we are so committed to providing it. Long story short: we aim to keep the free tier thanks to how we control our costs.

Building a global deployment platform is hard, here is why

If you have ever tried to go global, you have probably faced a reality check: a whole new set of issues appears once you operate a workload across multiple locations around the globe. Going global looks like a great idea in theory, but in practice all of this complexity multiplies the number of failure scenarios to consider!

API Gateway and Service Mesh: Bridging the Gap Between API Management and Zero-Trust Architecture

Discover how API management and service mesh can go hand in hand toward secure platforms. Over the last ten years, Kongers have witnessed hundreds of companies adopting a full lifecycle API management platform and have been working with the people behind the scenes, the “API tribes.” We’ve also learned from the field that API tribes most often have to deal with heterogeneous platforms, infrastructures, and clouds.

The Global Deployment Engine: How We Deploy Across Continents

We previously explored how we built our own Serverless Engine and a multi-region networking layer based on Nomad, Firecracker, and Kuma. But what about the architecture of the engine that orchestrates these components across the world? This is an interesting topic, and we thought it would be useful to share some of the internals. Put on your scuba gear: this is a deep dive into our architecture and the story of how we built our own global deployment engine.