
Open Source

Show and Tell: Discover the Benefits SmartBear's Open-Source Tools Bring to the SDLC

Open-source tools can be incredibly helpful in the software development lifecycle (SDLC). They offer flexibility, a wide range of features, and often community support too. Watch this session to explore SmartBear’s industry-leading open-source tools (Swagger, SoapUI, Pact, and more) and learn how to streamline your development process to create a more tailored developer experience.

Announcing Standard Webhooks

We're pleased to announce the launch of Standard Webhooks! Kong has been part of the Technical Committee for this standard along with other great companies like Svix (the initiator of the project), Ngrok, Zapier, Twilio, Lob, Mux, and Supabase. It was a year-long effort of gathering feedback and use cases, and debating what to define and how. Standard Webhooks is an initiative to standardize the contract between producers and consumers of webhooks.
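At the heart of that producer/consumer contract is signature verification. Below is a minimal sketch of HMAC-based verification in the style the Standard Webhooks spec describes; the exact header names (`webhook-id`, `webhook-timestamp`, `webhook-signature`), the `v1,` signature prefix, and the base64-encoded secret are assumptions drawn from the public spec, so check the published standard before relying on them.

```python
import base64
import hashlib
import hmac


def verify_webhook(secret_b64: str, msg_id: str, timestamp: str,
                   payload: str, signature_header: str) -> bool:
    """Sketch of Standard Webhooks-style signature verification.

    Assumed scheme: the signed content is "{id}.{timestamp}.{payload}",
    signed with HMAC-SHA256 under a base64-encoded secret, and the
    signature header carries one or more "v1,<base64>" entries.
    """
    key = base64.b64decode(secret_b64)
    signed_content = f"{msg_id}.{timestamp}.{payload}".encode()
    expected = base64.b64encode(
        hmac.new(key, signed_content, hashlib.sha256).digest()
    )
    # The header may list several space-separated signatures (e.g. during
    # secret rotation); accept the payload if any of them matches.
    for part in signature_header.split():
        version, _, sig = part.partition(",")
        if version == "v1" and hmac.compare_digest(sig.encode(), expected):
            return True
    return False
```

A consumer would also check that the timestamp is recent before accepting the message, to guard against replay attacks.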

Hacktoberfest #1 - Recognize, Celebrate, and Connect Open-Source Developers

If you’ve ever wanted to contribute to open source, now is your chance! You can contribute to several Keploy projects participating in this year’s Hacktoberfest. Anyone around the globe who wants to help drive the growth of open source and make positive contributions to an ever-growing community can take part; all backgrounds and skill levels are encouraged to participate. Vaunt builds vibrant open-source communities by rewarding developers for contributions, celebrating achievements, and encouraging connections.

OpenLogic by Perforce, In Collaboration With the Eclipse Foundation and Open Source Initiative, Launches the Next State of Open Source Survey

Perforce Software has launched its latest survey on the usage of open source software. This survey marks the first year of collaboration with the Eclipse Foundation, in addition to the Open Source Initiative (OSI), which is participating in the survey for a third year.

Introducing Confluent Cloud for Apache Flink

In the first three parts of our Inside Flink blog series, we discussed the benefits of stream processing, explored why developers are choosing Apache Flink® for a variety of stream processing use cases, and took a deep dive into Flink's SQL API. In this post, we'll focus on how we’ve re-architected Flink as a cloud-native service on Confluent Cloud. However, before we get into the specifics, there is exciting news to share.

Mission-critical data flows with the open-source Lenses Kafka Connector for Amazon S3

An effective data platform thrives on solid data integration, and for Kafka, S3 data flows are paramount. Data engineers often grapple with diverse data requests related to S3. Enter Lenses. By partnering with major enterprises, we've levelled up our S3 connector, making it the market's leading choice. We've also incorporated it into our Lenses 5.3 release, boosting Kafka topic backup/restore.

Apache Kafka Message Compression

Apache Kafka® supports incredibly high throughput. It’s been known for feats like supporting 20 million orders per hour to get COVID tests out to US citizens during the pandemic. Kafka's approach to partitioning topics helps achieve this level of scalability. Topic partitions are the main "unit of parallelism" in Kafka. What’s a unit of parallelism? It’s like having multiple cashiers in the same store instead of one.
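Compression is one of the tricks behind that throughput: Kafka producers compress whole record batches rather than individual messages, so the repeated field names and values across similar records in a partition compress far better than any single message could. The stdlib sketch below illustrates the effect with gzip (one of Kafka's supported codecs, alongside snappy, lz4, and zstd); the JSON records are invented for illustration.

```python
import gzip
import json

# Simulate a batch of similar JSON records, like messages accumulating
# in one topic partition before the producer sends them.
records = [{"order_id": i, "status": "confirmed", "region": "us-east-1"}
           for i in range(1000)]

raw = sum(len(json.dumps(r).encode()) for r in records)

# Compressing each record on its own barely helps: gzip has almost no
# redundancy to exploit within a single small message, plus header overhead.
individual = sum(len(gzip.compress(json.dumps(r).encode())) for r in records)

# Compressing the whole batch (as a Kafka producer does per record batch)
# lets every repeated key and value share one compression dictionary.
batch = len(gzip.compress("\n".join(json.dumps(r) for r in records).encode()))

print(f"raw={raw}B per-message={individual}B whole-batch={batch}B")
```

This is why producer settings like `compression.type` pay off most when `batch.size` and `linger.ms` allow reasonably full batches to form.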

Dataflow Programming with Apache Flink and Apache Kafka

Recently, I got my hands dirty working with Apache Flink®. The experience was a little overwhelming. I have spent years working with streaming technologies but Flink was new to me and the resources online were rarely what I needed. Thankfully, I had access to some of the best Flink experts in the business to provide me with first-class advice, but not everyone has access to an expert when they need one.