
Snowflake Announces Intent To Acquire Streamlit

Hear from Snowflake co-founder and President of Products Benoit Dageville and Adrien Treuille, co-founder and CEO of Streamlit, as they discuss what this strategic acquisition means for developers and data scientists. The two companies will join forces to unlock the unrealized potential of data by making it easier to build beautiful applications with the tools developers love, backed by simplified data access and governance.

4 Types and 20 Examples of Survey Questions to Ask Your Customers and Improve Your App

You have dedicated countless hours to building your product and strengthening your service. You want to ensure that your application is useful and, above all, that it keeps users coming back for more. But then, *gasp*, they don’t! Why? What went wrong? How can you win them back?

What is data integration? (with 5 use cases)

Data integration is the data engineering process of combining data from disparate sources into a single, unified view. The process begins with ingesting data from different source systems: extracting data from each source, transforming or cleaning it, and loading it into a single repository, whether the sources are Excel spreadsheets or enterprise data stores.
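The extract, transform, and load steps described above can be sketched as a minimal pipeline. The CSV source, field names, and SQLite target here are hypothetical stand-ins for whatever source systems and repository an organization actually uses.

```python
import csv
import io
import sqlite3

def extract(csv_file):
    """Read raw rows from one source system (here, a CSV stream)."""
    return list(csv.DictReader(csv_file))

def transform(rows):
    """Clean and normalize: drop rows missing an id, tidy up names."""
    return [
        {"id": int(r["id"]), "name": r["name"].strip().title()}
        for r in rows
        if r.get("id")
    ]

def load(rows, connection):
    """Load cleaned rows into the unified repository (SQLite here)."""
    connection.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)"
    )
    connection.executemany(
        "INSERT OR REPLACE INTO customers VALUES (:id, :name)", rows
    )
    connection.commit()

# Wire the three steps together against an in-memory store:
source = io.StringIO("id,name\n1, alice \n,ghost\n2,BOB\n")
con = sqlite3.connect(":memory:")
load(transform(extract(source)), con)
```

Real pipelines add incremental loading, schema validation, and error handling, but the extract → transform → load shape stays the same.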

Announcing the Unravel Winter Release

Today, we’re excited to announce the Unravel Winter Release! This release introduces major enhancements and improvements across the platform, including comprehensive cost management for Databricks, support for Delta Lake on Databricks, data observability for Google BigQuery, and an interactive pre-check before installation and upgrade.

What is AUR and How Can It Help You Power Your E-Commerce Business?

What is AUR? E-commerce stores such as Amazon have revolutionized the field of online shopping. But whether you run a massive online store or a small business, you need to keep track of your company's metrics and KPIs (key performance indicators) to evaluate the success of your business model.
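AUR, average unit retail, is one such KPI: total revenue for a period divided by the number of units sold in that period. A minimal sketch, with illustrative figures not taken from the article:

```python
def average_unit_retail(total_revenue: float, units_sold: int) -> float:
    """AUR = total revenue / number of units sold."""
    if units_sold <= 0:
        raise ValueError("units_sold must be positive")
    return total_revenue / units_sold

# e.g. $12,500 in revenue across 500 units gives an AUR of $25.00
print(average_unit_retail(12_500, 500))  # 25.0
```

A rising AUR can mean customers are buying higher-priced items or discounts are shrinking; a falling AUR often signals heavier markdowns.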

Producing Protobuf data to Kafka

Until recently, teams built only a small handful of Kafka streaming applications. These were usually associated with big-data workloads (analytics, data science, etc.), and data was typically serialized as Avro or JSON. Now a wider set of engineering teams are building entire software products from microservices decoupled through Kafka, and many have adopted Google Protobuf as their serialization format, partly due to its use in gRPC.
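Producing Protobuf to Kafka boils down to serializing the message to its binary wire format and handing the bytes to the producer. A minimal sketch, assuming a protobuf-generated message class and a client such as confluent-kafka (both hypothetical here, so the function only relies on duck typing: any message with `SerializeToString()` and any producer with `produce()`):

```python
def produce_protobuf(producer, topic: str, key: bytes, message) -> None:
    """Serialize a protobuf message and send it to a Kafka topic."""
    payload = message.SerializeToString()  # protobuf binary wire format
    producer.produce(topic, key=key, value=payload)

# With confluent-kafka, the producer would be created and flushed like:
#   from confluent_kafka import Producer
#   producer = Producer({"bootstrap.servers": "localhost:9092"})
#   produce_protobuf(producer, "orders", b"order-1", order_event)
#   producer.flush()
```

In practice teams also register the `.proto` schema with a schema registry so consumers can decode the bytes, but the produce path itself stays this simple.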