Latest Posts

Announcing Unravel 4.8.1: Maximize business value with Google Cloud BigQuery Editions pricing

Google recently introduced significant changes to its BigQuery pricing models, affecting both compute and storage. The company announced the end of sale of flat-rate and flex slots for all BigQuery customers not currently in a contract, along with a 25% increase in the price of on-demand analysis across all regions, effective July 5, 2023.
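As a rough illustration of what a 25% on-demand increase means in practice, the sketch below compares monthly scan costs before and after the change. The $5.00-per-TiB base rate is an assumption (approximately the published US multi-region list price at the time), not a figure from this post; only the 25% increase comes from the announcement above.

```python
# Back-of-the-envelope estimate of the BigQuery on-demand price increase.
# Assumption: ~$5.00/TiB pre-increase rate; the 25% increase is from the announcement.

OLD_RATE_PER_TIB = 5.00                       # assumed pre-July-2023 rate, USD per TiB scanned
NEW_RATE_PER_TIB = OLD_RATE_PER_TIB * 1.25    # 25% increase -> 6.25 USD per TiB

def monthly_on_demand_cost(tib_scanned: float, rate_per_tib: float) -> float:
    """Return the monthly on-demand analysis cost for a given scan volume."""
    return tib_scanned * rate_per_tib

if __name__ == "__main__":
    scanned = 500.0  # hypothetical workload: 500 TiB scanned per month
    before = monthly_on_demand_cost(scanned, OLD_RATE_PER_TIB)
    after = monthly_on_demand_cost(scanned, NEW_RATE_PER_TIB)
    print(f"Before: ${before:,.2f}/month  After: ${after:,.2f}/month  "
          f"Increase: ${after - before:,.2f}/month")
```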

Harnessing Google Cloud BigQuery for Speed and Scale: Data Observability, FinOps, and Beyond

Data is a powerful force with immense potential to generate business value for organizations across industries. Leveraging data and analytics has become a critical factor in successful digital transformation, accelerating revenue growth and AI innovation.

Unlocking Cost Optimization: Insights from FinOps Camp Episode #1

With the dramatic increase in the volume, velocity, and variety of data analytics projects, understanding costs and optimizing expenditure are crucial for success. Data teams often struggle to manage costs effectively, attribute them accurately, and find ways to improve cost efficiency.

Healthcare leader uses AI insights to boost data pipeline efficiency

One of the largest health insurance providers in the United States uses Unravel to ensure that its business-critical data applications are optimized for performance, reliability, and cost in its development environment—before they go live in production. Data and data-driven statistical analysis have always been at the core of health insurance.

AI-Driven Observability for Snowflake

Performance. Reliability. Cost-effectiveness. Unravel is a data observability platform that provides cost intelligence, warehouse optimization, query optimization, and automated alerting and actions for high-volume users of the Snowflake Data Cloud. Unravel leverages AI and automation to deliver real-time, user-level and query-level cost reporting, code-level optimization recommendations, and automated spend controls to empower and unify DataOps and FinOps teams.

Logistics giant optimizes cloud data costs up front at speed & scale

One of the world’s largest logistics companies leverages automation and AI to give every data engineer self-service capabilities to optimize their jobs for performance and cost. The company cut its cloud data costs by 70% in six months, and keeps them down with automated 360° cost visibility, prescriptive guidance, and guardrails for its 3,000 data engineers across the globe.