Generative AI and large language models (LLMs) are transforming the productivity of developers and non-coders alike by automating repetitive tasks and quickly surfacing insights from large amounts of data. Snowflake users are already taking advantage of LLMs, building apps that integrate with web-hosted LLM APIs through external functions and using Streamlit as an interactive front end for LLM-powered apps such as AI plagiarism detection, an AI assistant, and MathGPT.
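To illustrate the external-function pattern, here is a minimal sketch of the proxy side: Snowflake sends an external function a JSON body of row-indexed arguments and expects results back in the same row-indexed shape. The `call_llm` callable stands in for a real web-hosted LLM API call and is purely hypothetical; the JSON request/response framing is the part that matters.

```python
import json

def handle_external_function(request_body: str, call_llm) -> str:
    """Parse a Snowflake external-function request body and return a
    response in the matching row-indexed JSON format.

    Snowflake sends {"data": [[row_index, arg1, ...], ...]} and expects
    {"data": [[row_index, result], ...]} back. `call_llm` is a stand-in
    for a call to a web-hosted LLM API.
    """
    rows = json.loads(request_body)["data"]
    # One result per input row, keyed by the original row index.
    out = [[idx, call_llm(prompt)] for idx, prompt, *_ in rows]
    return json.dumps({"data": out})

# Stubbed "LLM" so the sketch runs without a network call.
fake_llm = lambda prompt: f"echo: {prompt}"

resp = handle_external_function(
    '{"data": [[0, "hello"], [1, "world"]]}', fake_llm
)
```

In a real deployment, the handler would run behind an API gateway and `call_llm` would issue an authenticated HTTP request to the hosted model; the row-index bookkeeping is what lets Snowflake join results back to the calling query.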
Welcome to the third blog post in our series highlighting Snowflake’s data ingestion capabilities, covering the latest on Snowpipe Streaming (currently in public preview) and how streaming ingestion can accelerate data engineering on Snowflake.
Manufacturers today are implementing a range of new technologies to increase operational efficiency and create visibility and flexibility across value chains. According to Deloitte, these include robotics, automation, data analytics, IoT, and artificial intelligence (AI) and machine learning (ML). Company leaders hope these innovations will help them create more productive and resilient supply chains, improve production quality and efficiency, and mitigate risks.
As announced at Snowflake Summit 2022, Iceberg Tables combine unique Snowflake capabilities with the Apache Iceberg and Apache Parquet open source projects to support your architecture of choice. As part of the latest Iceberg release, we’ve added catalog support to the Iceberg project to ensure that engines outside of Snowflake can interoperate with Iceberg Tables.
It’s been a decade since “connected” objects, commonly referred to as “the internet of things” (IoT), reached broad audiences. Connected toothbrushes, sensors embedded in sneakers, and smart watches have started to change consumer behavior through a data-driven, gamified approach. Technology has rapidly evolved to handle large data volumes at high velocity and to support big data analytics, and AI has become more democratized.