
AI

AI and RAG with Gemma, Ollama, and Logi Symphony

Local LLMs are becoming mainstream, with sites like HuggingFace promoting open sharing of trained models. These models are often very small yet still accurate, especially for domain-specific tasks in medicine, finance, law, and other fields. Gemma is a general-purpose LLM that, while small, is competitive and accurate. Local LLMs also have the advantage of running entirely inside your own environment.
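
As a rough illustration of that last point, the sketch below sends a prompt to a Gemma model served locally by Ollama over its REST API, so no data leaves the machine. It assumes Ollama is running on its default port and that the model has already been pulled (for example with `ollama pull gemma`); the exact model tag may differ in your installation, and the Logi Symphony side of the integration is not shown here.

```python
import requests

# Ollama serves a local REST API on port 11434 by default.
# Assumes the Gemma model has already been pulled, e.g. `ollama pull gemma`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_gemma(prompt: str, model: str = "gemma") -> str:
    """Send a single prompt to a locally running Gemma model via Ollama."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return the full completion in one JSON response
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_gemma("In one sentence, what is retrieval-augmented generation?"))
```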

Navigating AI-Driven Claims Processing

95% of insurers are accelerating their digital transformation with AI-driven claims processing. Traditionally, this process involved manual steps such as claim initiation, data entry, validation, decision-making, and payout, consuming significant time and resources. However, the introduction of AI is replacing much of this tedious manual work, enabling companies to streamline claims handling.

Establishing A Robust Data Foundation To Maximize The Benefits Of Gen AI

Newly appointed Snowflake CEO Sridhar Ramaswamy joins Snowflake's Director of Engineering Mona Attariyan and "Data Cloud Now" anchor Ryan Green to discuss the need for organizations to prepare themselves to take full advantage of Gen AI by implementing a carefully developed data strategy that eliminates data silos and promotes data sharing while protecting data privacy.

Gen AI for Customer Service Demo

Iguazio would like to introduce two practical demonstrations showcasing our call center analysis tool and our innovative GenAI assistant. These demos illustrate how our GenAI assistant supports call center agents with real-time advice and recommendations during customer calls. This technology aims to improve customer interactions and boost call center efficiency. We're eager to share how our solutions can transform call center operations.

How Financial Services Should Prepare for Generative AI

It’s no surprise that ever since ChatGPT was made broadly available to the public in November 2022, the rush to capitalize on large language models (LLMs) has permeated nearly every sector of modern industry, accompanied, and sometimes exacerbated, by collective fascination. Financial services is no exception. But what might this transformation look like, from practical applications to potential risks?

What is RAG? Retrieval-Augmented Generation for AI

Retrieval-augmented generation (RAG) is an AI framework and a powerful approach in natural language processing (NLP) in which generative AI models are enhanced with external knowledge sources and retrieval-based mechanisms. This retrieved outside knowledge provides the model with accurate, up-to-date information that supplements the LLM’s internal representation of knowledge. As the name suggests, RAG models have a retrieval component and a generation component.
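
A minimal sketch of those two components might look like the following. The toy bag-of-words retriever and the `generate` stub are illustrative stand-ins: a production RAG pipeline would typically use dense vector embeddings for retrieval and a real LLM call (such as the Ollama request sketched earlier) for generation.

```python
import math
from collections import Counter

# Toy knowledge base standing in for an external document store.
DOCUMENTS = [
    "Gemma is a family of lightweight open models that can run locally.",
    "Ollama serves local LLMs behind a simple REST API.",
    "Retrieval-augmented generation grounds model answers in retrieved text.",
]

def embed(text: str) -> Counter:
    """Illustrative bag-of-words 'embedding'; real RAG uses dense vectors."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Retrieval component: rank documents by similarity to the query."""
    q = embed(query)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def generate(prompt: str) -> str:
    """Generation component (stub): in practice, call an LLM here."""
    return f"[LLM answer based on prompt]\n{prompt}"

def rag_answer(question: str) -> str:
    """Glue the two components together: retrieve, build a prompt, generate."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

if __name__ == "__main__":
    print(rag_answer("How does RAG keep answers up to date?"))
```

The key design point is that the model never has to "know" the documents ahead of time: whatever the retriever returns is injected into the prompt at query time, which is what keeps answers current without retraining.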

Harness Generative AI in Your Processes with the Prompt Builder AI Skill

Over the past year, interest in artificial intelligence has surged due to the proliferation of generative AI and large language models. These tools captured imaginations, demonstrating a technology brimming with possibility. While many focused on the potential of these tools, some companies made AI practical. For example, last year, Appian released packaged AI tools for processing content at scale and quickly building interface forms.

How Apps Bring Gen AI & LLMs To Life

In this conversation with Snowflake's Christian Kleinerman, Amanda Kelly, and Adrien Treuille, "Data Cloud Now" anchor Ryan Green discusses the origins of Streamlit, its exponential growth as an application development tool since being acquired by Snowflake, and the important role it is playing in the development of machine learning models across all industries. This wide-ranging conversation also explores the ways Gen AI and LLMs will transform the application development process and touches on the role the Open Source community will play in that transformation.