Build Privacy-Focused React Applications with Ollama, NextJS/React and LangChainJS

You can join this workshop live remotely only with a Multipass or a Full conference ticket.
Attend the remote workshop live on Dec 4, 15:00.

Today, most AI applications send data to cloud LLM providers like OpenAI, raising privacy concerns. This workshop presents an alternative, privacy-focused way to build AI applications: running LLMs locally with Ollama, so everything stays on your computer and no sensitive information is sent to external servers. The workshop also highlights LangChain's ability to create versatile AI agents that handle tasks autonomously by creating embeddings for your data. Come learn how you can build the next generation of privacy-focused React applications powered by local LLMs.
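As a rough illustration of what "keeping everything local" means in practice, the sketch below targets Ollama's HTTP API, which by default listens on `http://localhost:11434`. The endpoint path and payload shape follow Ollama's `/api/generate` endpoint; the helper name, model choice, and wrapper function are illustrative assumptions, not part of the workshop material.

```typescript
// Minimal sketch: talking to a locally running Ollama server instead of a
// cloud LLM provider. The request goes to localhost, so no prompt data
// leaves the machine. Helper names here are hypothetical; the payload
// shape matches Ollama's /api/generate endpoint.

const OLLAMA_URL = "http://localhost:11434";

interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Pure helper: build the JSON body for a single non-streaming completion.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

// Example usage (requires `ollama serve` and a pulled model, e.g. llama3):
async function askLocalLlm(prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest("llama3", prompt)),
  });
  const data = await res.json();
  return data.response; // Ollama returns the completion text in `response`
}
```

In a NextJS app, a function like `askLocalLlm` would typically be called from a server-side route handler, so the browser never talks to the model directly.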

The workshop covers the following topics:

1. Overview of the privacy issues with cloud-based LLMs and the importance of running LLM inferencing locally.

2. Detailed insights into generating embeddings with tools like Ollama, and a demonstration of how LangChain agents can perform tasks such as document summarisation and API interactions, all while maintaining data privacy in a NextJS / React application.

3. Practical use cases for this approach.
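Once embeddings have been generated locally (for example with an Ollama embedding model), retrieval comes down to comparing vectors. The library-free sketch below shows the cosine-similarity measure typically used for this; the function names are illustrative, not from the workshop itself.

```typescript
// Cosine similarity between two embedding vectors: 1 means identical
// direction (very similar text), 0 means orthogonal (unrelated).
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank candidate document embeddings against a query embedding,
// returning document indices with the best match first.
function rankByScore(query: number[], docs: number[][]): number[] {
  return docs
    .map((d, i) => ({ i, score: cosineSimilarity(query, d) }))
    .sort((x, y) => y.score - x.score)
    .map((x) => x.i);
}
```

An agent doing document summarisation would embed the user's question, rank the locally stored chunk embeddings with something like `rankByScore`, and feed only the top chunks to the local LLM.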

This workshop was presented at React Day Berlin 2024; check out the latest edition of this React conference.

Shivay Lamba
04 Dec, 2024
Video transcription, chapters and summary will be available later.