React Day Berlin 2024
Upcoming
Build Privacy-Focused React Applications with Ollama, NextJS/React and LangChainJS
Workshop
Today, most AI applications send data to LLM cloud providers such as OpenAI, raising privacy concerns. This workshop presents an alternative, privacy-focused way to build AI applications: running LLMs locally with Ollama, so everything stays on your computer. This approach avoids sending sensitive information to external servers. The workshop also highlights LangChain's ability to create versatile AI agents capable of handling tasks autonomously by creating embeddings for your data. Come learn how you can build the next generation of privacy-focused React applications powered by local LLMs.
The workshop covers the following topics:
1. Overview of the privacy issues with cloud-based LLMs and the importance of running LLM inference locally.
2. Detailed insights into generating embeddings with tools like Ollama, and a demonstration of how LangChain agents can perform tasks such as document summarisation and API interactions, all while maintaining data privacy in a NextJS/React application.
3. Practical use cases for this approach.