🦙 Local Chat: Fast, Private, and Fully Offline AI

🧠 Locally Hosted AI Chat

This app lets you chat privately with open-source language models, running directly on your own Oracle Cloud Infrastructure (OCI) instance.

🗂️ Available Models:

  • Meta Llama 3 8B – balanced performance for general tasks
  • Mistral 7B Instruct – great for concise, high-quality answers
  • Gemma 2B IT – efficient and lightweight, ideal for limited resources
  • Phi-2 – small, fast model optimized for reasoning and math
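Any of these models can be served locally once downloaded in GGUF format. A deployment sketch using llama.cpp's `llama-server` binary (the model path and port below are illustrative assumptions, not this app's actual configuration):

```shell
# Serve a quantized GGUF model with llama.cpp's built-in HTTP server.
# -m  : path to the GGUF model file (hypothetical filename here)
# -c  : context window size in tokens
# -t  : CPU threads to use (OCI Free Tier Ampere instances are CPU-only)
./llama-server \
  -m models/mistral-7b-instruct.Q4_K_M.gguf \
  -c 4096 \
  -t 4 \
  --host 127.0.0.1 --port 8080
```

Binding to `127.0.0.1` keeps the model server private; only the local web UI (or a reverse proxy) can reach it.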

🔐 Features:

  • Fully offline inference, no external API required
  • Hosted with NGINX + HTTPS on Oracle Cloud (Free Tier)
  • Powered by llama.cpp using GGUF format models
  • Interactive web UI built with Streamlit
  • Low-latency responses and complete data privacy
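Under the hood, a Streamlit UI like this typically just posts chat messages to llama.cpp's OpenAI-compatible HTTP endpoint. A minimal stdlib client sketch, assuming a local `llama-server` on port 8080 (the endpoint URL and model name are assumptions for illustration, not taken from this app's code):

```python
import json
import urllib.request

# Assumed local llama-server endpoint (llama.cpp exposes an
# OpenAI-compatible chat completions API at this path).
LLAMA_SERVER = "http://127.0.0.1:8080/v1/chat/completions"


def build_payload(prompt: str, model: str = "mistral-7b-instruct",
                  temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat payload understood by llama-server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "stream": False,
    }


def chat(prompt: str) -> str:
    """Send one prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        LLAMA_SERVER,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (requires a running llama-server):
# print(chat("Summarize llama.cpp in one sentence."))
```

Because everything stays on `127.0.0.1`, no prompt or completion ever leaves the machine, which is what makes the fully-offline privacy claim hold.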