lord-fourth0107/ai-rag-project

AI Finetuned RAG System CS-GY-6613

Submitted By:

Sl. No  Name               Net ID
1       Uttam Singh        us2193
2       Namani Shreeharsh  sn4165

Platform used for Docker Desktop: macOS

Docker images are platform independent, but the Docker daemon is not. To ensure ClearML compatibility, we have noted the following steps for running the RAG project.

Steps to run the RAG model for ROS:

Step 1: Prerequisites

Docker Desktop (or a running Docker daemon) must be installed.

Follow steps 1 to 11 in the link below to create the ClearML-related services (Docker services such as clearml-fileserver, clearml-apiserver, and clearml-webserver). This produces a ClearML-specific Docker Compose file, which should look like the clearml-compose.yml file in the final-instruct-db branch:

https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server_linux_mac/

Step 2:

Run the command below in the project directory to launch all the required Docker services and images:

docker compose -f /opt/clearml/docker-compose.yml -f docker-compose.yml up -d
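Once the command returns, you can confirm that everything came up (a quick check, assuming the same two Compose files as above; port 8080 is the ClearML web UI's default and may differ if you changed the port mappings):

```shell
# List all services started from both Compose files and their status.
docker compose -f /opt/clearml/docker-compose.yml -f docker-compose.yml ps

# Optionally probe a single service, e.g. the ClearML web UI:
curl -sf http://localhost:8080 > /dev/null && echo "clearml-webserver is up"
```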

Step 3:

Run the command below inside the Ollama container launched by the Compose file above to start the finetuned model, which is published on Hugging Face.

Ollama provides good orchestration and a library for pulling models from Hugging Face and other open-source registries, which is one of the reasons we chose the Ollama Docker image to run the finetuned model. This also inherently decouples the model from the code, so changes to the model and to the code can be pushed independently of each other.

docker exec -it ollama_container ollama run hf.co/nsh22/ROS-gguf
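Once the model is pulled, a quick way to verify that it responds is Ollama's HTTP API (11434 is Ollama's default port; this assumes the Compose file maps it to the host):

```shell
# Send a one-off prompt to the finetuned model through the Ollama REST API.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "hf.co/nsh22/ROS-gguf", "prompt": "What is ROS?", "stream": false}'
```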

This sets up the RAG model and code to run together, and the Gradio app can then be opened in the browser.

Submission of the project:

Docker Compose

(screenshot: Docker Compose services)

Gradio app results

The screenshots below show the questions and answers in the Gradio application:

(screenshots: 2024-12-08 at 13:45:10, 13:44:53, and 13:41:37)

LLM result without context

(screenshot: 2024-12-08 at 8:12:26 PM)

Google Drive link to the inference pipeline video:

https://drive.google.com/file/d/1RBNVO0tswMs-vpi1GFG18rR7a-RVtXdu/view?usp=drive_link

Difference: with the context-enhanced prompt, the generated response is more up to date, containing instructions for a wide range of operating systems, whereas without context the response covers only Ubuntu, the OS for which ROS was first developed.
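The shape of the two prompts can be sketched as follows (the retrieved context and the wording here are illustrative placeholders, not the app's actual template):

```shell
# Illustrative only: how a context-enhanced (RAG) prompt differs from a plain one.
CONTEXT="ROS 2 installation is documented for Ubuntu, Windows, and macOS."  # would come from the retrieval store
QUESTION="How do I install ROS?"

# Without retrieval, the model sees only the bare question.
PLAIN_PROMPT="$QUESTION"

# With retrieval, the question is grounded in up-to-date context.
RAG_PROMPT="Answer the question using only the context below.
Context: $CONTEXT
Question: $QUESTION"

printf '%s\n' "$RAG_PROMPT"
```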

Instruction Dataset in Hugging Face

(screenshot: instruction dataset on Hugging Face)

ClearML Screenshots

(screenshots: 2024-12-08 at 2:40:22 PM and 2:41:02 PM)

How to check the data in MongoDB

Prerequisite: mongosh installed

Run this command in a terminal:

mongosh --port 27017

Then run the following commands in order:

show dbs
use rag
show collections
db.repositories.find()
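The same check can also be run non-interactively with mongosh's --eval flag (a sketch; the database and collection names match the interactive steps above):

```shell
# Count the documents in the `repositories` collection of the `rag` database.
mongosh --quiet --port 27017 --eval 'db.getSiblingDB("rag").repositories.countDocuments()'
```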
