StudyMate AI 🎓

Enterprise-grade study management platform powered by Heroku Managed Inference and Model Context Protocol (MCP)

StudyMate AI is a comprehensive learning platform that combines advanced AI capabilities with real-time analytics to help students optimize their study habits. Built on Heroku's Managed Inference infrastructure with MCP tool integration, it features MentorMind - an intelligent AI assistant capable of processing external resources and providing context-aware study guidance.

✨ Key Features

Core Functionality

  • 🕒 Smart Study Timer - Customizable Pomodoro sessions with automatic progress tracking and statistics
  • 🤖 MentorMind AI Assistant - RAG-powered AI with MCP tool integration for external resource access
  • 📈 Performance Analytics - Comprehensive visual dashboards with trend analysis and insights
  • 👥 Study Groups - Collaborative workspaces with real-time messaging and leaderboards
  • ✅ Task Management - Kanban-style todo board with drag-and-drop, priorities, and status tracking
  • 🏆 Competitive Features - Global and group-specific leaderboards to encourage engagement
  • 📅 Calendar Integration - Study session scheduling and planning tools

AI Capabilities (MentorMind)

  • Multi-Model Architecture - Dynamic routing between GPT-OSS 120B, Nova Lite/Pro, and Claude 3.5 Haiku
  • MCP Tool Integration - Automatic external resource fetching and processing via Model Context Protocol
  • Context-Aware Responses - Personalized advice based on user study patterns, performance metrics, and group data
  • URL Processing - Automatic extraction and analysis of content from study materials (PDFs, articles, documentation)
  • Real-Time Adaptation - Continuously learns from user interactions and study outcomes

🚀 Quick Start

Prerequisites

Required

  • Node.js 18+ (LTS recommended)
  • pnpm 8+ (package manager)
  • Git for version control
  • Convex account (convex.dev) - Backend infrastructure
  • Heroku Inference API access for AI features

Recommended

  • Vercel account for frontend deployment
  • Code editor with TypeScript support (VS Code recommended)
  • API testing tool (Postman, Insomnia, or similar) for development

1. Clone & Install

git clone https://github.com/prime399/study-mate.git
cd study-mate
pnpm install

2. Environment Configuration

Create .env.local in the project root for Next.js environment variables:

# ======================
# CONVEX CONFIGURATION
# ======================
CONVEX_DEPLOYMENT=your-deployment-name
NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud

# ======================
# HEROKU INFERENCE - AI MODELS
# ======================
# Base URL for Heroku Managed Inference
HEROKU_INFERENCE_URL=https://us.inference.heroku.com

# Model identifiers (comma-separated for multi-model support)
HEROKU_INFERENCE_MODEL_ID=gpt-oss-120b,nova-lite,claude-3-5-haiku,nova-pro

# API Keys for each model (at least one required)
HEROKU_INFERENCE_KEY_OSS=your-oss-key          # For GPT-OSS 120B
HEROKU_INFERENCE_KEY_CLAUDE=your-claude-key    # For Claude 3.5 Haiku
HEROKU_INFERENCE_KEY_NOVA_LITE=your-lite-key   # For Nova Lite
HEROKU_INFERENCE_KEY_NOVA_PRO=your-pro-key     # For Nova Pro

# ======================
# APPLICATION SETTINGS
# ======================
# Base URL for API endpoints (required for MCP tool discovery)
NEXT_PUBLIC_APP_URL=http://localhost:3000

# ======================
# ANALYTICS (Optional)
# ======================
VERCEL_ANALYTICS_ID=your-analytics-id

Obtaining API Keys

Heroku Inference API Keys:

  1. Sign up at Heroku
  2. Navigate to Account Settings → API Keys
  3. Generate separate keys for each model you plan to use
  4. Copy the keys to your .env.local file

Important Notes:

  • At least one AI model key is required for MentorMind to function
  • NEXT_PUBLIC_APP_URL must match your deployment URL in production
  • Never commit .env.local to version control
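
Because each model uses its own key, the server has to pick the right environment variable for whichever model handles a request. A minimal sketch of that lookup, assuming a hypothetical helper module (the file name, mapping, and function are illustrative, not the project's actual code):

// lib/model-keys.ts (hypothetical helper)
const MODEL_KEY_ENV: Record<string, string | undefined> = {
  "gpt-oss-120b": process.env.HEROKU_INFERENCE_KEY_OSS,
  "claude-3-5-haiku": process.env.HEROKU_INFERENCE_KEY_CLAUDE,
  "nova-lite": process.env.HEROKU_INFERENCE_KEY_NOVA_LITE,
  "nova-pro": process.env.HEROKU_INFERENCE_KEY_NOVA_PRO,
}

export function resolveInferenceKey(modelId: string): string {
  const key = MODEL_KEY_ENV[modelId]
  if (!key) throw new Error(`No API key configured for model "${modelId}"`)
  return key
}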

3. Convex Setup

# Initialize Convex
npx convex dev

# Set up authentication providers (optional)
npx convex env set AUTH_GITHUB_ID your-github-client-id
npx convex env set AUTH_GITHUB_SECRET your-github-secret
npx convex env set AUTH_GOOGLE_ID your-google-client-id
npx convex env set AUTH_GOOGLE_SECRET your-google-secret

4. Run Development Server

pnpm dev

Visit http://localhost:3000 to see your app!

πŸ“ Project Structure

StudyMate/
├── app/                    # Next.js app directory
│   ├── (protected)/        # Protected routes (dashboard, groups, etc.)
│   │   └── dashboard/
│   │       ├── ai-helper/  # MentorMind AI assistant
│   │       ├── study/      # Study timer and analytics
│   │       ├── todos/      # Task management board
│   │       ├── groups/     # Study groups and messaging
│   │       └── calendar/   # Study session planning
│   ├── api/                # API routes
│   │   └── ai-helper/      # AI assistant endpoints
│   └── signin/             # Authentication pages
├── components/             # Reusable UI components
│   ├── ui/                 # Base UI components (shadcn/ui)
│   └── ...                 # Feature-specific components
├── convex/                 # Backend functions and schema
│   ├── _generated/         # Generated Convex types
│   ├── schema.ts           # Database schema definition
│   ├── auth.ts             # Authentication functions
│   ├── study.ts            # Study session management
│   ├── groups.ts           # Group management
│   ├── todos.ts            # Task management
│   └── ...                 # Other backend functions
├── hooks/                  # Custom React hooks
├── lib/                    # Utility functions and shared logic
├── store/                  # State management
├── types/                  # TypeScript type definitions
└── public/                 # Static assets and favicons

πŸ—οΈ Architecture

Frontend Stack

  • Framework: Next.js 14 with App Router and Server Components
  • Language: TypeScript with strict mode enabled
  • UI Library: Shadcn/ui components built on Radix UI
  • Styling: Tailwind CSS with custom design system
  • State Management: Convex React hooks with optimistic updates
  • Real-time Communication: WebSocket connections via Convex

Backend Infrastructure

  • Database: Convex (serverless, real-time NoSQL)
  • Authentication: Convex Auth with OAuth providers (GitHub, Google)
  • API Layer: Next.js API routes with TypeScript
  • AI Integration: Heroku Managed Inference with multi-model support
  • MCP Tools: Model Context Protocol for external resource access

MentorMind AI Architecture

RAG (Retrieval-Augmented Generation) Pipeline

User Query → Context Retrieval → Model Selection → Response Generation
     ↓              ↓                    ↓                  ↓
  Message     Study Data          AI Model           Personalized
  History     Performance        (via Heroku)          Response
              Group Info

Components:

  1. Context Retrieval Layer

    • Fetches user study statistics from Convex
    • Retrieves group membership and collaboration data
    • Collects performance metrics and learning patterns
    • Builds a comprehensive user profile for contextualization (see the sketch after this list)
  2. Model Routing Engine

    • Analyzes query complexity and requirements
    • Selects optimal AI model (GPT-OSS, Nova, Claude)
    • Routes requests to appropriate Heroku Inference endpoint
    • Falls back to alternative models if primary unavailable
  3. MCP Tool Integration

    • Discovers available tools from /v1/mcp/servers endpoint
    • Automatically provisions all tools to AI agent
    • Parses Server-Sent Events (SSE) responses
    • Handles tool invocations transparently
  4. Response Processing

    • Extracts AI-generated content from responses
    • Processes tool invocation results
    • Formats output for user consumption
    • Maintains conversation context
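
A minimal sketch of the context retrieval step using Convex's HTTP client from a server-side route. The query names (api.study.getStats, api.groups.listForUser) and the returned shape are assumptions for illustration, not the project's actual functions:

import { ConvexHttpClient } from "convex/browser"
import { api } from "@/convex/_generated/api"

const convex = new ConvexHttpClient(process.env.NEXT_PUBLIC_CONVEX_URL!)

// Hypothetical helper: gather the data MentorMind uses to personalize answers.
export async function buildUserContext(userId: string): Promise<string> {
  const [stats, groups] = await Promise.all([
    convex.query(api.study.getStats, { userId }),     // study totals, streaks, averages (assumed query)
    convex.query(api.groups.listForUser, { userId }), // group memberships (assumed query)
  ])
  return `Study stats: ${JSON.stringify(stats)}\nGroups: ${JSON.stringify(groups)}`
}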

MCP (Model Context Protocol) Implementation

The MCP integration enables MentorMind to access external resources dynamically:

Architecture Flow:

// 1. Tool Discovery (on each request)
const mcpTools = await fetchAvailableMcpTools()
// Fetches from: /v1/mcp/servers

// 2. System Prompt Enhancement
systemPrompt += `
You have access to the following MCP tools:
${toolsList}
Use these tools proactively when they can help.
`

// 3. Request to Heroku Agents Endpoint
POST /v1/agents/heroku
{
  model: "gpt-oss-120b",
  messages: [...],
  tools: [
    { type: "mcp", name: "fetch/read_url" },
    // All available tools included
  ]
}

// 4. SSE Response Parsing
data: {"object": "chat.completion", "choices": [...]}
data: [DONE]

Key Design Decisions:

  • Dynamic Tool Discovery: Tools are fetched on every request to ensure availability
  • All-Tools Provisioning: All discovered tools are sent to the AI, letting it decide usage
  • Graceful Degradation: Falls back to standard chat completions if no tools available
  • SSE Parsing: Custom parser handles Heroku's streaming response format
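
A minimal sketch of the SSE parsing step, assuming the body follows the data: {...} / data: [DONE] framing shown above (everything beyond that framing is illustrative):

// Collect chat-completion payloads out of a raw SSE body.
export function parseSseChunks(body: string): unknown[] {
  const chunks: unknown[] = []
  for (const line of body.split("\n")) {
    const trimmed = line.trim()
    if (!trimmed.startsWith("data:")) continue   // skip blank lines and comments
    const payload = trimmed.slice("data:".length).trim()
    if (payload === "[DONE]") break              // end-of-stream sentinel
    try {
      chunks.push(JSON.parse(payload))
    } catch {
      // Partial or malformed frame: skip it rather than fail the whole response.
    }
  }
  return chunks
}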

🚢 Deployment

Vercel (Frontend)

  1. Connect your repository to Vercel
  2. Set environment variables in Vercel dashboard:
    CONVEX_DEPLOYMENT=your-deployment-name
    NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud
    HEROKU_INFERENCE_URL=https://us.inference.heroku.com
    HEROKU_INFERENCE_KEY_OSS=your-oss-key
    # ... other AI model keys
  3. Deploy - Vercel will automatically build and deploy

Convex (Backend)

# Deploy to production
npx convex deploy

# Set production environment variables
npx convex env set AUTH_GITHUB_ID your-github-id --prod
npx convex env set AUTH_GITHUB_SECRET your-github-secret --prod
# ... repeat for other variables

πŸ› οΈ Development

Available Scripts

pnpm dev          # Start development (frontend + backend)
pnpm dev:frontend # Start only Next.js dev server
pnpm dev:backend  # Start only Convex dev server
pnpm build        # Build for production
pnpm lint         # Run ESLint

Database Schema

The app uses Convex with the following main tables:

  • users - User profiles and authentication
  • studySessions - Study session tracking
  • studySettings - User preferences and goals
  • groups - Study group information
  • groupMembers - Group membership and roles
  • messages - Group chat messages
  • todos - Task management with status and priority
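
A minimal sketch of how two of these tables could be declared in convex/schema.ts; the field names and indexes here are assumptions, not the project's exact schema:

import { defineSchema, defineTable } from "convex/server"
import { v } from "convex/values"

export default defineSchema({
  studySessions: defineTable({
    userId: v.id("users"),
    startTime: v.number(),   // epoch milliseconds (assumed)
    duration: v.number(),    // seconds of focused study (assumed)
    completed: v.boolean(),
  }).index("by_user", ["userId"]),

  todos: defineTable({
    userId: v.id("users"),
    title: v.string(),
    status: v.union(v.literal("todo"), v.literal("in_progress"), v.literal("done")),
    priority: v.optional(v.string()),
  }).index("by_user_status", ["userId", "status"]),
})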

🔧 Advanced Configuration

AI Model Selection

Configure at least one AI model for MentorMind functionality. Each model has different characteristics:

| Model | Environment Variable | Use Case | Response Time | Cost |
|-------|----------------------|----------|---------------|------|
| GPT-OSS 120B | HEROKU_INFERENCE_KEY_OSS | General-purpose, recommended default | Medium | Low |
| Nova Lite | HEROKU_INFERENCE_KEY_NOVA_LITE | Quick responses, simple queries | Fast | Low |
| Nova Pro | HEROKU_INFERENCE_KEY_NOVA_PRO | Balanced performance and quality | Medium | Medium |
| Claude 3.5 Haiku | HEROKU_INFERENCE_KEY_CLAUDE | Complex reasoning, analysis | Slower | Higher |

Model Routing Logic:

  • User can manually select model from UI dropdown
  • "Auto" mode analyzes query complexity and routes to optimal model
  • System automatically falls back to available models if primary is unavailable
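
A minimal sketch of what the "Auto" heuristic could look like; the keyword test, length threshold, and fallback order are assumptions for illustration:

type ModelId = "gpt-oss-120b" | "nova-lite" | "nova-pro" | "claude-3-5-haiku"

// Hypothetical heuristic: short, simple prompts go to the fastest model,
// analysis-heavy prompts go to the stronger reasoning model, and the final
// choice falls back to whichever models actually have keys configured.
export function pickModel(prompt: string, available: ModelId[]): ModelId {
  const wantsReasoning = /explain|analy[sz]e|compare|why|plan/i.test(prompt)
  const preferred: ModelId[] = wantsReasoning
    ? ["claude-3-5-haiku", "nova-pro", "gpt-oss-120b", "nova-lite"]
    : prompt.length < 200
      ? ["nova-lite", "gpt-oss-120b", "nova-pro", "claude-3-5-haiku"]
      : ["gpt-oss-120b", "nova-pro", "nova-lite", "claude-3-5-haiku"]
  return preferred.find((m) => available.includes(m)) ?? available[0]
}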

MCP (Model Context Protocol) Setup

Prerequisites for MCP Tools

  1. Heroku MCP Servers: Must be deployed and registered with Heroku Inference
  2. API Access: Requires valid HEROKU_INFERENCE_KEY_OSS or equivalent
  3. Network Access: Application must reach https://us.inference.heroku.com

Available MCP Tools

The system dynamically discovers tools from /v1/mcp/servers. Common tools include:

  • fetch/read_url - Fetches and reads content from URLs
  • fetch/read_pdf - Extracts text from PDF documents
  • search/web_search - Performs web searches (if configured)

Deploying Custom MCP Servers

To add custom MCP tools to your deployment:

# 1. Create MCP server following Heroku's specifications
# Reference: https://github.com/heroku/mcp-doc-reader

# 2. Deploy server to Heroku
heroku create my-mcp-server
git push heroku main

# 3. Register with Heroku Inference
# Contact Heroku support to register your MCP server

# 4. Tools will automatically appear in StudyMate

MCP Configuration Verification

Test your MCP setup:

# Check if MCP endpoint is accessible
curl -H "Authorization: Bearer YOUR_KEY" \
  https://us.inference.heroku.com/v1/mcp/servers

# Should return JSON array of available servers and tools
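
The application's own discovery step is essentially the same request made server-side. A minimal sketch, assuming the response is a JSON array of servers that each expose a tools list (the exact response shape is an assumption):

// Hypothetical discovery helper; mirrors the curl check above.
export async function fetchAvailableMcpTools(): Promise<string[]> {
  const res = await fetch("https://us.inference.heroku.com/v1/mcp/servers", {
    headers: { Authorization: `Bearer ${process.env.HEROKU_INFERENCE_KEY_OSS}` },
  })
  if (!res.ok) return [] // graceful degradation: no tools, fall back to plain chat completion
  const servers: Array<{ tools?: Array<{ name: string }> }> = await res.json()
  return servers.flatMap((server) => (server.tools ?? []).map((tool) => tool.name))
}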

Authentication Providers

GitHub OAuth Setup

  1. Navigate to GitHub Developer Settings
  2. Click "New OAuth App"
  3. Fill in application details:
    • Application name: StudyMate AI
    • Homepage URL: https://your-domain.com
    • Authorization callback URL: https://your-deployment.convex.site/api/auth/callback/github
  4. Copy Client ID and Client Secret
  5. Add to Convex environment:
    npx convex env set AUTH_GITHUB_ID your-client-id
    npx convex env set AUTH_GITHUB_SECRET your-client-secret

Google OAuth Setup

  1. Go to Google Cloud Console
  2. Create a new project or select existing
  3. Navigate to "Credentials" → "Create Credentials" → "OAuth 2.0 Client ID"
  4. Configure OAuth consent screen if prompted
  5. Set application type to "Web application"
  6. Add authorized redirect URIs:
    • https://your-deployment.convex.site/api/auth/callback/google
  7. Copy Client ID and Client Secret
  8. Add to Convex environment:
    npx convex env set AUTH_GOOGLE_ID your-client-id
    npx convex env set AUTH_GOOGLE_SECRET your-client-secret

Coin System Configuration

StudyMate uses a coin-based system for AI interactions:

  • Earn Coins: 1 coin per second of active study time
  • Spend Coins: 5 coins per AI query (configurable)
  • Starting Balance: 100 coins for new users

To modify coin costs, update in app/api/ai-helper/_lib/coin-system.ts:

export const COINS_PER_QUERY = 5  // Adjust as needed
export const INITIAL_BALANCE = 100
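
With these defaults, a single 25-minute Pomodoro earns 1,500 coins (25 × 60 seconds), enough for 300 queries at 5 coins each. A minimal sketch of the kind of guard the AI route would apply before each query (the function name and flow are assumptions):

// Hypothetical pre-query check, continuing the module above.
export function chargeForQuery(balance: number): number {
  if (balance < COINS_PER_QUERY) {
    throw new Error("Not enough coins - start a study session to earn more")
  }
  return balance - COINS_PER_QUERY // caller persists the new balance (assumed flow)
}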

πŸ” Troubleshooting

Common Issues and Solutions

MentorMind Not Responding

Symptom: AI assistant shows error or doesn't respond to queries

Possible Causes & Solutions:

  1. Missing API Keys

    # Verify environment variables are set
    echo $HEROKU_INFERENCE_KEY_OSS
    # Should output your API key, not empty

    Solution: Ensure at least one model API key is configured in .env.local

  2. Invalid API Key

    # Test API key validity
    curl -H "Authorization: Bearer YOUR_KEY" \
      https://us.inference.heroku.com/v1/models

    Solution: If 401/403 error, regenerate API key from Heroku dashboard

  3. Insufficient Coins

    • Check coin balance in UI footer
    • Solution: Start a study session to earn coins (1 coin/second)

MCP Tools Not Working

Symptom: AI can't access external URLs or PDFs

Diagnostics:

# 1. Check MCP servers endpoint
curl -H "Authorization: Bearer YOUR_KEY" \
  https://us.inference.heroku.com/v1/mcp/servers

# 2. Verify NEXT_PUBLIC_APP_URL is set correctly
echo $NEXT_PUBLIC_APP_URL
# Should match your deployment URL

# 3. Check browser console for errors
# Look for: "Failed to fetch MCP tools"

Solutions:

  • Ensure NEXT_PUBLIC_APP_URL environment variable is set
  • Verify Heroku MCP servers are deployed and registered
  • Check network connectivity to Heroku Inference API

Convex Connection Issues

Symptom: "Connecting to Convex..." stuck or data not updating

Solutions:

  1. Check Convex Status - confirm the Convex service itself is operational before debugging locally

  2. Verify Deployment URL

    # Check NEXT_PUBLIC_CONVEX_URL
    grep CONVEX_URL .env.local
    # Should match your Convex dashboard URL
  3. Re-authenticate with Convex

    npx convex dev
    # Follow prompts to re-authenticate

Build Errors

Issue: TypeScript compilation errors

# Common fixes:

# 1. Clear Next.js cache
rm -rf .next
pnpm dev

# 2. Regenerate Convex types
npx convex dev
# Let it run until types are generated

# 3. Reinstall dependencies
rm -rf node_modules pnpm-lock.yaml
pnpm install

# 4. Check Node.js version
node --version
# Should be 18.x or higher

Authentication Not Working

Symptom: Can't sign in with GitHub/Google

Checklist:

  • OAuth app created in GitHub/Google
  • Callback URLs correctly configured
  • Environment variables set in Convex
  • Application is using HTTPS (required for OAuth)

Debugging:

# Check Convex auth configuration
npx convex env list

# Should show:
# AUTH_GITHUB_ID=...
# AUTH_GITHUB_SECRET=...

Study Sessions Not Saving

Symptom: Study timer runs but sessions don't appear in analytics

Possible Causes:

  1. Not authenticated - User must be signed in
  2. Convex connection lost - Check network tab in browser devtools
  3. Database mutation failed - Check Convex dashboard logs

Solution:

  • Check the browser console for errors (look for "Failed to create study session")
  • Verify you are signed in - the user profile icon should appear in the top-right

Performance Issues

Slow AI Responses

  1. Switch to faster model: Select "Nova Lite" from model dropdown
  2. Check network latency:
    ping us.inference.heroku.com
  3. Verify no rate limiting: Check Heroku dashboard for API limits

UI Lag or Freezing

  1. Clear browser cache: Ctrl+Shift+R (Windows/Linux) or Cmd+Shift+R (Mac)
  2. Disable browser extensions: Test in incognito mode
  3. Check system resources: AI responses can be memory-intensive

Getting Help

If issues persist after trying these solutions:

  1. Check Logs:

    • Browser Console (F12 → Console tab)
    • Convex Dashboard Logs
    • Vercel Deployment Logs
  2. Create Issue: GitHub Issues

    • Include error messages
    • Describe steps to reproduce
    • Share environment details (OS, browser, Node version)
  3. Community Support: Join discussions in GitHub Discussions

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Commit changes: git commit -m 'Add amazing feature'
  4. Push to branch: git push origin feature/amazing-feature
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

📚 Technical Stack Summary

Frontend

| Technology | Version | Purpose |
|------------|---------|---------|
| Next.js | 14.x | React framework with App Router |
| TypeScript | 5.x | Type-safe development |
| Tailwind CSS | 3.x | Utility-first styling |
| Shadcn/ui | Latest | Component library |
| Radix UI | Latest | Accessible primitives |

Backend & Infrastructure

| Technology | Purpose |
|------------|---------|
| Convex | Real-time serverless database |
| Heroku Managed Inference | Multi-model AI hosting |
| Model Context Protocol (MCP) | External resource integration |
| Vercel | Frontend hosting and deployment |

AI Models

| Model | Provider | Use Case |
|-------|----------|----------|
| GPT-OSS 120B | Heroku | General-purpose queries |
| Nova Lite | Heroku | Fast responses |
| Nova Pro | Heroku | Balanced performance |
| Claude 3.5 Haiku | Anthropic (via Heroku) | Complex reasoning |

🔗 Resources & Links

Application

Documentation

Community


πŸ™ Acknowledgments

  • Heroku for Managed Inference platform and MCP infrastructure
  • Convex for real-time database and backend services
  • Vercel for seamless frontend hosting
  • Anthropic for Claude models
  • Open-source community for various tools and libraries

StudyMate AI - Enterprise-grade study management powered by Heroku Managed Inference and Model Context Protocol.

Built for students worldwide. Transform your study habits with intelligent AI assistance.
