Enterprise-grade study management platform powered by Heroku Managed Inference and Model Context Protocol (MCP)
StudyMate AI is a comprehensive learning platform that combines advanced AI capabilities with real-time analytics to help students optimize their study habits. Built on Heroku's Managed Inference infrastructure with MCP tool integration, it features MentorMind - an intelligent AI assistant capable of processing external resources and providing context-aware study guidance.
- Smart Study Timer - Customizable Pomodoro sessions with automatic progress tracking and statistics
- MentorMind AI Assistant - RAG-powered AI with MCP tool integration for external resource access
- Performance Analytics - Comprehensive visual dashboards with trend analysis and insights
- Study Groups - Collaborative workspaces with real-time messaging and leaderboards
- Task Management - Kanban-style todo board with drag-and-drop, priorities, and status tracking
- Competitive Features - Global and group-specific leaderboards to encourage engagement
- Calendar Integration - Study session scheduling and planning tools
- Multi-Model Architecture - Dynamic routing between GPT-OSS 120B, Nova Lite/Pro, and Claude 3.5 Haiku
- MCP Tool Integration - Automatic external resource fetching and processing via Model Context Protocol
- Context-Aware Responses - Personalized advice based on user study patterns, performance metrics, and group data
- URL Processing - Automatic extraction and analysis of content from study materials (PDFs, articles, documentation)
- Real-Time Adaptation - Continuously learns from user interactions and study outcomes
- Node.js 18+ (LTS recommended)
- pnpm 8+ (package manager)
- Git for version control
- Convex account (convex.dev) - Backend infrastructure
- Heroku Inference API access for AI features
- Vercel account for frontend deployment
- Code editor with TypeScript support (VS Code recommended)
- API testing tool (Postman, Insomnia, or similar) for development
git clone https://github.com/prime399/study-mate.git
cd study-mate
pnpm install

Create .env.local in the project root for Next.js environment variables:
# ======================
# CONVEX CONFIGURATION
# ======================
CONVEX_DEPLOYMENT=your-deployment-name
NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud
# ======================
# HEROKU INFERENCE - AI MODELS
# ======================
# Base URL for Heroku Managed Inference
HEROKU_INFERENCE_URL=https://us.inference.heroku.com
# Model identifiers (comma-separated for multi-model support)
HEROKU_INFERENCE_MODEL_ID=gpt-oss-120b,nova-lite,claude-3-5-haiku,nova-pro
# API Keys for each model (at least one required)
HEROKU_INFERENCE_KEY_OSS=your-oss-key # For GPT-OSS 120B
HEROKU_INFERENCE_KEY_CLAUDE=your-claude-key # For Claude 3.5 Haiku
HEROKU_INFERENCE_KEY_NOVA_LITE=your-lite-key # For Nova Lite
HEROKU_INFERENCE_KEY_NOVA_PRO=your-pro-key # For Nova Pro
# ======================
# APPLICATION SETTINGS
# ======================
# Base URL for API endpoints (required for MCP tool discovery)
NEXT_PUBLIC_APP_URL=http://localhost:3000
# ======================
# ANALYTICS (Optional)
# ======================
VERCEL_ANALYTICS_ID=your-analytics-id

Heroku Inference API Keys:
- Sign up at Heroku
- Navigate to Account Settings → API Keys
- Generate separate keys for each model you plan to use
- Copy the keys into your `.env.local` file
Important Notes:
- At least one AI model key is required for MentorMind to function
- `NEXT_PUBLIC_APP_URL` must match your deployment URL in production
- Never commit `.env.local` to version control
# Initialize Convex
npx convex dev
# Set up authentication providers (optional)
npx convex env set AUTH_GITHUB_ID your-github-client-id
npx convex env set AUTH_GITHUB_SECRET your-github-secret
npx convex env set AUTH_GOOGLE_ID your-google-client-id
npx convex env set AUTH_GOOGLE_SECRET your-google-secret

pnpm dev

Visit http://localhost:3000 to see your app!
StudyMate/
├── app/                  # Next.js app directory
│   ├── (protected)/      # Protected routes (dashboard, groups, etc.)
│   │   ├── dashboard/
│   │   ├── ai-helper/    # MentorMind AI assistant
│   │   ├── study/        # Study timer and analytics
│   │   ├── todos/        # Task management board
│   │   ├── groups/       # Study groups and messaging
│   │   └── calendar/     # Study session planning
│   ├── api/              # API routes
│   │   └── ai-helper/    # AI assistant endpoints
│   └── signin/           # Authentication pages
├── components/           # Reusable UI components
│   ├── ui/               # Base UI components (shadcn/ui)
│   └── ...               # Feature-specific components
├── convex/               # Backend functions and schema
│   ├── _generated/       # Generated Convex types
│   ├── schema.ts         # Database schema definition
│   ├── auth.ts           # Authentication functions
│   ├── study.ts          # Study session management
│   ├── groups.ts         # Group management
│   ├── todos.ts          # Task management
│   └── ...               # Other backend functions
├── hooks/                # Custom React hooks
├── lib/                  # Utility functions and shared logic
├── store/                # State management
├── types/                # TypeScript type definitions
└── public/               # Static assets and favicons
- Framework: Next.js 14 with App Router and Server Components
- Language: TypeScript with strict mode enabled
- UI Library: Shadcn/ui components built on Radix UI
- Styling: Tailwind CSS with custom design system
- State Management: Convex React hooks with optimistic updates
- Real-time Communication: WebSocket connections via Convex
- Database: Convex (serverless, real-time NoSQL)
- Authentication: Convex Auth with OAuth providers (GitHub, Google)
- API Layer: Next.js API routes with TypeScript
- AI Integration: Heroku Managed Inference with multi-model support
- MCP Tools: Model Context Protocol for external resource access
User Query → Context Retrieval → Model Selection → Response Generation
     ↓              ↓                   ↓                  ↓
  Message       Study Data          AI Model          Personalized
  History      Performance       (via Heroku)          Response
               Group Info
Components:

1. **Context Retrieval Layer**
   - Fetches user study statistics from Convex
   - Retrieves group membership and collaboration data
   - Collects performance metrics and learning patterns
   - Builds a comprehensive user profile for contextualization

2. **Model Routing Engine**
   - Analyzes query complexity and requirements
   - Selects the optimal AI model (GPT-OSS, Nova, Claude)
   - Routes requests to the appropriate Heroku Inference endpoint
   - Falls back to alternative models if the primary is unavailable

3. **MCP Tool Integration**
   - Discovers available tools from the `/v1/mcp/servers` endpoint
   - Automatically provisions all discovered tools to the AI agent
   - Parses Server-Sent Events (SSE) responses
   - Handles tool invocations transparently

4. **Response Processing**
   - Extracts AI-generated content from responses
   - Processes tool invocation results
   - Formats output for user consumption
   - Maintains conversation context
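The fallback rule in the routing engine can be sketched in TypeScript. This is an illustrative sketch, not the app's actual implementation; `selectModel` is a hypothetical helper, and the model names simply mirror the identifiers documented in this README:

```typescript
// Hypothetical sketch: honor the requested model when its API key is
// configured, otherwise fall back to the first model that has a key.
type ModelKeys = Record<string, string | undefined>

function selectModel(requested: string, keys: ModelKeys): string {
  if (keys[requested]) return requested // primary model is available
  for (const [model, key] of Object.entries(keys)) {
    if (key) return model // first configured fallback
  }
  throw new Error("No AI model API key configured")
}

// Example: OSS key missing, Nova Lite configured -> route to Nova Lite.
const keys: ModelKeys = {
  "gpt-oss-120b": undefined,
  "nova-lite": "nova-lite-key",
  "nova-pro": undefined,
  "claude-3-5-haiku": undefined,
}
```

In practice the real keys would come from the `HEROKU_INFERENCE_KEY_*` environment variables described in the setup section.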
The MCP integration enables MentorMind to access external resources dynamically:
Architecture Flow:
// 1. Tool Discovery (on each request)
const mcpTools = await fetchAvailableMcpTools()
// Fetches from: /v1/mcp/servers
// 2. System Prompt Enhancement
systemPrompt += `
You have access to the following MCP tools:
${toolsList}
Use these tools proactively when they can help.
`
// 3. Request to Heroku Agents Endpoint
POST /v1/agents/heroku
{
model: "gpt-oss-120b",
messages: [...],
tools: [
{ type: "mcp", name: "fetch/read_url" },
// All available tools included
]
}
// 4. SSE Response Parsing
data: {"object": "chat.completion", "choices": [...]}
data: [DONE]

Key Design Decisions:
- Dynamic Tool Discovery: Tools are fetched on every request to ensure availability
- All-Tools Provisioning: All discovered tools are sent to the AI, letting it decide usage
- Graceful Degradation: Falls back to standard chat completions if no tools available
- SSE Parsing: Custom parser handles Heroku's streaming response format
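The SSE parsing decision above can be illustrated with a minimal parser. This is a hedged sketch: the chunk shape is assumed from the `data: {"object": "chat.completion", ...}` example earlier, and `parseSseBody` is a hypothetical name, not the app's actual parser:

```typescript
// Minimal sketch: each SSE event line starts with "data: " and carries
// either a JSON chunk or the literal [DONE] end-of-stream sentinel.
interface SseChunk {
  object: string
  choices: { message?: { content?: string } }[]
}

function parseSseBody(body: string): SseChunk[] {
  const chunks: SseChunk[] = []
  for (const line of body.split("\n")) {
    const trimmed = line.trim()
    if (!trimmed.startsWith("data:")) continue // skip blanks and comments
    const payload = trimmed.slice(5).trim()
    if (payload === "[DONE]") break // stream finished
    chunks.push(JSON.parse(payload))
  }
  return chunks
}
```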
- Connect your repository to Vercel
- Set environment variables in Vercel dashboard:
  CONVEX_DEPLOYMENT=your-deployment-name
  NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud
  HEROKU_INFERENCE_URL=https://us.inference.heroku.com
  HEROKU_INFERENCE_KEY_OSS=your-oss-key
  # ... other AI model keys
- Deploy: Vercel will automatically build and deploy
# Deploy to production
npx convex deploy
# Set production environment variables
npx convex env set AUTH_GITHUB_ID your-github-id --prod
npx convex env set AUTH_GITHUB_SECRET your-github-secret --prod
# ... repeat for other variables

pnpm dev              # Start development (frontend + backend)
pnpm dev:frontend # Start only Next.js dev server
pnpm dev:backend # Start only Convex dev server
pnpm build # Build for production
pnpm lint             # Run ESLint

The app uses Convex with the following main tables:
- users - User profiles and authentication
- studySessions - Study session tracking
- studySettings - User preferences and goals
- groups - Study group information
- groupMembers - Group membership and roles
- messages - Group chat messages
- todos - Task management with status and priority
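As a rough illustration of these tables, here are hypothetical TypeScript shapes for two of them. The authoritative definitions live in `convex/schema.ts`; the field names below are assumptions for illustration only:

```typescript
// Illustrative document shapes (not the real schema in convex/schema.ts).
interface StudySession {
  userId: string
  startTime: number // epoch milliseconds
  durationSeconds: number
  completed: boolean
}

interface Todo {
  userId: string
  title: string
  status: "todo" | "in_progress" | "done"
  priority: "low" | "medium" | "high"
}

// One completed 25-minute Pomodoro session:
const session: StudySession = {
  userId: "user_123",
  startTime: Date.now(),
  durationSeconds: 1500,
  completed: true,
}
```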
Configure at least one AI model for MentorMind functionality. Each model has different characteristics:
| Model | Environment Variable | Use Case | Response Time | Cost |
|---|---|---|---|---|
| GPT-OSS 120B | `HEROKU_INFERENCE_KEY_OSS` | General-purpose, recommended default | Medium | Low |
| Nova Lite | `HEROKU_INFERENCE_KEY_NOVA_LITE` | Quick responses, simple queries | Fast | Low |
| Nova Pro | `HEROKU_INFERENCE_KEY_NOVA_PRO` | Balanced performance and quality | Medium | Medium |
| Claude 3.5 Haiku | `HEROKU_INFERENCE_KEY_CLAUDE` | Complex reasoning, analysis | Slower | Higher |
Model Routing Logic:
- User can manually select model from UI dropdown
- "Auto" mode analyzes query complexity and routes to optimal model
- System automatically falls back to available models if primary is unavailable
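One plausible way "Auto" mode could weigh query complexity is sketched below. This is not the app's actual heuristic, just an illustration of complexity-based routing; the keyword list and word-count threshold are invented for the example:

```typescript
// Hypothetical "Auto" routing heuristic: route reasoning-heavy queries to
// Claude, short simple ones to Nova Lite, everything else to the default.
const REASONING_HINTS = ["why", "explain", "analyze", "compare", "derive"]

function autoRoute(query: string): string {
  const words = query.trim().split(/\s+/)
  const needsReasoning = REASONING_HINTS.some((hint) =>
    query.toLowerCase().includes(hint),
  )
  if (needsReasoning) return "claude-3-5-haiku" // complex reasoning
  if (words.length <= 12) return "nova-lite" // short, simple query
  return "gpt-oss-120b" // general-purpose default
}
```

A production router would likely also consider conversation history and tool requirements, but the shape of the decision is the same.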
- Heroku MCP Servers: Must be deployed and registered with Heroku Inference
- API Access: Requires a valid `HEROKU_INFERENCE_KEY_OSS` or equivalent
- Network Access: The application must be able to reach `https://us.inference.heroku.com`
The system dynamically discovers tools from /v1/mcp/servers. Common tools include:
- fetch/read_url - Fetches and reads content from URLs
- fetch/read_pdf - Extracts text from PDF documents
- search/web_search - Performs web searches (if configured)
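Flattening the discovery response into tool names might look like the sketch below. The response shape (servers carrying a `tools` array, with names joined as `server/tool`) is an assumption inferred from the tool names above, not a documented contract:

```typescript
// Assumed shape of one entry in the /v1/mcp/servers response.
interface McpServer {
  name: string
  tools: { name: string }[]
}

// Flatten servers into "server/tool" identifiers like "fetch/read_url".
function listToolNames(servers: McpServer[]): string[] {
  return servers.flatMap((s) => s.tools.map((t) => `${s.name}/${t.name}`))
}
```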
To add custom MCP tools to your deployment:
# 1. Create MCP server following Heroku's specifications
# Reference: https://github.com/heroku/mcp-doc-reader
# 2. Deploy server to Heroku
heroku create my-mcp-server
git push heroku main
# 3. Register with Heroku Inference
# Contact Heroku support to register your MCP server
# 4. Tools will automatically appear in StudyMate

Test your MCP setup:
# Check if MCP endpoint is accessible
curl -H "Authorization: Bearer YOUR_KEY" \
https://us.inference.heroku.com/v1/mcp/servers
# Should return a JSON array of available servers and tools

- Navigate to GitHub Developer Settings
- Click "New OAuth App"
- Fill in application details:
- Application name: StudyMate AI
  - Homepage URL: `https://your-domain.com`
  - Authorization callback URL: `https://your-deployment.convex.site/api/auth/callback/github`
- Copy Client ID and Client Secret
- Add to Convex environment:
  npx convex env set AUTH_GITHUB_ID your-client-id
  npx convex env set AUTH_GITHUB_SECRET your-client-secret
- Go to Google Cloud Console
- Create a new project or select existing
- Navigate to "Credentials" → "Create Credentials" → "OAuth 2.0 Client ID"
- Configure OAuth consent screen if prompted
- Set application type to "Web application"
- Add authorized redirect URIs:
https://your-deployment.convex.site/api/auth/callback/google
- Copy Client ID and Client Secret
- Add to Convex environment:
  npx convex env set AUTH_GOOGLE_ID your-client-id
  npx convex env set AUTH_GOOGLE_SECRET your-client-secret
StudyMate uses a coin-based system for AI interactions:
- Earn Coins: 1 coin per second of active study time
- Spend Coins: 5 coins per AI query (configurable)
- Starting Balance: 100 coins for new users
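The arithmetic of this economy is easy to sketch; the constants below come straight from the rates listed above, while the helper names are illustrative:

```typescript
// Coin economy per this README: 1 coin per second studied,
// 5 coins per AI query, 100-coin starting balance.
const COINS_PER_SECOND = 1
const COINS_PER_QUERY = 5
const INITIAL_BALANCE = 100

function coinsAfterSession(balance: number, seconds: number): number {
  return balance + seconds * COINS_PER_SECOND
}

function affordableQueries(balance: number): number {
  return Math.floor(balance / COINS_PER_QUERY)
}

// A new user who studies one 25-minute Pomodoro (1500 s):
const balance = coinsAfterSession(INITIAL_BALANCE, 1500) // 1600 coins
```

So a single Pomodoro funds hundreds of AI queries at the default rates, which keeps the coin gate light for active users.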
To modify coin costs, update in app/api/ai-helper/_lib/coin-system.ts:
export const COINS_PER_QUERY = 5 // Adjust as needed
export const INITIAL_BALANCE = 100

Symptom: AI assistant shows an error or doesn't respond to queries
Possible Causes & Solutions:
1. **Missing API Keys**

   # Verify environment variables are set
   echo $HEROKU_INFERENCE_KEY_OSS
   # Should output your API key, not empty

   Solution: Ensure at least one model API key is configured in `.env.local`

2. **Invalid API Key**

   # Test API key validity
   curl -H "Authorization: Bearer YOUR_KEY" \
     https://us.inference.heroku.com/v1/models

   Solution: If you get a 401/403 error, regenerate the API key from the Heroku dashboard

3. **Insufficient Coins**
   - Check your coin balance in the UI footer
   - Solution: Start a study session to earn coins (1 coin/second)
Symptom: AI can't access external URLs or PDFs
Diagnostics:
# 1. Check MCP servers endpoint
curl -H "Authorization: Bearer YOUR_KEY" \
https://us.inference.heroku.com/v1/mcp/servers
# 2. Verify NEXT_PUBLIC_APP_URL is set correctly
echo $NEXT_PUBLIC_APP_URL
# Should match your deployment URL
# 3. Check browser console for errors
# Look for: "Failed to fetch MCP tools"

Solutions:
- Ensure the `NEXT_PUBLIC_APP_URL` environment variable is set
- Verify Heroku MCP servers are deployed and registered
- Check network connectivity to the Heroku Inference API
Symptom: "Connecting to Convex..." stuck or data not updating
Solutions:
1. **Check Convex Status**
   - Visit the Convex status page
   - Verify there are no ongoing incidents

2. **Verify Deployment URL**

   # Check NEXT_PUBLIC_CONVEX_URL
   grep CONVEX_URL .env.local
   # Should match your Convex dashboard URL

3. **Re-authenticate with Convex**

   npx convex dev
   # Follow the prompts to re-authenticate
Issue: TypeScript compilation errors
# Common fixes:
# 1. Clear Next.js cache
rm -rf .next
pnpm dev
# 2. Regenerate Convex types
npx convex dev
# Let it run until types are generated
# 3. Reinstall dependencies
rm -rf node_modules pnpm-lock.yaml
pnpm install
# 4. Check Node.js version
node --version
# Should be 18.x or higher

Symptom: Can't sign in with GitHub/Google
Checklist:
- OAuth app created in GitHub/Google
- Callback URLs correctly configured
- Environment variables set in Convex
- Application is using HTTPS (required for OAuth)
Debugging:
# Check Convex auth configuration
npx convex env list
# Should show:
# AUTH_GITHUB_ID=...
# AUTH_GITHUB_SECRET=...

Symptom: Study timer runs but sessions don't appear in analytics
Possible Causes:
- Not authenticated - User must be signed in
- Convex connection lost - Check network tab in browser devtools
- Database mutation failed - Check Convex dashboard logs
Solution:
// Check browser console for errors
// Look for: "Failed to create study session"
// Verify user is authenticated
// Check: User profile icon appears in top-right

- Switch to a faster model: Select "Nova Lite" from the model dropdown
- Check network latency:
ping us.inference.heroku.com
- Verify no rate limiting: Check Heroku dashboard for API limits
- Clear browser cache: Ctrl+Shift+R (Windows/Linux) or Cmd+Shift+R (Mac)
- Disable browser extensions: Test in incognito mode
- Check system resources: AI responses can be memory-intensive
If issues persist after trying these solutions:
1. **Check Logs:**
   - Browser Console (F12 → Console tab)
   - Convex Dashboard Logs
   - Vercel Deployment Logs

2. **Create an Issue:** GitHub Issues
   - Include error messages
   - Describe steps to reproduce
   - Share environment details (OS, browser, Node version)

3. **Community Support:** Join discussions in GitHub Discussions
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
| Technology | Version | Purpose |
|---|---|---|
| Next.js | 14.x | React framework with App Router |
| TypeScript | 5.x | Type-safe development |
| Tailwind CSS | 3.x | Utility-first styling |
| Shadcn/ui | Latest | Component library |
| Radix UI | Latest | Accessible primitives |
| Technology | Purpose |
|---|---|
| Convex | Real-time serverless database |
| Heroku Managed Inference | Multi-model AI hosting |
| Model Context Protocol (MCP) | External resource integration |
| Vercel | Frontend hosting and deployment |
| Model | Provider | Use Case |
|---|---|---|
| GPT-OSS 120B | Heroku | General-purpose queries |
| Nova Lite | Heroku | Fast responses |
| Nova Pro | Heroku | Balanced performance |
| Claude 3.5 Haiku | Anthropic (via Heroku) | Complex reasoning |
- Live Demo: study-mate.tech
- GitHub Repository: github.com/prime399/study-mate
- Report Issues: GitHub Issues
- Convex: docs.convex.dev
- Heroku Inference: devcenter.heroku.com/articles/heroku-inference
- MCP Protocol: GitHub MCP Examples
- Next.js: nextjs.org/docs
- Discussions: GitHub Discussions
- Feature Requests: GitHub Issues
- Heroku for Managed Inference platform and MCP infrastructure
- Convex for real-time database and backend services
- Vercel for seamless frontend hosting
- Anthropic for Claude models
- Open-source community for various tools and libraries
StudyMate AI - Enterprise-grade study management powered by Heroku Managed Inference and Model Context Protocol.
Built for students worldwide. Transform your study habits with intelligent AI assistance.