Description
Part of: #663
[Conversation Reference: "Story 4: Log Export for External Analysis - As an administrator, I want to export filtered logs so that I can share them with support or analyze them externally"]
Story Overview
Objective: Implement log export functionality allowing administrators to download filtered/searched logs to file in JSON and CSV formats via Web UI, REST API, and MCP API.
User Value: Administrators can export logs for offline analysis, share with support teams, or import into external tools (spreadsheets, log analysis platforms) without manual copy-paste or direct database access.
Acceptance Criteria Summary: Export current filtered/searched logs; JSON and CSV format support; export via Web UI button, REST API endpoint, and MCP API tool.
Acceptance Criteria
AC1: Web UI Export Button
Scenario: Administrator exports logs from Web UI
Given logs are displayed in the Logs tab (with or without filters)
And an Export button is visible
When I click "Export"
Then a format selection appears (JSON or CSV)
And when I select a format
Then a file download begins
And the file contains all currently filtered log entries
Technical Requirements:
- Add Export button to Logs tab UI
- Show format selection dropdown/modal (JSON, CSV)
- Trigger browser file download
- Include all current filters in export
- Filename includes timestamp: logs_YYYYMMDD_HHMMSS.json/csv
- Show loading indicator during export generation
- Handle large exports gracefully (streaming if needed)
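The timestamped-filename requirement above can be sketched in a few lines (names here are illustrative, not from the implementation):

```python
# Minimal sketch of the logs_YYYYMMDD_HHMMSS.<ext> naming requirement.
from datetime import datetime, timezone

def export_filename(fmt: str) -> str:
    """Build a filename like logs_20250102_153000.json."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    return f"logs_{stamp}.{fmt}"
```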
AC2: JSON Export Format
Scenario: Administrator exports logs as JSON
Given I choose to export logs
When I select JSON format
Then the downloaded file is valid JSON
And contains an array of log objects
And each object has: timestamp, level, source, message, correlation_id, user_id, request_path
And the JSON is human-readable (formatted with indentation)
Technical Requirements:
- Generate valid JSON array of log objects
- Include all log fields in each object
- Use ISO 8601 timestamp format
- Pretty-print with 2-space indentation
- Include metadata header with export timestamp and filter info
- UTF-8 encoding
AC3: CSV Export Format
Scenario: Administrator exports logs as CSV
Given I choose to export logs
When I select CSV format
Then the downloaded file is valid CSV
And has headers: timestamp, level, source, message, correlation_id, user_id, request_path
And each log entry is a row
And the CSV is Excel-compatible
Technical Requirements:
- Generate valid CSV with header row
- Include all log fields as columns
- Properly escape special characters (commas, quotes, newlines)
- Use UTF-8 encoding with BOM for Excel compatibility
- Handle multi-line messages appropriately
- ISO 8601 timestamp format
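The escaping requirement above is what the stdlib `csv` module provides out of the box: fields containing commas, quotes, or newlines are quoted, and embedded quotes are doubled, which Excel parses correctly. A quick demonstration (sample message is made up):

```python
# Fields with commas, quotes, and newlines are quoted automatically.
import csv
import io

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["2025-01-02T15:29:55Z", "ERROR",
                 'failed: "token" expired,\nretry scheduled'])
row = out.getvalue()
```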
AC4: REST API Export Endpoint
Scenario: Administrator exports logs via REST API
Given I have admin authentication credentials
When I send GET /admin/api/logs/export?format=json&search=SSO&level=ERROR
Then I receive a downloadable file
And the file contains logs matching the filter criteria
And the Content-Disposition header triggers a file download
Technical Requirements:
- Create GET /admin/api/logs/export endpoint
- Require admin authentication
- Support query parameters: format (json/csv), search, level, correlation_id
- Set Content-Type: application/json or text/csv
- Set Content-Disposition: attachment; filename="logs_YYYYMMDD_HHMMSS.format"
- Support date range filtering (optional enhancement)
- Stream large exports to avoid memory issues
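The streaming requirement above can be sketched as a chunked CSV generator instead of one giant string. `rows` is a hypothetical iterable of log dicts; with FastAPI this generator would typically be wrapped in a `StreamingResponse(stream_csv(rows), media_type="text/csv")`:

```python
# Hedged sketch: emit CSV in chunks to keep memory flat for large exports.
import csv
import io

FIELDS = ['timestamp', 'level', 'source', 'message',
          'correlation_id', 'user_id', 'request_path']

def stream_csv(rows):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, extrasaction='ignore')
    yield '\ufeff'                    # UTF-8 BOM for Excel
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
        yield buf.getvalue()          # flush accumulated chunk
        buf.seek(0)
        buf.truncate(0)
    yield buf.getvalue()              # header-only output when rows is empty
```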
AC5: MCP API Export Tool
Scenario: Administrator exports logs via MCP API
Given I have admin authentication credentials
When I call admin_logs_export tool with format="json" and filters
Then I receive the log data in requested format
And the data can be saved to a file
Technical Requirements:
- Create admin_logs_export MCP tool
- Require admin credentials
- Support parameters: format, search, level, correlation_id
- Return raw data (JSON string or CSV string)
- For large exports, return data path instead of inline data
- Include export metadata (count, filters applied, timestamp)
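The "data path instead of inline data" requirement could look like the sketch below. `MAX_INLINE_BYTES` is an assumed threshold, not specified in this story, and `package_export` is a hypothetical helper name:

```python
# Hypothetical sketch: inline small payloads, spill large ones to a temp file.
import os
import tempfile

MAX_INLINE_BYTES = 1_000_000  # assumption, not from the story

def package_export(data: str, fmt: str) -> dict:
    encoded = data.encode("utf-8")
    if len(encoded) <= MAX_INLINE_BYTES:
        return {"data": data}
    fd, path = tempfile.mkstemp(suffix=f".{fmt}")
    with os.fdopen(fd, "wb") as f:
        f.write(encoded)
    return {"data_path": path}
```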
AC6: Filtered Export Accuracy
Scenario: Export matches current view filters
Given I have applied filters (search="OAuth", level="ERROR")
And the UI shows 47 matching log entries
When I export the logs
Then the exported file contains exactly 47 entries
And all entries match the filter criteria
And no entries outside the filter are included
Technical Requirements:
- Export respects all active filters
- Export includes filter metadata in output
- Web UI export passes current filter state to backend
- REST/MCP export uses same filter logic as query endpoints
- Verify export count matches UI count
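The count-accuracy requirement above boils down to: export and query must share one filter path, so the metadata count always equals the number of exported entries. A stdlib-only sketch with a stand-in filter (field names mirror the story's log schema; `filter_logs` is hypothetical):

```python
# Sketch: export count must equal the filtered-entry count, never the total.
import json

def filter_logs(logs, search=None, level=None):
    out = logs
    if level is not None:
        out = [entry for entry in out if entry["level"] == level]
    if search is not None:
        out = [entry for entry in out if search in entry["message"]]
    return out

logs = [
    {"level": "ERROR", "message": "OAuth token rejected"},
    {"level": "INFO",  "message": "OAuth token issued"},
    {"level": "ERROR", "message": "disk full"},
]
filtered = filter_logs(logs, search="OAuth", level="ERROR")
export = json.dumps({"metadata": {"count": len(filtered)}, "logs": filtered})
```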
Implementation Status
Progress Tracking:
- Core implementation complete
- Unit tests passing (X/Y tests)
- Integration tests passing (X/Y tests)
- E2E tests passing (X/Y tests)
- Code review approved
- Manual E2E testing completed by Claude Code
- Documentation updated
Completion: 0/Y tasks complete (0%)
Technical Implementation Details
Component Structure
src/cidx_server/
  logging/
    log_aggregator.py        # Add export methods
    export_formatter.py      # NEW: JSON/CSV formatting
  web/
    routes.py                # Add export download route
    templates/
      admin/
        logs.html            # Add Export button
        _export_modal.html   # NEW: Format selection modal
  api/
    admin_routes.py          # Add /admin/api/logs/export endpoint
  mcp/
    admin_tools.py           # Add admin_logs_export tool
Export Formatter Implementation
# export_formatter.py
import csv
import io
import json
from datetime import datetime
from typing import Dict, List


class LogExportFormatter:
    @staticmethod
    def to_json(logs: List[Dict], filters: Dict) -> str:
        """Export logs as formatted JSON with an export-metadata header."""
        export_data = {
            "metadata": {
                "exported_at": datetime.utcnow().isoformat(),
                "filters": filters,
                "count": len(logs)
            },
            "logs": logs
        }
        return json.dumps(export_data, indent=2, default=str)

    @staticmethod
    def to_csv(logs: List[Dict]) -> str:
        """Export logs as CSV with Excel-compatible encoding."""
        output = io.StringIO()
        writer = csv.DictWriter(
            output,
            fieldnames=['timestamp', 'level', 'source', 'message',
                        'correlation_id', 'user_id', 'request_path'],
            extrasaction='ignore'
        )
        writer.writeheader()
        writer.writerows(logs)
        # Prepend UTF-8 BOM so Excel detects the encoding
        return '\ufeff' + output.getvalue()
REST API Export Endpoint
from datetime import datetime
from typing import Literal, Optional

from fastapi import Depends, Response


@router.get("/admin/api/logs/export")
async def export_logs(
    format: Literal["json", "csv"] = "json",
    search: Optional[str] = None,
    level: Optional[str] = None,
    correlation_id: Optional[str] = None,
    current_user: User = Depends(get_admin_user)
):
    """Export filtered logs to file."""
    # Query all matching logs (no pagination for export)
    logs = log_aggregator.query_logs_all(
        search=search,
        levels=level.split(",") if level else None,
        correlation_id=correlation_id
    )
    # Format output
    formatter = LogExportFormatter()
    if format == "json":
        content = formatter.to_json(logs, filters={"search": search, "level": level})
        media_type = "application/json"
    else:
        content = formatter.to_csv(logs)
        media_type = "text/csv"
    # Generate timestamped filename, e.g. logs_20250102_153000.json
    timestamp = datetime.utcnow().strftime("%Y%m%d_%H%M%S")
    filename = f"logs_{timestamp}.{format}"
    return Response(
        content=content,
        media_type=media_type,
        headers={
            "Content-Disposition": f'attachment; filename="{filename}"'
        }
    )
MCP Tool Implementation
@mcp_tool
async def admin_logs_export(
    format: str = "json",
    search: Optional[str] = None,
    level: Optional[List[str]] = None,
    correlation_id: Optional[str] = None
) -> Dict:
    """Export filtered logs in specified format."""
    logs = log_aggregator.query_logs_all(
        search=search,
        levels=level,
        correlation_id=correlation_id
    )
    formatter = LogExportFormatter()
    if format == "json":
        data = formatter.to_json(logs, filters={"search": search, "level": level})
    else:
        data = formatter.to_csv(logs)
    return {
        "format": format,
        "count": len(logs),
        "data": data,
        "filters": {"search": search, "level": level, "correlation_id": correlation_id}
    }
JSON Export Format
{
  "metadata": {
    "exported_at": "2025-01-02T15:30:00Z",
    "filters": {
      "search": "SSO",
      "level": "ERROR"
    },
    "count": 47
  },
  "logs": [
    {
      "timestamp": "2025-01-02T15:29:55Z",
      "level": "ERROR",
      "source": "auth.oidc",
      "message": "SSO authentication failed: invalid token",
      "correlation_id": "550e8400-e29b-41d4-a716-446655440000",
      "user_id": "[email protected]",
      "request_path": "/auth/sso/callback"
    }
  ]
}
CSV Export Format
timestamp,level,source,message,correlation_id,user_id,request_path
2025-01-02T15:29:55Z,ERROR,auth.oidc,"SSO authentication failed: invalid token",550e8400-e29b-41d4-a716-446655440000,[email protected],/auth/sso/callback
Testing Requirements
Unit Test Coverage
- LogExportFormatter.to_json produces valid JSON
- LogExportFormatter.to_json includes metadata
- LogExportFormatter.to_csv produces valid CSV
- LogExportFormatter.to_csv escapes special characters
- LogExportFormatter.to_csv includes BOM for Excel
- Empty log list produces valid empty export
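The BOM, escaping, and empty-export checks above can be expressed as stdlib-only assertions. The `to_csv` below is a minimal inline re-implementation mirroring the `LogExportFormatter` sketch in this story, not the production code:

```python
# Stdlib-only sketch of the CSV formatter unit checks.
import csv
import io

def to_csv(logs):
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["timestamp", "level", "message"],
                            extrasaction="ignore")
    writer.writeheader()
    writer.writerows(logs)
    return "\ufeff" + out.getvalue()   # BOM for Excel

empty_csv = to_csv([])                 # empty export is still valid CSV
tricky_csv = to_csv([{"timestamp": "t", "level": "ERROR",
                      "message": 'a "b", c\nd'}])
```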
Integration Test Coverage
- REST export endpoint returns correct Content-Type
- REST export endpoint returns correct Content-Disposition
- REST export with filters returns filtered data only
- MCP export tool returns data in requested format
- Large export (10000+ logs) completes successfully
E2E Test Coverage
- Filter logs in UI, click Export, verify file downloads
- Export as JSON, verify file is valid JSON
- Export as CSV, open in Excel, verify formatting
- Query REST export endpoint, verify downloadable file
- Call MCP export tool, verify data format
Performance Requirements
Response Time Targets
- Export 1000 logs: <5 seconds
- Export 10000 logs: <15 seconds
- Export generation: <10 seconds for 30 days of logs
Resource Requirements
- Memory: Stream large exports to avoid memory spikes
- Storage: Temporary file for very large exports (>100MB)
- Network: Chunked transfer encoding for large downloads
Error Handling Specifications
User-Friendly Error Messages
"Export failed. Please try again or contact support."
"No logs match your filters. Adjust filters before exporting."
"Export file too large. Please narrow your filter criteria."
Recovery Guidance
- Empty export: Inform user no data matches filters
- Export timeout: Suggest narrower date range or filters
- Format error: Suggest alternative format
Definition of Done
Functional Completion
- All acceptance criteria satisfied with evidence
- All technical requirements implemented
- Web UI Export button functional with format selection
- JSON export produces valid, formatted JSON
- CSV export produces Excel-compatible CSV
- REST API export endpoint functional
- MCP API export tool functional
- Exports respect all filter criteria
Quality Validation
- >90% test coverage achieved
- All tests passing (unit, integration, E2E)
- Code review approved
- Manual testing validated with evidence
- Performance benchmarks met
Integration Readiness
- Story delivers working, deployable software
- Full vertical slice implemented
- No broken functionality
- Documentation complete
Story Points: Medium
Priority: High (P2)
Dependencies: Story 1 (Log Viewing) and Story 2 (Search/Filtering) must be complete
Success Metric: Administrators can export filtered logs in JSON/CSV format via any interface in <10 seconds