[Feat] Implement metrics enhancements and testing scripts for issue #699 #830
Issue #699 - Metrics Enhancements
Overview
This document outlines the changes made to address a subset of the requirements in issue #699, which involved enhancing the metrics functionality in MCP Gateway, and provides instructions for testing these enhancements.
Changes Implemented
1. Enhanced Metrics Calculation Functions
Updated the following functions in `mcpgateway/utils/metrics_common.py` (a hedged sketch of both helpers follows this list):
- `calculate_success_rate()`: improved to handle edge cases such as entities with no recorded executions.
- `format_response_time()`: enhanced formatting of response times for display.

2. Admin API Enhancements
Modified `mcpgateway/admin.py` to surface the enhanced metrics through the admin API.

3. Services Improvements
Updated metrics retrieval in the service modules to support the enhanced metrics.
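For reviewers, here is a minimal sketch of what the two helpers could look like; the exact signatures, return types, and formatting rules are assumptions for illustration, not the code in this PR.

```python
# Hypothetical sketch of the helpers in mcpgateway/utils/metrics_common.py.
# Signatures, return types, and formatting rules are assumptions, not the PR's actual code.
from typing import Optional


def calculate_success_rate(successful: int, total: int) -> Optional[float]:
    """Return the success rate as a percentage, or None when nothing has executed yet."""
    if total <= 0:  # edge case: entity has never been executed
        return None
    return round((successful / total) * 100.0, 2)


def format_response_time(seconds: Optional[float]) -> str:
    """Render a response time for display, tolerating missing values."""
    if seconds is None:  # edge case: no recorded executions yet
        return "N/A"
    return f"{seconds:.3f}s"
```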
How to Test
Follow these steps to verify the metrics enhancements:
Prerequisites
pip install -e .
Testing Core Metrics Functions
Run the unit tests for the metrics functions:
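For example, assuming pytest is the test runner (the test file name is taken from the Test Files Location section below):

```bash
pytest tests/test_metrics_functions.py -v
```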
This verifies the success-rate edge-case handling and the response-time formatting described above.
Testing with Manual Data
Start the MCP Gateway server with admin features enabled:
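For example (the environment variable names, module path, and port below are assumptions; start the gateway however you normally do, with the admin UI enabled):

```bash
# Assumed invocation – adjust variable names, module path, and port to your setup
MCPGATEWAY_UI_ENABLED=true \
MCPGATEWAY_ADMIN_API_ENABLED=true \
uvicorn mcpgateway.main:app --host 0.0.0.0 --port 4444
```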
Create test data using the provided script:
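Assuming the setup script listed under Test Files Location can be run directly:

```bash
python tests/test_setup_data.py   # or: pytest tests/test_setup_data.py
```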
This creates test tools with different metrics patterns.
Test the admin UI: open the metrics view and confirm the success rates and response times display correctly.
Test CSV export: export the metrics and confirm the same values appear in the CSV file.
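A quick command-line smoke test might look like this; the port, credentials, and export path are assumptions (check the admin UI for the actual export link):

```bash
# Hypothetical URLs/credentials – adjust to your deployment
curl -s -u admin:changeme http://localhost:4444/admin > /dev/null && echo "admin UI reachable"
curl -s -u admin:changeme -o metrics.csv http://localhost:4444/admin/metrics/export
```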
Verification Criteria
The issue is considered resolved when the success criteria for the metrics enhancements are met (see the success-criteria status update below).
Additional Notes
- The metric calculations are kept in the `metrics_common.py` utility file.

Test Files Location
- `tests/test_metrics_functions.py`
- `tests/test_setup_data.py`
Troubleshooting
If the server fails to start with database errors, reset the local database and recreate the test data.
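For example, assuming the default local SQLite database file (the filename is an assumption; check your DATABASE_URL):

```bash
# Remove the local SQLite file and recreate the test data
rm -f mcp.db
python tests/test_setup_data.py
```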
Newly added on Sep 9:

Prompt Usage Metrics (`mcpgateway/services/prompt_service.py`)
- Added a `_record_prompt_metric()` method.
- Updated the `get_prompt()` method to record metrics with a try-except-finally structure.

Server Interaction Metrics (`mcpgateway/federation/forward.py`)
- Added the `ServerMetric` import and the `time` module import.
- Added a `_record_server_metric()` method to the ForwardingService class.
- Updated the `_forward_to_gateway()` method to time each call with `time.monotonic()` and record the result.
🧪 TESTING RESULTS
Automated Test Results
Database Status
🚀 TECHNICAL IMPLEMENTATION DETAILS
1. Prompt Metrics Recording
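A hedged sketch of the recording pattern described above; the class structure, field names, and session handling are assumptions rather than the exact code in this PR.

```python
# Hypothetical sketch of prompt metric recording in PromptService.
# Helper names match the description above; everything else is illustrative.
import time
from typing import Optional


class PromptService:
    def _record_prompt_metric(self, db, prompt_id: int, elapsed: float,
                              success: bool, error: Optional[str]) -> None:
        """Persist one prompt usage record (sketch – real model fields may differ)."""
        ...

    async def _render_prompt(self, db, prompt_id: int, arguments: dict):
        """Hypothetical helper standing in for the real prompt rendering logic."""
        ...

    async def get_prompt(self, db, prompt_id: int, arguments: dict):
        start = time.monotonic()
        success, error = False, None
        try:
            result = await self._render_prompt(db, prompt_id, arguments)
            success = True
            return result
        except Exception as exc:
            error = str(exc)
            raise
        finally:
            # Runs on success and failure alike, so every call is measured.
            self._record_prompt_metric(db, prompt_id, time.monotonic() - start, success, error)
```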
2. Server Interaction Metrics Recording
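The same pattern applied to forwarding, again as a sketch: the `ServerMetric` persistence and the transport call are assumptions for illustration.

```python
# Hypothetical sketch of server interaction metric recording in ForwardingService.
# The ServerMetric persistence and the transport call are illustrative stubs.
import time


class ForwardingService:
    def _record_server_metric(self, gateway_id: str, elapsed: float,
                              success: bool, error: str = "") -> None:
        """Persist one ServerMetric row for the target gateway (sketch)."""
        ...

    async def _send(self, gateway_id: str, request: dict):
        """Hypothetical transport call to the federated gateway."""
        ...

    async def _forward_to_gateway(self, gateway_id: str, request: dict):
        start = time.monotonic()  # monotonic clock: immune to wall-clock adjustments
        success, error = False, ""
        try:
            response = await self._send(gateway_id, request)
            success = True
            return response
        except Exception as exc:
            error = str(exc)
            raise
        finally:
            self._record_server_metric(gateway_id, time.monotonic() - start, success, error)
```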
3. Integration Points
- `get_prompt()` method ✅
- `read_resource()` method ✅
- `_forward_to_gateway()` method ✅

📋 MANUAL TESTING INSTRUCTIONS
Prerequisites
Testing Steps
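One way to exercise the new metric paths by hand might be the following; the endpoint paths, prompt name, and credentials are assumptions, so adapt them to your deployment:

```bash
# Hypothetical endpoints/credentials – adjust to your deployment
curl -s -u admin:changeme http://localhost:4444/prompts                  # list available prompts
curl -s -u admin:changeme -X POST http://localhost:4444/prompts/example \
     -H "Content-Type: application/json" -d '{}'                         # fetching a prompt should record a prompt metric
```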
(Screen recording attached: Screen Recording 2025-09-09 at 12.16.58 AM)
🔍 SUCCESS CRITERIA STATUS UPDATE
✅ IMPLEMENTED (Code Level)
TESTING STATUS
📊 TEST DATA SOLUTION
RESOLVED: Comprehensive test data was created in the database.
🎯 CURRENT STATUS
CODE IMPLEMENTATION: COMPLETE ✅
All metrics recording functionality has been successfully implemented across all entity types.
FUNCTIONAL TESTING: READY ✅
Test data has been added to the database to enable proper testing of the metrics functionality.
REQUIREMENTS VERIFICATION
Ready for Testing: The functionality can now be properly tested with the seeded test data and the manual testing steps above.
NEXT STEPS FOR COMPLETE VERIFICATION