
Conversation

@10234567Z

  1. Removed exposed fallback API keys from nanda_adapter/core/agent_bridge.py and nanda_adapter/core/mcp_utils.py.

  2. Abstracted the LLM provider layer so that support for additional agent types can be added easily in the future.

  3. Added a Hugging Face Inference provider integration so that Hugging Face API keys work too.

  4. Fixed requirements.txt (the dependency listing file) by removing duplicate and unused dependencies.

Summary: The code is now secure, leaner, and abstracted for future extension, and it supports Hugging Face Inference for anyone who wants to run low-cost inference models.

Copilot AI review requested due to automatic review settings January 13, 2026 05:52

Copilot AI left a comment

Pull request overview

This PR adds multi-provider LLM support to the NANDA framework, removes exposed API keys from code, and optimizes dependencies. The changes introduce an abstraction layer that allows using either Anthropic Claude or Hugging Face Inference API, improving flexibility and security.

Changes:

  • Created new llm_providers.py module with provider abstraction supporting Anthropic and HuggingFace
  • Removed hardcoded fallback API keys from agent_bridge.py and mcp_utils.py
  • Updated requirements.txt to remove duplicates and add huggingface_hub dependency
  • Modified nanda.py to initialize and configure LLM providers with new optional parameters
  • Refactored mcp_utils.py and agent_bridge.py to use provider abstraction instead of direct Anthropic SDK calls
  • Updated documentation in README.md with HuggingFace usage examples and configuration details
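
The provider abstraction itself is not reproduced in this overview. As a rough illustration only, a layer like the one described might look as follows; all class, method, and environment-variable names here are assumptions for the sketch, not the PR's actual API:

```python
import os
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface so callers never depend on a specific vendor SDK."""

    name: str

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's text completion for the given prompt."""


class AnthropicProvider(LLMProvider):
    name = "anthropic"

    def __init__(self, api_key=None):
        # Read the key from the environment; never hardcode a fallback.
        self.api_key = api_key or os.environ["ANTHROPIC_API_KEY"]

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("call the Anthropic SDK here")


class HuggingFaceProvider(LLMProvider):
    name = "huggingface"

    def __init__(self, api_key=None):
        self.api_key = api_key or os.environ["HF_TOKEN"]

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("call huggingface_hub's InferenceClient here")


_PROVIDERS = {"anthropic": AnthropicProvider, "huggingface": HuggingFaceProvider}


def get_provider(name: str, **kwargs) -> LLMProvider:
    """Instantiate a provider by name, failing loudly on unknown names."""
    try:
        return _PROVIDERS[name](**kwargs)
    except KeyError:
        raise ValueError(f"unknown provider: {name}") from None
```

With this shape, call sites hold only an `LLMProvider` and adding a new backend means registering one new class, which is the extensibility the PR description claims.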

Reviewed changes

Copilot reviewed 7 out of 9 changed files in this pull request and generated 20 comments.

Show a summary per file

| File | Description |
| --- | --- |
| `requirements.txt` | Removed duplicate anthropic and pymongo entries; added huggingface_hub |
| `nanda_adapter/core/llm_providers.py` | New provider abstraction with AnthropicProvider and HuggingFaceProvider classes |
| `nanda_adapter/core/nanda.py` | Added LLM provider initialization with new parameters for provider selection |
| `nanda_adapter/core/mcp_utils.py` | Refactored to use the provider abstraction; handles both dict and object response formats |
| `nanda_adapter/core/agent_bridge.py` | Removed hardcoded API keys, updated to use the provider abstraction, fixed a typo |
| `nanda_adapter/core/__init__.py` | Exported new LLM provider classes and functions |
| `nanda_adapter/__init__.py` | Exported new LLM provider classes and functions |
| `README.md` | Added HuggingFace documentation and updated environment variable documentation |
| `.gitignore` | Added a wildcard pattern for `__pycache__` directories |


Comment on lines +202 to +206

```python
content.append({
    "type": "tool_use",
    "id": tool_call.id,
    "name": tool_call.function.name,
    "input": json.loads(tool_call.function.arguments) if isinstance(tool_call.function.arguments, str) else tool_call.function.arguments
```

Copilot AI Jan 13, 2026

The error handling for JSON parsing of tool call arguments could be improved. If json.loads fails but the arguments are already a dict, the code should handle this more robustly. Consider adding explicit error handling for the JSON parsing operation to prevent potential runtime errors when tool_call.function.arguments is neither a valid JSON string nor a dict.

Suggested change

```diff
-content.append({
-    "type": "tool_use",
-    "id": tool_call.id,
-    "name": tool_call.function.name,
-    "input": json.loads(tool_call.function.arguments) if isinstance(tool_call.function.arguments, str) else tool_call.function.arguments
+raw_arguments = tool_call.function.arguments
+# Robustly handle tool call arguments which may be a JSON string, dict, or other types
+if isinstance(raw_arguments, dict):
+    parsed_arguments = raw_arguments
+elif isinstance(raw_arguments, str):
+    try:
+        parsed_arguments = json.loads(raw_arguments)
+    except Exception:
+        # Fall back to the raw string if JSON parsing fails
+        parsed_arguments = raw_arguments
+else:
+    # For unexpected types, pass through as-is
+    parsed_arguments = raw_arguments
+content.append({
+    "type": "tool_use",
+    "id": tool_call.id,
+    "name": tool_call.function.name,
+    "input": parsed_arguments
```
Comment on lines +178 to 181

```python
if response_text:
    # Log the LLM response
    log_message(conversation_id, current_path, f"{provider.name} {agent_id}", response_text)
```

Copilot AI Jan 13, 2026

Inconsistent error handling for missing response_text. In the updated code at line 178, the function checks 'if response_text:' before logging, but still returns response_text regardless. If response_text is None (which can happen from provider.complete() failures), the function should return None explicitly or handle the None case more clearly for the caller.

Suggested change

```diff
-if response_text:
-    # Log the LLM response
-    log_message(conversation_id, current_path, f"{provider.name} {agent_id}", response_text)
+if response_text is None:
+    # Handle missing response_text from provider as a failure case
+    print(f"Agent {agent_id}: {provider.name} returned no response_text", flush=True)
+    return None
+# Log the LLM response
+log_message(conversation_id, current_path, f"{provider.name} {agent_id}", response_text)
```

Comment on lines +200 to +201

```python
import traceback
traceback.print_exc()
```

Copilot AI Jan 13, 2026

The traceback module is used at line 201 but is not imported at the top of the file. This will cause a NameError at runtime when an exception occurs. Add 'import traceback' to the imports section at the top of the file.
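
A module-level import sidesteps the problem for every handler in the file. As a small sketch (the helper name is illustrative, not from the PR), the traceback can also be captured as a string for the existing logging path instead of printed directly:

```python
import traceback  # module-level import, available to every except block below


def format_error(exc: BaseException) -> str:
    """Return the exception's formatted traceback as a single string."""
    return "".join(
        traceback.format_exception(type(exc), exc, exc.__traceback__)
    )
```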

```python
content = response.get("content", [])

# Handle different content formats (Anthropic raw vs unified)
if hasattr(content, '__iter__') and not isinstance(content, (str, dict)):
```

Copilot AI Jan 13, 2026

Potential issue with the iteration check. The condition `hasattr(content, '__iter__')` will be True for strings, since strings are iterable in Python. The additional check `not isinstance(content, (str, dict))` prevents this, but the logic could be clearer. Consider checking whether content is a list directly: `isinstance(content, list)` would be more explicit and readable.

Suggested change

```diff
-if hasattr(content, '__iter__') and not isinstance(content, (str, dict)):
+if isinstance(content, list):
```
Comment on lines +183 to +186

```python
if hasattr(content, '__iter__') and not isinstance(content, (str, dict)):
    blocks = content
else:
    blocks = [content] if content else []
```

Copilot AI Jan 13, 2026

Duplicate code pattern for content iteration. The same logic for checking if content is iterable and converting it to blocks appears at lines 110-113 and again at lines 183-186. This duplicated logic should be extracted into a helper function to improve maintainability.
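
A sketch of such a helper (the name `as_blocks` is illustrative, not from the PR) that both call sites could share:

```python
def as_blocks(content):
    """Normalize a provider response's content into a list of blocks.

    Accepts a list of blocks, a single block (e.g. a str or dict), or
    None/empty, mirroring the duplicated inline checks in mcp_utils.py.
    """
    if isinstance(content, list):
        return content
    return [content] if content else []
```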

Comment on lines +19 to +22

```python
    from .llm_providers import get_provider, init_provider
    from .mcp_utils import MCPClient
except ImportError:
    from llm_providers import get_provider, init_provider
```

Copilot AI Jan 13, 2026

Import of 'init_provider' is not used.

Suggested change

```diff
-    from .llm_providers import get_provider, init_provider
-    from .mcp_utils import MCPClient
-except ImportError:
-    from llm_providers import get_provider, init_provider
+    from .llm_providers import get_provider
+    from .mcp_utils import MCPClient
+except ImportError:
+    from llm_providers import get_provider
```
Comment on lines +19 to +22

```python
    from .llm_providers import get_provider, init_provider
    from .mcp_utils import MCPClient
except ImportError:
    from llm_providers import get_provider, init_provider
```

Copilot AI Jan 13, 2026

Import of 'init_provider' is not used.

Suggested change

```diff
-    from .llm_providers import get_provider, init_provider
-    from .mcp_utils import MCPClient
-except ImportError:
-    from llm_providers import get_provider, init_provider
+    from .llm_providers import get_provider
+    from .mcp_utils import MCPClient
+except ImportError:
+    from llm_providers import get_provider
```
Comment on lines 21 to 28

```python
    from .llm_providers import init_provider, get_provider
except ImportError:
    # If running from parent directory, add current directory to path
    current_dir = os.path.dirname(os.path.abspath(__file__))
    sys.path.insert(0, current_dir)
    from agent_bridge import *
    import run_ui_agent_https
    from llm_providers import init_provider, get_provider
```

Copilot AI Jan 13, 2026

Import of 'get_provider' is not used.

Suggested change

```diff
-    from .llm_providers import init_provider, get_provider
+    from .llm_providers import init_provider
 except ImportError:
     # If running from parent directory, add current directory to path
     current_dir = os.path.dirname(os.path.abspath(__file__))
     sys.path.insert(0, current_dir)
     from agent_bridge import *
     import run_ui_agent_https
-    from llm_providers import init_provider, get_provider
+    from llm_providers import init_provider
```
```diff
-SMITHERY_API_KEY = os.getenv("SMITHERY_API_KEY") or "bfcb8cec-9d56-4957-8156-bced0bfca532"
+SMITHERY_API_KEY = os.getenv("SMITHERY_API_KEY")
+if not SMITHERY_API_KEY:
+    print("WARNING: SMITHERY_API_KEY not set - Smithery MCP servers will not work")
```

Copilot AI Jan 13, 2026

Print statement may execute during import, so the warning is emitted as a side effect every time the module is loaded rather than only when the key is actually needed.
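
One way to avoid the import-time side effect is to defer reading the key until a caller actually needs it. A sketch only (the function name is illustrative, not from the PR):

```python
import os


def get_smithery_api_key():
    """Read SMITHERY_API_KEY lazily; warn only when a caller needs it."""
    key = os.getenv("SMITHERY_API_KEY")
    if not key:
        print("WARNING: SMITHERY_API_KEY not set - Smithery MCP servers will not work")
    return key
```

Alternatively, the `logging` module at WARNING level would keep the message out of importers' stdout entirely.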
