Conversation

@lopuhin lopuhin (Contributor) commented Apr 5, 2025

Unfortunately they return logprobs in a different format, which requires loading a tokenizer to work with them.
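A minimal sketch of what that conversion might look like. This is not the actual MLX or eli5 API: the vocabulary, function name, and the (token_id, logprob) pair format are all assumptions made for illustration; the point is only that id-keyed logprobs need a tokenizer pass before the token strings can be shown.

```python
# Toy stand-in for a real tokenizer vocabulary (hypothetical; a real
# backend would load this from the model's tokenizer files).
TOY_VOCAB = {0: "Hello", 1: ",", 2: " world"}

def decode_logprobs(id_logprobs):
    """Map (token_id, logprob) pairs to (token_text, logprob) pairs.

    Backends that key logprobs by token id rather than token text
    need a step like this before the values can be aligned with
    the generated text.
    """
    return [(TOY_VOCAB[token_id], lp) for token_id, lp in id_logprobs]

pairs = decode_logprobs([(0, -0.1), (2, -1.5)])
# pairs == [("Hello", -0.1), (" world", -1.5)]
```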

@lopuhin lopuhin requested a review from Copilot April 5, 2025 22:15
Copilot AI left a comment


Copilot reviewed 3 out of 5 changed files in this pull request and generated no comments.

Files not reviewed (2)
  • docs/source/_notebooks/explain_llm_logprobs.rst: Language not supported
  • tox.ini: Language not supported
Comments suppressed due to low confidence (1)

eli5/llm/explain_prediction.py:127

  • Using assert here may be unsafe in production if Python is run in optimized mode. Consider replacing it with an explicit validation that raises a ValueError if the lengths do not match.
assert len(logprobs.token_logprobs) == len(logprobs.tokens)
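For context, a sketch of the explicit validation Copilot suggests. The `LogProbs` dataclass here is a hypothetical stand-in mirroring the attribute names visible in the asserted line, not the actual class from `eli5/llm/explain_prediction.py`:

```python
from dataclasses import dataclass

@dataclass
class LogProbs:
    # Hypothetical container; attribute names taken from the assert above.
    tokens: list
    token_logprobs: list

def validate_lengths(logprobs: LogProbs) -> None:
    # Unlike `assert`, this check is not stripped when Python runs
    # in optimized mode (`python -O`).
    if len(logprobs.token_logprobs) != len(logprobs.tokens):
        raise ValueError(
            f"token_logprobs length {len(logprobs.token_logprobs)} "
            f"does not match tokens length {len(logprobs.tokens)}"
        )
```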

@lopuhin lopuhin merged commit 34aca39 into master Apr 6, 2025
9 checks passed
@lopuhin lopuhin deleted the llm-mlx branch April 6, 2025 09:20
@lopuhin lopuhin (Contributor Author) commented Apr 7, 2025

