Closed
Labels
bug (Something isn't working)
Description
What happened?
import litellm
from pprint import pprint

# get_model_info returns a full info dict, not just a token limit
model_info = litellm.get_model_info("gpt-4")
pprint(model_info)
It now returns max_input_tokens and max_output_tokens, but the docstring does not mention this. Return the same fields for the huggingface custom_llm_provider and update the docstring.
Relevant log output
{'input_cost_per_token': 3e-05,
'litellm_provider': 'openai',
'max_input_tokens': 8192,
'max_output_tokens': 4096,
'max_tokens': 4096,
'mode': 'chat',
'output_cost_per_token': 6e-05,
'supported_openai_params': ['frequency_penalty',
'logit_bias',
'logprobs',
'top_logprobs',
'max_tokens',
'n',
'presence_penalty',
'seed',
'stop',
'stream',
'stream_options',
'temperature',
'top_p',
'tools',
'tool_choice',
'function_call',
'functions',
'max_retries',
'extra_headers',
'user'],
'supports_function_calling': True}
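The distinction matters when sizing prompts: in the output above, max_tokens mirrors max_output_tokens (4096), while the context window is max_input_tokens (8192). A minimal sketch of reading both fields, using values copied from the log output above (the .get fallback is illustrative, not litellm's own logic):

```python
# Subset of the model-info dict shown in the log output above.
info = {
    "max_tokens": 4096,
    "max_input_tokens": 8192,
    "max_output_tokens": 4096,
}

# Prefer the new, explicit keys; fall back to the legacy max_tokens.
context_window = info.get("max_input_tokens", info["max_tokens"])
completion_limit = info.get("max_output_tokens", info["max_tokens"])
print(context_window, completion_limit)  # 8192 4096
```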