
Send the GPT-SECRET-KEY to local LLMs / improve decision making between local and remote LLM? #578


@DaiAoMori

I tried to connect Mantella to a local liteLLM proxy (basically to route requests directly to Anthropic without the need to go through OpenRouter).

I had my local liteLLM set up to use a manually created secret key, because why not. But for "some" reason, Mantella ignored GPT-SECRET-KEY.txt and just sent "abc123" instead.

The (obvious) workaround was to change my self-generated key to abc123, but I still wondered why this was happening. I found this code fragment in "Mantella/tree/main/src/llm/client_base.py":

    if 'https' in self._base_url: # Cloud LLM
        self._is_local: bool = False
        api_key = ClientBase._get_api_key(secret_key_files)
        if api_key:
            self._api_key = api_key
        else:
            self._api_key: str = 'abc123'
    else: # Local LLM
        self._is_local: bool = True
        self._api_key: str = 'abc123'

Since my proxy's base URL is plain http, the 'https' check puts it on the local branch, so the key file is never even read. I wonder if there is a better way to figure out whether an LLM is local - or whether it would make even more sense to always use the key configured in the secret key file instead of falling back to "abc123".
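
Something along these lines is what I had in mind - just a rough sketch, not Mantella's actual code, and the helper name is made up:

    from urllib.parse import urlparse

    # Hosts we would consider "local"; a deployment might extend this set.
    LOCAL_HOSTS = {'localhost', '127.0.0.1', '0.0.0.0', '::1'}

    def is_local_endpoint(base_url: str) -> bool:
        """Decide locality from the hostname instead of the URL scheme."""
        host = urlparse(base_url).hostname or ''
        return host in LOCAL_HOSTS

    # Hypothetical usage inside __init__, reusing the names from the
    # fragment above; the key file wins whenever it holds a value:
    #     self._is_local = is_local_endpoint(self._base_url)
    #     api_key = ClientBase._get_api_key(secret_key_files)
    #     self._api_key = api_key if api_key else 'abc123'

A reverse-proxied local server with a public hostname would still slip through a check like this, which is why always honouring the key file seems like the simpler fix to me.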

I also wonder whether other behaviours depend on the _is_local property.
