System.TypeLoadException: 'Could not load type 'LLama.Native.NativeApi' from assembly 'LLamaSharp, #1119


Open
CommanderLake opened this issue Mar 1, 2025 · 2 comments
Labels
stale Stale issue will be autoclosed soon

Comments


CommanderLake commented Mar 1, 2025

I built my own llama.cpp b4743, which works fine with LM Studio, but when loading a model with LLamaSharp I get this:
System.TypeLoadException: 'Could not load type 'LLama.Native.NativeApi' from assembly 'LLamaSharp, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' because the method 'llama_backend_free' has no implementation (no RVA).'

llama_backend_free exists in the DLL with the correct signature and is properly exported, and both the native library and my application are compiled as x64.

I am limited to Visual Studio 2017.
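To rule out a missing export, it can help to check the symbol table of the native library directly. Below is a minimal, illustrative sketch (not part of LLamaSharp) that uses Python's ctypes to test whether a shared library actually exports a given symbol; `has_symbol` and the library path are assumptions for the example, not anything from the project.

```python
import ctypes


def has_symbol(library_path: str, symbol: str) -> bool:
    """Return True if the shared library at `library_path` exports `symbol`.

    ctypes resolves symbols lazily via attribute access, so a missing
    export raises AttributeError rather than failing at load time.
    """
    lib = ctypes.CDLL(library_path)
    try:
        getattr(lib, symbol)
        return True
    except AttributeError:
        return False


# Example (hypothetical path): has_symbol("./llama.dll", "llama_backend_free")
```

On Windows, `dumpbin /EXPORTS llama.dll` from a Developer Command Prompt gives the same information. Note that in this issue the export did exist; the "no RVA" error came from a version mismatch between the managed and native layers, as the maintainer explains below.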

@martindevans
Member

Any given version of LLamaSharp only works with exactly one version of llama.cpp. Check the table at the bottom of the readme for the exact version you need.
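In practice the simplest fix is to let NuGet supply a matching native binary rather than building llama.cpp yourself. A minimal sketch of the project file, assuming the CPU backend package and using placeholder version numbers (check the readme's compatibility table for the real pairing):

```xml
<ItemGroup>
  <!-- Placeholder versions: the LLamaSharp package and its backend package
       must be the same release, which in turn pins the llama.cpp build. -->
  <PackageReference Include="LLamaSharp" Version="x.y.z" />
  <PackageReference Include="LLamaSharp.Backend.Cpu" Version="x.y.z" />
</ItemGroup>
```

If you do ship a custom llama.cpp DLL, it must be built from exactly the commit the installed LLamaSharp version targets; otherwise P/Invoke signatures such as `llama_backend_free` can fail to bind, producing the "no RVA" TypeLoadException above.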


This issue has been automatically marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days.

@github-actions github-actions bot added the stale Stale issue will be autoclosed soon label May 12, 2025