Allow sideloading backends from the filesystem #5917

@arenekosreal

Description

Is your feature request related to a problem? Please describe.

I am trying to package LocalAI for Arch Linux, so that LocalAI can be installed and run directly on Arch Linux with its files managed by pacman. However, it is hard to package backends like llama-cpp, piper, bark, etc. LocalAI supports installing custom backends through the ocifile protocol, but that requires tools like docker/podman/buildah to create an archive, and a command then has to be run manually to install it. In addition, the library dependencies of tarred backends cannot be tracked easily: because the backend files sit inside a tar archive, they cannot be inspected directly by packaging check tools.

Describe the solution you'd like

Allowing LocalAI to automatically sideload backends that already exist on the filesystem would solve this. For example, the llama-cpp backend could be loaded from /path/to/local-ai/backend/cpp/llama-cpp/package/ if the required files exist there, where /path/to/local-ai is the directory that contains the local-ai binary itself. Note that if /path/to/local-ai is the git repository, this is exactly where the built backend files sit before they are shipped into the Docker image, so we could even run a freshly built local-ai and its backends directly inside LocalAI's git repository, which would make it easier and quicker to test changes to those backends while developing.

However, you cannot assume the sideloaded backend directory is writable: it may be a path like /usr/lib/local-ai/backend/cpp/llama-cpp/package/, which only root has permission to modify and which should be managed by the package manager on a normal Linux system. So metadata.json may need to be written to some other, writable place. My current thought is that we still write metadata.json to ./backends/llama-cpp@sideloaded, but add a new key named sideload_path whose value is /path/to/local-ai/backend/cpp/llama-cpp/package/.
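To make the idea concrete, here is a minimal sketch in Go (LocalAI's language) of what detection and registration could look like. The names `detectSideloadedBackend`, `registerSideloadedBackend`, `BackendMetadata`, and the exact metadata.json layout are assumptions for illustration only, not LocalAI's actual internal API:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
)

// BackendMetadata is a hypothetical shape for metadata.json; the only
// addition over what exists today is the proposed sideload_path key.
type BackendMetadata struct {
	Name         string `json:"name"`
	SideloadPath string `json:"sideload_path,omitempty"`
}

// detectSideloadedBackend looks for a (possibly read-only) backend shipped
// next to the local-ai binary, e.g. <bindir>/backend/cpp/<name>/package/.
func detectSideloadedBackend(name string) (string, error) {
	exe, err := os.Executable()
	if err != nil {
		return "", err
	}
	dir := filepath.Join(filepath.Dir(exe), "backend", "cpp", name, "package")
	if _, err := os.Stat(dir); err != nil {
		return "", fmt.Errorf("backend %q not found at %s: %w", name, dir, err)
	}
	return dir, nil
}

// registerSideloadedBackend writes metadata.json into a writable backends
// directory (e.g. ./backends/<name>@sideloaded) while the backend files
// themselves stay in the read-only sideload path.
func registerSideloadedBackend(backendsDir, name, sideloadPath string) error {
	target := filepath.Join(backendsDir, name+"@sideloaded")
	if err := os.MkdirAll(target, 0o755); err != nil {
		return err
	}
	meta := BackendMetadata{Name: name, SideloadPath: sideloadPath}
	data, err := json.MarshalIndent(meta, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(filepath.Join(target, "metadata.json"), data, 0o644)
}

func main() {
	path, err := detectSideloadedBackend("llama-cpp")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if err := registerSideloadedBackend("./backends", "llama-cpp", path); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("registered sideloaded backend from", path)
}
```

With this split, ./backends/llama-cpp@sideloaded/metadata.json only holds the name and a sideload_path pointing back at the read-only package directory, so the files owned by the package manager are never modified.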

Describe alternatives you've considered

N/A

Additional context

This does not mean we should remove the ocifile protocol install method. It actually works very well when a backend has to be installed in special scenarios, such as environments without network access.
