LocalAI

LocalAI is the free, Open Source OpenAI alternative. This guide shows how to connect LocalAI to ONLYOFFICE editors on Linux.

Please note that installation via Docker is also possible. To learn more, please refer to the official LocalAI guide on how to do that.

Step 1: Installing LocalAI

Hardware requirements
  • CPU: a multicore processor.
  • RAM: a minimum of 8 GB is required; 16 GB or more is recommended.
  • Storage: SSD storage is recommended. A minimum of 20 GB is required.
  • Network: LocalAI functions without an internet connection. However, a reliable connection is advised for downloading models and applying updates.
Installation
curl https://localai.io/install.sh | sh
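Since the one-liner above pipes a remote script straight into your shell, you may prefer to download and review the script first; an equivalent sketch:

```shell
# Download the installer, inspect it, then run it
# (same result as the one-liner above)
curl -fsSL https://localai.io/install.sh -o install.sh
less install.sh   # review before executing
sh install.sh
```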

Step 2: Installing the required model

Use the following command to install the required model:

local-ai models install name_of_the_model

where name_of_the_model is the name of the model you need.

Learn more about the available models on the official LocalAI website.
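You can also browse the model gallery directly from the command line before installing; for example:

```shell
# List the models available in the LocalAI gallery
local-ai models list
# Then install one by the name shown in the list
local-ai models install name_of_the_model
```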

Step 3: Launching LocalAI with the --cors flag

This step is necessary if you want to use your AI assistant not only locally but also over the web.

local-ai run --cors

Alternatively, you can add the following line to the /etc/localai.env file:

CORSALLOWEDORIGINS="*"
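If you want to allow only your ONLYOFFICE host rather than any origin, the same setting can name it explicitly (the host below is an example, and the service name assumes the default installer setup):

```shell
# /etc/localai.env — restrict CORS to one origin instead of "*"
CORSALLOWEDORIGINS="https://office.example.com"

# After editing the file, restart LocalAI so the change takes effect
sudo systemctl restart local-ai
```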

Step 4: Configuring the AI plugin settings in ONLYOFFICE

  1. Please refer to our configuration guide for the initial setup.
  2. Specify LocalAI as the provider name.
  3. Enter the URL http://127.0.0.1:8080 if you did not change the port when launching local-ai.
  4. The models will load into the list automatically; select the one you installed in step 2 of this guide.
  5. Click OK to save the settings.
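Before saving, you can confirm that LocalAI is reachable from the machine running the editors; the models endpoint below is part of the OpenAI-compatible API that LocalAI exposes (assuming the default port 8080):

```shell
# Returns a JSON list of installed models if LocalAI is running
curl http://127.0.0.1:8080/v1/models
```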

