
FlowiseAI - Low Code LLM Apps Builder


Last updated 1 year ago

Steps to set up

  1. Open the CI pipeline

  2. Hit the Prepare Stage

  3. Set these four environment variables (choose your own values for FLOWISE_USER and FLOWISE_PASSWORD):

    • DATABASE_PATH = /home/user/app

    • DATABASE_TYPE = sqlite

    • FLOWISE_USER =

    • FLOWISE_PASSWORD =

  4. After the Prepare stage is finished, hit the Run stage

  5. Click on Open deployment
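
The Prepare-stage configuration above corresponds to exporting the same variables in a shell. A minimal sketch — the FLOWISE_USER and FLOWISE_PASSWORD values here are placeholders you should replace with your own credentials:

```shell
# The four variables from step 3, as shell exports.
export DATABASE_PATH=/home/user/app   # directory where Flowise keeps its database file
export DATABASE_TYPE=sqlite           # use the bundled SQLite backend
export FLOWISE_USER=admin             # placeholder — pick your own username
export FLOWISE_PASSWORD=changeme      # placeholder — pick your own password

# Quick sanity check that the variables are set.
echo "$DATABASE_TYPE"
```

With FLOWISE_USER and FLOWISE_PASSWORD set, Flowise requires basic-auth login on the deployed app, which keeps your instance from being publicly writable.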

Use a self-hosted LLM with Flowise

You can self-host an LLM on Arkane Cloud and use it in your FlowiseAI project. You can find the Llama2 API template here:

https://ide.arkanecloud.com/docs/templates/llama2-api
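
Once a chatflow is built and deployed, Flowise exposes it over its REST prediction endpoint, so other apps (including one backed by the self-hosted Llama2 API above) can query it. A hedged sketch using curl — the deployment URL and chatflow ID below are placeholders you must replace with your own values from the Flowise UI:

```shell
# Placeholders — substitute your own deployment URL and chatflow ID.
FLOWISE_URL="https://your-app.arkanecloud.com"   # hypothetical deployment URL
CHATFLOW_ID="your-chatflow-id"                   # copy the ID from your chatflow in Flowise

# Send a question to the chatflow via Flowise's prediction endpoint.
curl -s --max-time 10 -X POST "$FLOWISE_URL/api/v1/prediction/$CHATFLOW_ID" \
  -H "Content-Type: application/json" \
  -d '{"question": "Summarize what this chatflow does."}'
```

If you set FLOWISE_USER and FLOWISE_PASSWORD during setup, add `-u "$FLOWISE_USER:$FLOWISE_PASSWORD"` to the curl call so the request authenticates.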