This document helps you deploy liblab.ai on an Ubuntu 22.04 EC2 instance, using NGINX + Let's Encrypt TLS, and a systemd service to run the app reliably on boot.
- Launch Ubuntu 22.04 (2 vCPU / 4–8 GB RAM, ~60 GB storage).
- Security group: open ports 22 (SSH), 3000 (initial setup), and later 80/443 (HTTP/HTTPS).
- SSH in:

```bash
ssh -i your-key.pem ubuntu@<EC2_PUBLIC_IP>
```
```bash
sudo apt update
sudo apt install -y docker.io docker-compose-v2 curl nginx certbot python3-certbot-nginx
sudo systemctl enable --now docker nginx
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt install -y nodejs
```
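Before continuing, it can help to confirm that every tool the rest of the guide relies on actually landed on the `PATH`. A minimal sanity-check sketch:

```shell
#!/usr/bin/env bash
# Sanity check: report any tool from the steps above that is missing.
missing=0
for cmd in docker node npm nginx certbot curl; do
  if ! command -v "$cmd" >/dev/null 2>&1; then
    echo "missing: $cmd"
    missing=1
  fi
done
if [ "$missing" -eq 0 ]; then
  echo "all tools present"
fi
```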
```bash
sudo npm install -g pnpm
git clone https://github.com/liblaber/ai.git
cd ai
```
```bash
pnpm run setup
```

This command:

- Generates `AUTH_SECRET` and `ENCRYPTION_KEY`
- Prompts for LLM provider, model, and API key (format is enforced)
- Optionally captures `NETLIFY_AUTH_TOKEN`
Note: If you put a domain or reverse proxy in front of the app, set `BASE_URL` in your `.env` to the public hostname (the domain you will use), e.g. `BASE_URL=https://your-domain.com`. This ensures generated links and redirects use the correct public URL instead of localhost or the instance IP.
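For example, a small sketch that sets (or updates) `BASE_URL` in `.env` without duplicating the key; the domain here is a placeholder:

```shell
#!/usr/bin/env bash
# Set (or update) BASE_URL in .env idempotently.
ENV_FILE=".env"
BASE_URL="https://your-domain.com"   # placeholder: use your real domain

touch "$ENV_FILE"
if grep -q '^BASE_URL=' "$ENV_FILE"; then
  # Key exists: rewrite it in place.
  sed -i "s|^BASE_URL=.*|BASE_URL=${BASE_URL}|" "$ENV_FILE"
else
  # Key absent: append it.
  echo "BASE_URL=${BASE_URL}" >> "$ENV_FILE"
fi
grep '^BASE_URL=' "$ENV_FILE"   # → BASE_URL=https://your-domain.com
```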
```bash
docker compose -f docker-compose.db.yml up -d
```

Or set `DATABASE_URL` in `.env` to point at RDS.
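If you go the RDS route, the `.env` entry looks like this; the host, user, password, and database names below are placeholders, not real values:

```ini
# Placeholders: substitute your RDS endpoint and credentials
DATABASE_URL=postgresql://liblab:CHANGE_ME@your-db.abc123.us-east-1.rds.amazonaws.com:5432/liblab
```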
```bash
pnpm run quickstart
```

Verify: http://<EC2_PUBLIC_IP>:3000/
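Rather than refreshing the browser while the app boots, you can poll until it answers. A small sketch (the URL and retry counts are illustrative):

```shell
#!/usr/bin/env bash
# Poll a URL until it returns an HTTP success status, or give up.
wait_for_app() {
  url="$1"
  tries="${2:-30}"   # default: 30 attempts
  delay="${3:-2}"    # default: 2 s between attempts
  i=1
  while [ "$i" -le "$tries" ]; do
    if curl -fsS -o /dev/null "$url"; then
      echo "up after $i attempt(s)"
      return 0
    fi
    sleep "$delay"
    i=$((i + 1))
  done
  echo "gave up after $tries attempts"
  return 1
}

# On the instance:
# wait_for_app http://localhost:3000/
```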
a. Create NGINX site file:

```bash
sudo tee /etc/nginx/sites-available/liblab.ai <<EOF
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host \$host;
        proxy_set_header X-Real-IP \$remote_addr;
        proxy_set_header X-Forwarded-For \$proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto \$scheme;
    }
}
EOF
sudo ln -s /etc/nginx/sites-available/liblab.ai /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```

b. Obtain & configure TLS with Certbot:

```bash
sudo certbot --nginx -d your-domain.com
```

This enables HTTPS and sets up automatic certificate renewal.
```bash
sudo tee /etc/systemd/system/liblab-ai.service <<EOF
[Unit]
Description=liblab.ai service
After=network.target docker.service
Wants=docker.service

[Service]
WorkingDirectory=/home/ubuntu/ai
ExecStart=/usr/local/bin/pnpm run quickstart
Restart=on-failure
User=ubuntu
Environment=NODE_ENV=production

[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable liblab-ai
sudo systemctl start liblab-ai
```

This ensures liblab.ai auto-starts on reboot and restarts on failure. Confirm the pnpm path with `which pnpm` and adjust `ExecStart` if it differs; then check the service with `sudo systemctl status liblab-ai` and follow logs with `journalctl -u liblab-ai -f`.
| Step | Action |
|---|---|
| 1 | Launch EC2 + open ports 22 & 3000 |
| 2 | Install Docker, Node, pnpm, NGINX, Certbot |
| 3 | Clone & pnpm run setup (prompts for provider/key) |
| 4 | (Optional) Launch Postgres via Docker |
| 5 | pnpm run quickstart and test at port 3000 |
| 6 | Configure NGINX & execute Certbot to enable HTTPS |
| 7 | Add systemd unit for auto-start on reboot |
| 8 | Visit https://your-domain.com to access the app |
- Store secrets (API keys, JWT secrets) in AWS SSM or Secrets Manager and inject them at runtime.
- Use RDS Postgres and configure `DATABASE_URL`.
- Optionally deploy via ECS/Fargate for scaling.
- Set up logging and monitoring (e.g., CloudWatch, metrics dashboards).
- Restrict direct access to port 3000 by binding the app to the local interface only, so all traffic goes through NGINX.
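One way to do the last item, assuming the compose file publishes port 3000 with a plain `"3000:3000"` mapping (service and file names here depend on your setup), is to bind the published port to the loopback interface; alternatively, simply remove port 3000 from the security group once HTTPS works:

```yaml
# Sketch: publish the container port on localhost only, so the app is
# reachable through NGINX but not directly from the internet.
services:
  app:
    ports:
      - "127.0.0.1:3000:3000"
```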