Blog

Use Cloudflare Tunnels to Access your Homelab

In our last blog post, we looked at how to set up an AI server on a Mac mini and access it within our homelab. In today's post, we will expose that server through a Cloudflare Tunnel so we can reach it from anywhere.
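Once the tunnel is in place, a quick way to confirm the setup is to request the public hostname and check that the service behind it answers. Below is a minimal sketch using only the Python standard library; the hostname `ai.example.com` is a placeholder, not from the post, and should be replaced with the public hostname configured for your own tunnel.

```python
# Quick reachability check for a service exposed via Cloudflare Tunnel.
# The hostname below is a placeholder -- replace it with the public
# hostname you configured for your tunnel.
import urllib.request

PUBLIC_URL = "https://ai.example.com"  # hypothetical tunnel hostname


def check_tunnel(url: str, timeout: float = 10.0) -> None:
    """Send a GET request and report the HTTP status code."""
    with urllib.request.urlopen(url, timeout=timeout) as response:
        print(f"{url} responded with HTTP {response.status}")


if __name__ == "__main__":
    check_tunnel(PUBLIC_URL)
```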


Mac mini as AI Server

In today's blog post, we will explore how to run a local AI server on a Mac mini. We will use Ollama to run a large language model locally and Open WebUI as the web interface to access Ollama.
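Ollama exposes a local HTTP API (port 11434 by default), which is what Open WebUI talks to behind the scenes. Below is a minimal sketch of calling that API directly from Python; the model name `llama3` is an assumption rather than something specified in the post, so use whichever model you have pulled.

```python
# Minimal example of prompting Ollama's local HTTP API directly.
# Ollama listens on port 11434 by default; the model name is an
# assumption -- use whichever model you have pulled (e.g. `ollama pull llama3`).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def ask(prompt: str, model: str = "llama3") -> str:
    """Send a single non-streaming prompt to Ollama and return the reply text."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]


if __name__ == "__main__":
    print(ask("Explain what a homelab is in one sentence."))
```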
