How to Set Up DeepSeek Locally on an Ubuntu 22.04 Server With Ollama

A step-by-step guide on how to set up DeepSeek locally on an Ubuntu 22.04 server with Ollama

In this guide, we are going to learn how to set up DeepSeek locally or on a server running Ubuntu 22.04, using Ollama. Aimed at developers, researchers, and organizations looking for more control and privacy in AI-driven applications, Ollama facilitates the seamless deployment and management of LLMs on personal systems or within private networks.

We will use Open WebUI to access our DeepSeek instance. Open WebUI is an extensible, self-hosted AI interface that adapts to your workflow, all while operating entirely offline.

Prerequisites

  • An Ubuntu 22.04 server up and running
  • If the server is remote, the ability to SSH into it
  • Sudo access on the server
  • At least 8 GB of memory, 16 GB recommended
  • At least 4 CPUs; the more the better
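Before going further, you can sanity-check the machine against these requirements. A quick sketch using /proc/meminfo and nproc (the 8 GB / 4 CPU thresholds are the ones listed above):

```shell
# Quick prerequisite check: at least 8 GB of memory and 4 CPUs
mem_gb=$(awk '/MemTotal/ {printf "%.0f", $2 / 1024 / 1024}' /proc/meminfo)
cpus=$(nproc)
echo "Memory: ${mem_gb} GB, CPUs: ${cpus}"
[ "$mem_gb" -ge 8 ] || echo "WARNING: less than 8 GB of memory"
[ "$cpus" -ge 4 ] || echo "WARNING: fewer than 4 CPUs"
```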

Ensure the server is up to date

Before proceeding, ensure your server is up to date.

sudo apt update
sudo apt upgrade -y

Set the hostname on the server. Mine will be ai-beast.citizix.com.

sudo hostnamectl set-hostname ai-beast.citizix.com

Edit /etc/hosts and add the hostname so that it resolves locally.

sudo vim /etc/hosts

Update this line:

127.0.0.1 localhost ai-beast.citizix.com ai-beast

Install Python and Git

Install Python, Git, and pip. These are needed for Ollama and Open WebUI to run.

sudo apt install python3 python3-pip git -y

Confirm that the installed versions are correct:

$ python3 --version
Python 3.12.3

$ pip3 --version
pip 24.0 from /usr/lib/python3/dist-packages/pip (python 3.12)

$ git --version
git version 2.43.0

Install Ollama

Use this command to install Ollama:

curl -fsSL https://ollama.com/install.sh | sh

Then confirm installation:

ollama --version

Start the Ollama service:

sudo systemctl start ollama

Confirm Ollama is running:

status

$ sudo systemctl status ollama
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: enabled)
     Active: active (running) since Tue 2025-02-18 08:05:43 UTC; 58s ago
   Main PID: 21515 (ollama)
      Tasks: 7 (limit: 4586)
     Memory: 30.8M (peak: 31.2M)
        CPU: 80ms
     CGroup: /system.slice/ollama.service
             └─21515 /usr/local/bin/ollama serve

Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: Couldn't find '/usr/share/ollama/.ollama/id_ed25519'. Generating new private key.
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: Your new public key is:
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGgjIKB86+V3H5Fs8dFiOeryo5kiMCqDAySLlqFa26e5
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: 2025/02/18 08:05:43 routes.go:1186: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_>
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: time=2025-02-18T08:05:43.307Z level=INFO source=images.go:432 msg="total blobs: 0"
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: time=2025-02-18T08:05:43.308Z level=INFO source=images.go:439 msg="total unused blobs removed: 0"
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: time=2025-02-18T08:05:43.309Z level=INFO source=routes.go:1237 msg="Listening on 127.0.0.1:11434 (version 0.5.11)"
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: time=2025-02-18T08:05:43.316Z level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: time=2025-02-18T08:05:43.329Z level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
Feb 18 08:05:43 ai-beast.citizix.com ollama[21515]: time=2025-02-18T08:05:43.329Z level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compu>

Enable the Ollama service to start on boot.

sudo systemctl enable ollama
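As the status output above shows, Ollama only listens on 127.0.0.1:11434 by default. That is fine when Open WebUI runs on the same machine, but if you ever need to reach the Ollama API from other hosts, you can set the OLLAMA_HOST environment variable through a systemd drop-in. A sketch (keep the port appropriately firewalled if you do this):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

Then run sudo systemctl daemon-reload followed by sudo systemctl restart ollama to apply it.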

Download the DeepSeek model

Download and run the DeepSeek model, DeepSeek-R1-Distill-Qwen-7B:

ollama run deepseek-r1:7b

Exit the prompt with Ctrl+d.
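Once you exit, the model remains available through Ollama's HTTP API on 127.0.0.1:11434. A minimal sketch of a one-off request to the /api/generate endpoint (the prompt here is only an example, and the snippet falls back gracefully when the service is not reachable):

```shell
# One-off request to Ollama's /api/generate endpoint;
# "stream": false returns a single JSON object instead of a token stream
payload='{"model": "deepseek-r1:7b", "prompt": "Why is the sky blue?", "stream": false}'
if curl -sf http://localhost:11434/api/version > /dev/null; then
  curl -s http://localhost:11434/api/generate -d "$payload"
else
  echo "ollama is not reachable on localhost:11434"
fi
```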

List available models

ollama list

output

$ ollama list

NAME              ID              SIZE      MODIFIED
deepseek-r1:7b    0a8c26691023    4.7 GB    27 minutes ago

To explore more models, check the Ollama models page.

Set up Open WebUI for DeepSeek

Open WebUI is Python based, which means we need a Python environment to set it up. We will use a virtual environment (venv).

First, install the Python venv package:

sudo apt install python3-venv -y

Then create a virtual environment at ~/open-webui-venv:

python3 -m venv ~/open-webui-venv

Finally, activate the virtual environment:

source ~/open-webui-venv/bin/activate

Install Open WebUI:

pip install open-webui

Start Open WebUI:

open-webui serve

This will spin up the Open WebUI service, accessible at http://localhost:8080 or http://server_ip:8080.
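Note that open-webui serve only keeps running for as long as your terminal session does. If you want it to survive logouts and reboots, one option is a small systemd unit; a sketch, assuming the virtual environment lives at /home/ubuntu/open-webui-venv and the service runs as an ubuntu user (both are assumptions, adjust them to your setup):

```ini
# /etc/systemd/system/open-webui.service (user and paths are assumptions)
[Unit]
Description=Open WebUI
After=network-online.target ollama.service

[Service]
User=ubuntu
ExecStart=/home/ubuntu/open-webui-venv/bin/open-webui serve
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with sudo systemctl enable --now open-webui.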

Accessing Open WebUI

Once set up, you can load the UI in your browser. After the welcome screen, proceed to do the following:

  • Create an admin account when prompted
  • Select the model you installed from the drop-down
  • Start interacting with DeepSeek

Conclusion

In this guide, we set up our own instance of DeepSeek locally and configured Open WebUI for web access.
