Ollama errors on older versions of Linux/GLIBC on 0.5.13 #9506

Open
@scomper

Description

What is the issue?

After updating to Ollama 0.5.13, running it on CentOS Linux release 7.9.2009 (Core) results in the following errors:

ollama: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by ollama)  
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.25' not found (required by ollama)  
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by ollama)  
ollama: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by ollama)  
ollama: /lib64/libstdc++.so.6: version `CXXABI_1.3.11' not found (required by ollama)  
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by ollama)  
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by ollama)  

This indicates that the system's glibc and libstdc++ versions are too old to satisfy Ollama's dependencies. Could you please provide guidance on how to resolve this issue, or consider adding support for older Linux distributions like CentOS 7?
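For anyone checking whether a machine is affected, the versions glibc and libstdc++ actually provide can be listed like this (the /lib64 path matches the errors above and is standard on CentOS 7; strings is part of binutils):

ldd --version
strings /lib64/libstdc++.so.6 | grep ^GLIBCXX

For reference, CentOS 7 ships glibc 2.17, well below the 2.27 the errors ask for.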

Relevant log output

OS

CentOS Linux release 7.9.2009 (Core)

GPU

No response

CPU

No response

Ollama version

0.5.13

Activity

lfandrh commented on Mar 5, 2025

I encountered the same problem. It seems the glibc version does not meet the requirement (2.27 is needed), but upgrading glibc is complicated and can destabilize the system.

leslie2046 commented on Mar 5, 2025

+1

jmorganca self-assigned this on Mar 5, 2025

jmorganca (Member) commented on Mar 5, 2025

Hi folks, so sorry about the error. 0.5.13 requires a newer version of Linux/glibc (starting with CentOS/RHEL 8) – we'll see if we can revisit this to lower the requirements.

jmorganca changed the title from "CentOS 7.9 encountered an error when running Ollama after updating from 0.5.12 to 0.5.13" to "Ollama errors on older versions of Linux/GLIBC on 0.5.13" on Mar 5, 2025

jmorganca (Member) commented on Mar 5, 2025

In the meantime, the previous version can be downloaded with:

curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.12 sh
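If the install succeeds, running ollama --version afterwards should report 0.5.12.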

DirtyKnightForVi commented on Mar 6, 2025

> In the meantime the previous version can be downloaded with curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.12 sh

Same errors, but I had to go back to 0.5.10. Maybe it's not caused by the Linux/glibc version.

Yaunghow commented on Mar 7, 2025

I have run into the same problem.

duffybelfield commented on Mar 7, 2025

+1

Yaunghow commented on Mar 7, 2025

Hey, I found an alternative way to solve this problem: pull the Docker image of Ollama directly, without needing to change any system environment configuration. You can find more detail at https://ollama.com/blog/ollama-is-now-available-as-an-official-docker-image. Hope this helps!
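For CPU-only use, the basic command from that post looks like the following (the named volume persists downloaded models across container restarts, and 11434 is the usual API port; adjust both as needed):

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama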

duffybelfield commented on Mar 7, 2025

It's the same issue on the latest tag; I'm using the 0.5.12 container.
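Pinning the image tag instead of latest keeps you on the working version (assuming the 0.5.12 tag is published for your architecture on Docker Hub):

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:0.5.12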

brushknight commented on Mar 10, 2025

I have a similar issue with the latest ollama docker image:

docker run --rm -it --runtime=nvidia --gpus all ollama/ollama
Couldn't find '/root/.ollama/id_ed25519'. Generating new private key.
Your new public key is:

ssh-ed25519 Redacted

2025/03/10 11:59:12 routes.go:1215: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/root/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-03-10T11:59:12.733Z level=INFO source=images.go:432 msg="total blobs: 0"
time=2025-03-10T11:59:12.733Z level=INFO source=images.go:439 msg="total unused blobs removed: 0"
time=2025-03-10T11:59:12.734Z level=INFO source=routes.go:1277 msg="Listening on [::]:11434 (version 0.5.13)"
time=2025-03-10T11:59:12.734Z level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-03-10T11:59:12.736Z level=INFO source=gpu.go:612 msg="Unable to load cudart library /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1: Unable to load /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1 library to query for Nvidia GPUs: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_gpu.so)"
time=2025-03-10T11:59:12.738Z level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
time=2025-03-10T11:59:12.738Z level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="61.4 GiB" available="49.2 GiB"

Here is the error line:

time=2025-03-10T11:59:12.736Z level=INFO source=gpu.go:612 msg="Unable to load cudart library /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1: Unable to load /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1 library to query for Nvidia GPUs: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_gpu.so)"

Here is the host glibc:

ldd --version
ldd (Ubuntu GLIBC 2.35-0ubuntu3) 2.35
Copyright (C) 2022 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.

Running Ubuntu 22.04 on an Orin AGX:

cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04 (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy

Do you have any recommendations for resolving this issue without changing the host environment? Should we build this container from source on that machine, or is there some other workaround? Thank you in advance!

Update: the last image that works fine on arm64 is 0.5.7.
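Until a fixed image ships, pinning that tag (assuming the arm64 build of 0.5.7 is still published on Docker Hub) would look like:

docker run --rm -it --runtime=nvidia --gpus all ollama/ollama:0.5.7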
