Error: connect ECONNREFUSED 127.0.0.1:11434 #27

Open

LTtt456c opened this issue Dec 22, 2023 · 3 comments

Comments

@LTtt456c

Hello everyone!
My ollama runs in Docker. The command I use to start the ollama container is:

docker run -e OLLAMA_HOST=0.0.0.0:11434 -d -v ollama serve -p 11434:11434 --name ollama ollama/ollama

Then I open chatbot-ollama in VS Code, run npm run dev, and it reports an error.

Here is the error log:

PS G:\AI\chatbot-ollama> npm run dev

[email protected] dev
next dev

▲ Next.js 13.5.6

✓ Ready in 2.9s
○ Compiling / ...
✓ Compiled / in 3.3s (1652 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
✓ Compiled in 1699ms (1652 modules)
✓ Compiled in 519ms (1652 modules)
✓ Compiled /api/models in 245ms (68 modules)
[TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -4078,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}
✓ Compiled in 620ms (1720 modules)
[TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -4078,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}
[TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -4078,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}
[TypeError: fetch failed] {
  cause: [Error: connect ECONNREFUSED 127.0.0.1:11434] {
    errno: -4078,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '127.0.0.1',
    port: 11434
  }
}
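
Note: ECONNREFUSED here means nothing on the host is accepting connections on 127.0.0.1:11434, so the fetch from the Next.js API route never reaches ollama. Before touching chatbot-ollama, it is worth confirming that the container is actually up and the port is reachable from the host; the checks below assume the container is named ollama and uses the default port, as in the command above:

docker ps --filter name=ollama
curl http://127.0.0.1:11434/

If the container is running and the port is published, the curl call should answer with something like "Ollama is running"; if either check fails, the problem is on the Docker/ollama side rather than in chatbot-ollama.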

@ahmad-alkadri

ahmad-alkadri commented Jan 14, 2024

I couldn't get the ollama docker command you gave to run; are you sure it's the right command? Have you checked whether your ollama container is actually running?

FYI, according to the official Docker Hub page for ollama, the command for running its container (CPU only) would be:

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

I'd suggest reading the documentation on getting ollama up and running in Docker first.
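
Once a container started with that command is healthy, two more things are usually needed before chatbot-ollama can talk to it: a model pulled inside the container, and (only if the defaults were changed) the app pointed at the right base URL. A rough sketch; llama2 is just an example model, and OLLAMA_HOST is assumed to be the variable chatbot-ollama reads for the ollama base URL (check the project README):

docker exec -it ollama ollama pull llama2

Then, if you are not using the default, create a .env.local in the chatbot-ollama checkout containing:

OLLAMA_HOST=http://127.0.0.1:11434

and restart npm run dev.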

@Dispa1r

Dispa1r commented Mar 8, 2024

I have the same problem on Ubuntu 22.04 with Docker; adding --add-host=host.docker.internal:host-gateway doesn't work for me :(
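
A note on that flag: --add-host=host.docker.internal:host-gateway only helps when chatbot-ollama itself runs inside a container and has to reach an ollama server on the host, and the app then also has to be told to use that hostname instead of 127.0.0.1. A sketch under those assumptions (the image name ghcr.io/ivanfioravanti/chatbot-ollama:main and the OLLAMA_HOST variable are assumptions based on the project README; adjust if yours differ):

docker run -d -p 3000:3000 --add-host=host.docker.internal:host-gateway -e OLLAMA_HOST=http://host.docker.internal:11434 --name chatbot-ollama ghcr.io/ivanfioravanti/chatbot-ollama:main

On Linux, running the chatbot-ollama container with --network=host is a simpler alternative, since the container then shares the host's 127.0.0.1 and no extra host mapping is needed.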

@Dispa1r

Dispa1r commented Mar 20, 2024


In the end, I chose to deploy chatbot-ollama locally with yarn instead, and it works 😅
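
For reference, the local (non-Docker) route is roughly the following, assuming the repository is ivanfioravanti/chatbot-ollama and the standard Next.js scripts from its package.json; ollama still has to be listening on 127.0.0.1:11434, either natively or via the published Docker port:

git clone https://github.com/ivanfioravanti/chatbot-ollama.git
cd chatbot-ollama
yarn install
yarn dev

yarn dev serves the UI on http://localhost:3000, and the app's API routes forward requests to the local ollama instance.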
