Description
09:06:48 ⬢ [Docker] ➜ ./bin/llama-bench -n 20 -p 512 -r 50 -m /home/lichenguang25/.ollama/models/blobs/sha256-6f96e01a3f550ca08aea1e5725bb8d5a7eccc6f281c30417e9d380b8c46467bd -ngl 99
| model | size | params | backend | ngl | test | t/s |
| ------------------------------ | ---------: | ---------: | ---------- | --: | --------------: | -------------------: |
/home/lichenguang25/github/llama.cpp/ggml/src/ggml-cann/ggml-cann.cpp:67: CANN error
[New LWP 649047]
[New LWP 649046]
[New LWP 649045]
[New LWP 649044]
[New LWP 649043]
[New LWP 649042]
[New LWP 649041]
[New LWP 649040]
[New LWP 648955]
[New LWP 648954]
[New LWP 648953]
[New LWP 648952]
[New LWP 648951]
[New LWP 648932]
[New LWP 648931]
[New LWP 648930]
This GDB supports auto-downloading debuginfo from the following URLs:
<https://debuginfod.ubuntu.com>
Enable debuginfod for this session? (y or [n]) [answered N; input not from terminal]
Debuginfod has been disabled.
To make this setting permanent, add 'set debuginfod enabled off' to .gdbinit.
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/aarch64-linux-gnu/libthread_db.so.1".
0x0000ffffb11a7ab4 in wait4 () from /lib/aarch64-linux-gnu/libc.so.6
#0 0x0000ffffb11a7ab4 in wait4 () from /lib/aarch64-linux-gnu/libc.so.6
#1 0x0000ffffb16254fc in ggml_print_backtrace () from /home/lichenguang25/github/llama.cpp/build/bin/libggml-base.so.0
#2 0x0000ffffb16256a0 in ggml_abort () from /home/lichenguang25/github/llama.cpp/build/bin/libggml-base.so.0
#3 0x0000ffffb0f29f4c in ggml_cann_error(char const*, char const*, char const*, int, char const*) () from /home/lichenguang25/github/llama.cpp/build/bin/libggml-cann.so.0
#4 0x0000ffffb0f1cd88 in ggml_cann_rope(ggml_backend_cann_context&, ggml_tensor*) () from /home/lichenguang25/github/llama.cpp/build/bin/libggml-cann.so.0
#5 0x0000ffffb0f31994 in ggml_backend_cann_graph_compute(ggml_backend*, ggml_cgraph*) () from /home/lichenguang25/github/llama.cpp/build/bin/libggml-cann.so.0
#6 0x0000ffffb16401a4 in ggml_backend_sched_graph_compute_async () from /home/lichenguang25/github/llama.cpp/build/bin/libggml-base.so.0
#7 0x0000ffffb177dbdc in llama_context::graph_compute(ggml_cgraph*, bool) () from /home/lichenguang25/github/llama.cpp/build/bin/libllama.so.0
#8 0x0000ffffb177f640 in llama_context::process_ubatch(llama_ubatch const&, llm_graph_type, llama_memory_context_i*, ggml_status&) () from /home/lichenguang25/github/llama.cpp/build/bin/libllama.so.0
#9 0x0000ffffb1787060 in llama_context::decode(llama_batch const&) () from /home/lichenguang25/github/llama.cpp/build/bin/libllama.so.0
#10 0x0000ffffb17889c0 in llama_decode () from /home/lichenguang25/github/llama.cpp/build/bin/libllama.so.0
#11 0x0000aaaab6e57168 in test_prompt(llama_context*, int, int, int) ()
#12 0x0000aaaab6e530a0 in main ()
[Inferior 1 (process 648929) detached]
Aborted
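The backtrace resolves to function names but not source lines (e.g. the `CANN error` assertion at `ggml-cann.cpp:67` fires inside `ggml_cann_rope`). A hedged sketch for getting a fully symbolized trace, assuming a CMake build with the CANN backend enabled as in llama.cpp's CANN build docs (`-DGGML_CANN=on` is taken from those docs; the model path is the one from the report above):

```shell
# Rebuild llama.cpp with debug symbols so GDB can resolve source lines
# in libggml-cann.so and llama-bench itself.
cmake -B build -DGGML_CANN=on -DCMAKE_BUILD_TYPE=Debug
cmake --build build -j

# Re-run the failing benchmark from the report; the abort should now
# print a backtrace with file:line information for each frame.
./build/bin/llama-bench -n 20 -p 512 -r 50 \
  -m /home/lichenguang25/.ollama/models/blobs/sha256-6f96e01a3f550ca08aea1e5725bb8d5a7eccc6f281c30417e9d380b8c46467bd \
  -ngl 99
```

This is a debugging aid only, not a fix; the underlying failure is the CANN runtime error raised while computing the ROPE operator on this model.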