Replies: 10 comments 41 replies
-
Same for me; the last version that worked was koboldcpp_rocm4allV3.exe. Any other version just throws access violation errors when loading any model if I use hipBLAS. Probably a ROCm 5.7-related issue. Maybe the approach "built by deleting the entire stock AMD ROCm 5.7.1 GPU code folder and replacing it with only ROCm-4-All-5.7.1 Tensile Library files before compiling" would help.
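For context, that quoted workaround is essentially a file swap done before compiling. A rough sketch of what it amounts to is below; both paths are placeholders, since the actual ROCm install locations aren't given anywhere in this thread:

```python
import shutil
from pathlib import Path

# Placeholder paths -- point these at your real stock ROCm 5.7.1 Tensile library
# folder and at the ROCm-4-All-5.7.1 Tensile Library files.
stock_tensile = Path(r"C:\path\to\stock\ROCm-5.7.1\rocblas\library")
rocm4all_tensile = Path(r"C:\path\to\ROCm-4-All-5.7.1\library")

if stock_tensile.exists():
    shutil.rmtree(stock_tensile)                  # delete the stock GPU code folder
shutil.copytree(rocm4all_tensile, stock_tensile)  # drop in only the ROCm-4-All Tensile files
# ...then rebuild koboldcpp against the modified ROCm install.
```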
-
Update: probably not a ROCm issue; I didn't realize that 6.1 had already dropped. Same issue here. But I think the problem could be Windows 11 itself after all. I noticed that after I download it, Windows Defender SmartScreen issues a warning that the app isn't trusted. It works after I allow it once. If I close it and start it again, I don't get any warnings anymore and I only get the access violation.
-
1.70 also doesn't work properly. Even running it via Python on Windows gives me the same access violation at 0x0000000 when loading any model. The .exe at least works once right after download. I still don't know what the issue is.
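For anyone who wants to reproduce the "via Python" launch path, something like the sketch below is what I mean. The model path and layer count are just placeholders, and the flags shown are the usual koboldcpp launch options (the ROCm fork reuses --usecublas for hipBLAS offload, as far as I can tell), so treat them as assumptions rather than exact repro settings:

```python
import subprocess
import sys

cmd = [
    sys.executable, "koboldcpp.py",              # run the script directly instead of the .exe
    "--model", "models/llama3-8b.Q4_K_M.gguf",   # placeholder model path
    "--usecublas",                               # hipBLAS/ROCm GPU offload
    "--gpulayers", "33",                         # placeholder layer count
    "--contextsize", "8192",
]

# Run in the foreground so the access-violation message (if any) stays visible.
subprocess.run(cmd, check=False)
```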
-
Can confirm. 1.70 yields the same access violation errors when loading models on Win11 23H2, as have all previous versions going back to v1.68. 1.67.0 Rocm4Allv3 was the last version to work on Windows.
-
Same here. I noticed there is some support for ROCm under WSL now, so I tried running it that way as well. While I didn't get the access violation message, it would just freeze, forcing me to kill the process. I'm using a 7900 XTX, if that helps. I have tried using DDU to remove and reinstall my GPU drivers, but that didn't seem to help. I was wondering whether trying the Pro drivers might make any difference. A little added info: this was with the most recent AMD drivers as of today (7/21/2024), 24.7.1, on Windows 11 Pro 23H2.
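If anyone else wants to rule out the WSL ROCm install itself before blaming koboldcpp, a quick check along these lines should confirm whether the runtime can even see the 7900 XTX (gfx1100). rocminfo ships with the standard ROCm packages; the script below is just a sketch of that sanity check, not something from the koboldcpp docs:

```python
import shutil
import subprocess

def rocm_sees_gpu(target="gfx1100"):  # gfx1100 = RDNA3, i.e. the 7900 XTX
    if shutil.which("rocminfo") is None:
        print("rocminfo not found; ROCm userspace is probably not installed in this WSL distro.")
        return False
    out = subprocess.run(["rocminfo"], capture_output=True, text=True)
    if target in out.stdout:
        print(f"{target} agent found; the ROCm runtime can see the GPU.")
        return True
    print("No matching GPU agent; the freeze may be a WSL/driver issue rather than koboldcpp itself.")
    return False

if __name__ == "__main__":
    rocm_sees_gpu()
```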
-
Strange, but my Llama 3 launches normally on Win11 with the latest updates and drivers. In any case, I'm here to ask how you're getting on with launching Llama 3.1. I haven't experimented with the settings yet, but 3.1 always crashes.
-
Just an FYI: the newest version, "KoboldCPP-v1.75.2.yr1-ROCm", seems to work!
-
I still cannot run any version higher than "koboldcpp_v1.72.yr0-rocm_6.1.2.exe" on my 7900 XT... I don't understand why the version you've tagged as "IGNORE" is the only one that has actually worked since 1.67.yr0_rocm4allV3.exe... If you have any explanation for this or know what I can do to use newer versions, I'd be incredibly appreciative.
-
For what it's worth, I'm on Windows 11 Pro 24H2.
-
This has been an ongoing problem since 1.68.0, but I had hoped somebody else would mention it, as I was fighting not to add 2FA to my account and thus wasn't allowed to comment or leave feedback. But there have been several releases and not a soul has mentioned it, so I finally gave in and am reporting the problem.
On Windows, no version after 1.67.0 actually loads Llama 3 / GGUF models. It quickly prints an access violation error on the screen and immediately closes on loading. I'm currently running Windows 11 23H2, on a Ryzen 7700X w/ 32GB RAM and a 7900 XT w/ 20GB VRAM. All of the 1.67.0 releases run perfectly for me, but every version after that crashes when I go to run a modern model.