🚀 Elevate Your Computing Game!
The NVIDIA HPE Tesla P40 24GB Computational Accelerator is a powerful, certified refurbished GPU designed for high-performance computing. With roughly 12 TFLOPS of peak single-precision performance, 3840 CUDA cores, and 24GB of GDDR5 memory, it is well suited to demanding compute workloads. Backed by a 90-day warranty, this accelerator is compatible with HPE ProLiant DL380 Gen9 and XL190r systems, making it a reliable choice for professionals seeking top-tier computational power.
Compatible Devices | Desktop |
Graphics Ram Type | GDDR5 |
Graphics Coprocessor | NVIDIA Tesla P40 |
Graphics Card Ram | 24 GB |
D**N
Great 24GB card for AI
Works great for AI. The CUDA drivers are supported and installed easily. The only downside is that these cards are intended for servers, so there is no fan. The card seems to be a bit over half the speed of my 3090s in LM Studio.
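For anyone verifying a similar setup, here is a minimal sketch, assuming PyTorch with CUDA support is already installed, that simply confirms the driver stack can see the P40 and how much VRAM it reports; the exact printout is illustrative only.

```python
# Check that the CUDA driver/runtime can see the Tesla P40.
# Assumes a CUDA-enabled build of PyTorch is installed.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, "
              f"{props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No CUDA device visible -- check the driver installation.")
```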
M**.
needs aftermarket cooler to work
It does the job, but it's only about a third of the speed of a 3060. I like that it has 24GB of VRAM. I had to get aftermarket cooling to keep it under control.
N**B
Works with llama-cpp-python
It's great for LLMs, ChatGPT-style workloads, and whatnot. While it's only 11.76 TFLOPS FP32, the 24GB of VRAM is a big help for loading models. You do need to pay a bit more for electricity: 250W! :) Works with PrivateGPT.
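As a point of reference, here is a minimal sketch of GPU-offloaded inference with llama-cpp-python, assuming it was built with CUDA support and that the GGUF model path below (which is hypothetical) points at a model you have already downloaded.

```python
# Run a quantized GGUF model with all layers offloaded to the P40.
# Assumes llama-cpp-python was installed with CUDA acceleration enabled.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b.Q4_K_M.gguf",  # hypothetical path
    n_gpu_layers=-1,  # offload every layer into the 24GB of VRAM
    n_ctx=4096,       # context window size
)

out = llm("Q: What is a Tesla P40 useful for? A:", max_tokens=64)
print(out["choices"][0]["text"])
```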
Z**S
One was good, one died right after the return period.
The one that works is great; the other one reports "infoROM is corrupted at gpu". It worked just long enough to get past the return date.
A**R
It works for me.
I installed it in a workstation and configured the necessary software stack to run inference on LLM models such as LLaMA 2 7B.
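For illustration, here is a minimal sketch of that kind of inference run using the Hugging Face transformers library; the model id below is an assumption (the official Llama 2 repos are gated), and any causal LM that fits in 24GB of VRAM works the same way.

```python
# Load a 7B-class model in FP16 so it fits in the P40's 24GB of VRAM
# and generate a short completion. Assumes torch and transformers are
# installed and the model weights are accessible.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed; gated repo
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~14 GB of weights, fits in 24 GB
).to("cuda")

inputs = tok("Explain what a Tesla P40 is.", return_tensors="pt").to("cuda")
out = model.generate(**inputs, max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))
```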
E**Y
Perfect
I originally thought it wasn't working or wasn't compatible with my server until I changed the cable. I ordered another cable online just to try again, and wow, it worked perfectly. Thank you!
A**R
Need to enable Above 4G Decoding in the BIOS
The card did not work when first installed; the motherboard was showing a GPU fault. I found a post that said to enable Above 4G Decoding in the BIOS. I enabled that and the card started working.
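Once the BIOS setting is changed, a quick way to confirm the card enumerates is to query it through NVML; this is a minimal sketch assuming the NVIDIA driver and the nvidia-ml-py (pynvml) package are installed.

```python
# List every GPU the NVIDIA driver can see, with its name and memory.
# Assumes the nvidia-ml-py package (imported as pynvml) is installed.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i}: {name}, {mem.total / 1024**3:.1f} GiB total VRAM")
pynvml.nvmlShutdown()
```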