What is better for quant modeling: CPU or GPU?
     

MatNat


Total Posts: 1
Joined: Jul 2022
 
Posted: 2022-07-01 11:37
Do you think GPUs are a good technology for the finance/quant industry - particularly pricing, risk modelling, XVA, etc.?
I've heard many times from practitioners that a GPU is 1000x faster than a CPU, and I'm trying to understand where that performance is supposed to come from.

My rough calculation comparing the top-of-the-line NVIDIA V100 GPU with the top offering from Intel seems to conclude that you'd need two V100s to match one Intel server (based on 2x Intel Xeon Platinum 9282 processors).
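
For context, this is the kind of peak-FLOPS arithmetic I'm doing (spec-sheet numbers only; the AVX-512 all-core clock is my own guess, and neither side sustains its peak in practice):

# Back-of-envelope peak FP64 throughput from spec-sheet numbers.
# The AVX-512 all-core clock is an assumption; sustained clocks vary a lot.

def xeon_peak_tflops(cores, avx512_ghz, fma_units=2, vector_doubles=8):
    # FLOPs per cycle per core = vector width * FMA units * 2 (multiply + add)
    flops_per_cycle = vector_doubles * fma_units * 2
    return cores * avx512_ghz * flops_per_cycle / 1e3

# 2x Xeon Platinum 9282: 2 sockets x 56 cores, assumed ~2.6 GHz under AVX-512 load
cpu_server = 2 * xeon_peak_tflops(cores=56, avx512_ghz=2.6)

# NVIDIA V100: roughly 7.8 TFLOPS FP64 per the datasheet
v100 = 7.8

print(f"2x Xeon 9282 peak FP64: {cpu_server:.1f} TFLOPS")
print(f"1x V100 peak FP64:      {v100:.1f} TFLOPS")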



Did I get something wrong?

Also, taking memory capacity into account: with 32GB on the V100, that's about 4 billion double-precision values, so a cube of 1000 Monte-Carlo paths x 1000 trades x 1000 time points (10^9 nodes) fills the card as soon as you keep a few values per node. On a CPU server, having a few TB of RAM is the norm...
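
To put numbers on that (my own rough arithmetic, assuming double precision and that the whole cube is materialised on the card at once):

# Rough memory footprint for an XVA-style Monte-Carlo cube, double precision.
BYTES_PER_DOUBLE = 8

def cube_gb(paths, trades, time_points, values_per_node=1):
    nodes = paths * trades * time_points
    return nodes * values_per_node * BYTES_PER_DOUBLE / 1e9

# 1000 paths x 1000 trades x 1000 time points, one double per node
print(f"{cube_gb(1000, 1000, 1000):.0f} GB with 1 value per node")
# keep four state variables per node and a 32 GB V100 is already full
print(f"{cube_gb(1000, 1000, 1000, values_per_node=4):.0f} GB with 4 values per node")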

What is your experience with this?