Is it possible to allocate actual RAM as VRAM for a GPU?

This is only possible if you are using an iGPU.
For discrete GPUs, the only way to get more VRAM is to buy a card with more of it.
 
This happens already as and when needed.
It also wouldn't help much: DDR and GDDR differ enough in design (GDDR trades latency for much higher bandwidth) that system RAM is far too slow for the workloads of a dGPU. The spill-over into system memory exists mainly as a fallback to prevent an out-of-memory scenario.
Ever tried playing a recent AAA game at 2K or 4K with Ultra textures on a card with only 3-4 GB of VRAM? The amount of stutter you get shows how slow things become once your VRAM is full and data starts swapping between system memory and VRAM.
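To put rough numbers on the DDR-vs-GDDR gap, here is a back-of-the-envelope sketch. The figures are theoretical peak bandwidths (dual-channel DDR4-3600, and the 19 Gbps/pin, 384-bit GDDR6X bus of a 3080 Ti); real sustained numbers will be lower, and the exact values depend on the specific kit and card.

```python
# Rough peak-bandwidth comparison: dual-channel DDR4-3600 system RAM
# vs the GDDR6X on an RTX 3080 Ti. Theoretical maxima only.

def ddr_bandwidth_gbs(mt_per_s, bus_bytes=8, channels=2):
    """Peak bandwidth in GB/s for DDR system memory (64-bit channels)."""
    return mt_per_s * bus_bytes * channels / 1e3

def gddr_bandwidth_gbs(gbps_per_pin, bus_width_bits):
    """Peak bandwidth in GB/s for a GDDR memory bus."""
    return gbps_per_pin * bus_width_bits / 8

ddr4 = ddr_bandwidth_gbs(3600)          # dual-channel DDR4-3600
gddr6x = gddr_bandwidth_gbs(19, 384)    # 3080 Ti: 19 Gbps pins, 384-bit bus

print(f"DDR4-3600 dual channel: {ddr4:.1f} GB/s")   # 57.6 GB/s
print(f"3080 Ti GDDR6X:         {gddr6x:.1f} GB/s") # 912.0 GB/s
print(f"Ratio: ~{gddr6x / ddr4:.0f}x")
```

And any data the GPU pulls from system RAM has to cross PCIe first (roughly 32 GB/s for a PCIe 4.0 x16 link), which is an even tighter bottleneck than the DDR itself.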
 
I have a computer with 128 GB of RAM at 3600 MHz and I was wondering if I could use, say, half of it as VRAM?

Card is 3080 Ti
VRAM borrowed from DDR system memory is always going to be slow. Shader compilation during gaming will also load the CPU harder; it's nothing major, but it does add stress. The 3080 Ti has its own VRAM, which should be plenty for your needs (single user), though I'm not sure whether you have any other requirements.
 
Of course it's not gaming related, but it was just a use-case example; I wanted to point out that DDR != GDDR in terms of performance.

Well, it's not purely a performance issue. It's more a question of whether the GPU has any way (a protocol, for fellow nerds) to treat system memory as its own VRAM, and the answer is no. A GPU can read and write system memory over PCIe (that's how data gets uploaded in the first place), but that path is far too slow and indirect for system RAM to be presented as extra VRAM. However, it can be made to look like that's possible: GPU memory can be swapped out to system RAM to emulate it (paging/swapping is the technical term, and the OS/driver does this by default). It's just like a swap partition/file on an HDD that "increases" RAM (not actually, of course).
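The paging behaviour described above can be sketched with a toy model. This is illustrative only; real drivers use far more sophisticated residency tracking, and all names here are made up.

```python
from collections import OrderedDict

class ToyVram:
    """Toy model of VRAM oversubscription: when the card's memory is
    full, the least-recently-used buffer is evicted to system RAM, and
    touching it again faults it back in (slowly, over PCIe)."""

    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.vram = OrderedDict()   # resident buffers: name -> size in MB
        self.sysram = {}            # buffers evicted to system RAM
        self.faults = 0             # times a buffer had to be copied back

    def touch(self, name, size_mb):
        if name in self.vram:
            self.vram.move_to_end(name)          # mark as recently used
            return
        if name in self.sysram:                  # page fault: copy back over PCIe
            size_mb = self.sysram.pop(name)
            self.faults += 1
        while sum(self.vram.values()) + size_mb > self.capacity:
            victim, vsize = self.vram.popitem(last=False)  # evict LRU buffer
            self.sysram[victim] = vsize
        self.vram[name] = size_mb

vram = ToyVram(capacity_mb=4096)   # a 4 GB card
vram.touch("textures_a", 3000)
vram.touch("textures_b", 3000)     # evicts textures_a to system RAM
vram.touch("textures_a", 3000)     # fault: stutter while it copies back
print(vram.faults)  # 1
```

Each fault here corresponds to one of those PCIe round trips that shows up as stutter in games once VRAM is oversubscribed.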

Do note that the CPU does have access to the GPU's RAM, and you could even create a RAM disk or swap space in GPU memory, but unfortunately it cannot be used as system RAM due to non-uniform, non-standard access. System RAM has physically direct connections to the CPU precisely for that purpose.

I guess this question isn't gaming related, but about large data sets, maybe for ML training/applications etc.?

Cyberpunk 2077 at launch: Hold my beer :D
 