Intel Discrete GPU DG1 launched

Intel has now launched its first discrete GPU, the DG1 (branded Iris Xe MAX), in three laptops from Acer, ASUS, and Dell. It is mostly identical to the integrated GPU found in Tiger Lake processors, with a few differences: dedicated 4GB of LPDDR4X-4266 memory on a 128-bit bus, and a slight overclock to 1650MHz compared to 1350MHz on the Tiger Lake iGPU.

Specifications:
  • 96 EUs (768 ALUs)
  • 1650MHz
  • 2.46 TFLOPS FP32 (see the quick arithmetic below)
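For reference, the peak TFLOPS figure follows directly from the shader count and clock. A quick sanity check in Python, assuming 8 FP32 ALUs per EU and 2 operations per clock per ALU (fused multiply-add), which is how these peak numbers are usually derived:

    # Peak FP32 throughput estimate for DG1 / Iris Xe MAX
    # (assumes 8 FP32 ALUs per EU and 2 ops per clock per ALU via FMA)
    eus = 96
    alus = eus * 8                       # 768 ALUs, matching the spec list above
    clock_ghz = 1.65

    peak_tflops = alus * 2 * clock_ghz / 1000
    print(f"~{peak_tflops:.2f} TFLOPS")  # ~2.53 TFLOPS at 1.65GHz
    # The 2.46 figure quoted above corresponds to roughly a 1.6GHz clock instead.
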
On the gaming front it competes with the likes of the Nvidia MX350, and Intel claims it provides similar performance.

The ace up this GPU's sleeve is in content creation applications, where Intel's workload-sharing feature, called Deep Link, kicks in. It essentially lets the integrated GPU work alongside the discrete GPU for additional performance. According to Intel, this allows the DG1 to perform 7 times faster than the Nvidia MX350, and with Hyper Encode it can encode 78% faster than an Nvidia RTX 2080 Super Max-Q.
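Deep Link relies on software seeing the Tiger Lake iGPU and the DG1 as two separate accelerators and splitting work across them. As a rough illustration only (this is not Deep Link itself, which is wired up inside Intel's own media and compute SDKs), here is a small Python sketch, assuming the pyopencl package and Intel's GPU runtime are installed, that lists both adapters on such a laptop:

    # Enumerate every GPU the OpenCL runtime exposes; on an Iris Xe MAX laptop
    # both the Tiger Lake iGPU and the DG1 should show up as separate devices.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for device in platform.get_devices(device_type=cl.device_type.GPU):
            print(f"{platform.name} -> {device.name} "
                  f"({device.max_compute_units} CUs, "
                  f"{device.global_mem_size // (1024 ** 2)} MB)")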

This looks interesting if it can deliver similar productivity benefits in other applications. Intel is planning to support software developers in taking advantage of this, so support should improve in the future. What are your thoughts on this?

Sources:
Intel’s Discrete GPU Era Begins: Intel Launches Iris Xe MAX For Entry-Level Laptops
Intel introduces the Iris Xe Max: a discrete GPU that merges with integrated GPUs
 
Good to know.

For developers like us, this will be interesting, considering TensorFlow and PyTorch are locked to NVIDIA's CUDA architecture and there is still no real resolution for AMD GPUs (in terms of compatibility with CUDA and interoperable libraries); the same goes for game developers. That puts many organizations in a difficult position, but if Intel's support improves and gives better bang for the buck for all of us, it will definitely be a win for everyone.
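To make that concrete, the typical device-selection pattern in PyTorch today looks like the minimal sketch below (assuming a stock PyTorch build): anything without a working CUDA backend, Intel and AMD GPUs included, silently falls back to the CPU.

    # Standard PyTorch device selection: only a CUDA-capable NVIDIA GPU is
    # picked up out of the box; other GPUs need vendor-specific backends.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print(f"Training would run on: {device}")

    x = torch.randn(4, 4, device=device)   # lands on the GPU only if CUDA is present
    print(x.device)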
 
Yup, couldn't agree more. Software support has to be there for it to succeed. If it does succeed, it will be interesting to have thin laptops with good battery life, thanks to the lower ~25W TDP, which could handle productivity tasks easily and should also cost significantly less than a similar setup with something like an RTX 2080 Super Max-Q.
 
This seems like a waste of time to me. This chip competes with the MX350, which has already been EOL'ed. NVIDIA now has the MX450 in the same 25W power envelope, based on Turing, which has much better efficiency. The MX450 is basically a 1650 with lower clocks and half the bus width, so in theory it should be way, way faster than this. So I wonder who they are competing with.
 
I wonder how thermally stable such a CPU + GPU hybrid design will be. Both run hot at full load even as discrete parts with their own cooling systems. This hybrid will easily touch 60°C+.
 
This is not a gaming chip per se. It is essentially marketed as a productivity chip, where it excels, and it also happens to be roughly equivalent to the MX350 in gaming. I don't have very high hopes for Intel's gaming offerings to be honest, but those are expected to come later and should be a lot better.
As for thermals, it should be equivalent to any existing Intel CPU + MX250 offering.
 
They are targeting it at content creators - YouTube video encoding and the like. That kind of encoding runs hot for 10+ minutes continuously even on normal cards. Plus, I'm worried about drivers: will bad drivers crash the entire system and make it unbootable?
 
I think temperatures will not be an issue, depending on the laptop manufacturer, as even existing setups with far more power-hungry components can easily run games for hours continuously. Drivers are anyone's guess at this point, but why do you think bad drivers would crash the entire system and make it unbootable? I don't think the drivers will be in such a state, and even if they are, they should be stable by the time of mass adoption. Let's wait for reviews of the three units that are expected to be available soon.
 
The ace up this GPU's sleeve is in content creation applications, where Intel's workload-sharing feature, called Deep Link, kicks in. It essentially lets the integrated GPU work alongside the discrete GPU for additional performance.
Uh... AMD Llano, anyone? Same tech, different name? Or am I wrong?
Also, isn't the MX350 an entry-level model? How is that impressive in any way? o_O
 
Just read the whole paragraph that you highlighted and you will see how it's impressive. Nobody said it's impressive in terms of gaming performance.

 
My point was that, with all the hype and hullabaloo, none of this is worth being mainstream news. I mean, "ace up the sleeve"? "MX350"? Really? When others have already gone and done it, why should people even fall for Intel? On top of that, none of these claims are proven yet, and Intel loves to exaggerate the performance of its products. That is all.
So nothing here is impressive or worth making news over. If anything, they seem to just want a piece of the PR pie with all the hype Zen 3 and RTX/RX are generating for themselves.
 
I don't think there is much hype surrounding this launch. There might be for the future discrete GPUs, DG2 and DG3, but this launch is not being hyped by Intel or anyone else.

Regarding the "ace up the sleeve" comment and the comparison with the MX350, I don't see how 7 times faster than the MX350 and 78% faster than an Nvidia RTX 2080 Super Max-Q in specific productivity workloads at a 25W TDP is not newsworthy. It's not like they are claiming to destroy the GPU market with this launch. And you are right, it's not worth being mainstream news, but it is technology news nonetheless that some of us might be interested in.

As for the claims being unproven and Intel exaggerating, that's all very true; they need to stay in the news somehow and could be lying through their teeth about performance...
 