Evil Avatar  



Old 10-27-2015, 11:56 AM   #1
brandonjclark
Subscriber
 
 
Join Date: Dec 2007
Posts: 10,649
AMD and Nvidia work better together in DX12


In an amazing article and DX12 test written by Ryan Smith over at Anandtech, AMD and Nvidia cards are put to use at the same time courtesy of DX12's explicit multi-adapter technology, which allows GPUs from multiple vendors to be used concurrently. It's an absolutely excellent write-up and I encourage you to read it beginning to end.
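
For the curious (this is me paraphrasing, not the article): at the API level, explicit multi-adapter just means the app enumerates every adapter in the box and creates an independent D3D12 device on each one, regardless of vendor. A rough C++ sketch of that enumeration, error handling omitted:

Code:
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create an independent D3D12 device on every hardware adapter.
// With explicit multi-adapter the app drives each device (and
// therefore each vendor's GPU) itself -- no SLI/CrossFire driver magic.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software rasterizers

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // e.g. one AMD card, one Nvidia
    }
    return devices;
}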

Now, for the amazing part....

Quote:
The most surprising thing about all of this is that the greatest gains are with the mixed GPU setups. It’s not immediately clear why this is – if there’s something more efficient about having each vendor and their drivers operating one GPU instead of two – or if AMD and NVIDIA are just more compatible than either company cares to admit. Either way this shows that even with Ashes’ basic AFR implementation, multi-adapter rendering is working and working well. Meanwhile the one outlier, as we briefly discussed before, is the dual NVIDIA setup, which just doesn’t scale quite as well.
That's right, folks. In DX12 (Ashes of the Singularity), mixed GPUs perform better than multi-GPU setups from a single vendor.
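
To picture what the article means by Ashes' "basic AFR implementation": the engine ping-pongs whole frames between the two devices and copies each finished frame over to whichever GPU owns the swap chain. A sketch of the shape of it, with hypothetical stand-in helpers (not Ashes' actual code):

Code:
#include <cstdint>

// Hypothetical stand-ins for a real engine's per-GPU state and work
// submission -- the real versions would wrap ID3D12Device, queues, etc.
struct Gpu { /* device, queues, resources */ };

void RecordAndSubmitFrame(Gpu&, uint64_t) { /* draw the frame */ }
void CopyToPrimary(Gpu& /*from*/, Gpu& /*to*/) { /* cross-adapter copy */ }
void PresentOnPrimary(Gpu&) { /* flip the swap chain */ }

// Basic alternate-frame rendering across two independent devices:
// gpu[0] owns the swap chain; gpu[1]'s finished frames are copied over.
void AfrLoop(Gpu (&gpu)[2], bool& running)
{
    for (uint64_t frame = 0; running; ++frame)
    {
        Gpu& renderer = gpu[frame % 2];        // ping-pong between GPUs
        RecordAndSubmitFrame(renderer, frame); // draw the frame there
        if (&renderer == &gpu[1])
            CopyToPrimary(gpu[1], gpu[0]);     // secondary -> primary
        PresentOnPrimary(gpu[0]);              // primary presents every frame
    }
}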

Could DX12 spell the end of the GPU Fanboy?

Last edited by brandonjclark; 10-27-2015 at 12:25 PM.. Reason: syntax
Old 10-27-2015, 12:12 PM   #2
ashikenshin
Evil Dead
 
 
Join Date: Jul 2005
Posts: 3,235
Yeah I read this the other day and it sounded awesome!
Old 10-27-2015, 12:18 PM   #3
BadIronTree
Evil Dead
 
 
Join Date: Mar 2005
Posts: 604
The sad part is that with the AMD card first and the Nvidia card second, the performance is epic... but when you put the Nvidia first and the AMD second, you get 80% less performance than a single card...

Fury X Master and 980 slave for ME
__________________
Join Star Citizen and get 5,000 UEC Free
http://robertsspaceindustries.com/en...STAR-K6CZ-XQPD
Old 10-27-2015, 12:27 PM   #4
brandonjclark
Subscriber
 
 
Join Date: Dec 2007
Posts: 10,649
Quote:
Originally Posted by BadIronTree View Post
The sad part is that with the AMD card first and the Nvidia card second, the performance is epic... but when you put the Nvidia first and the AMD second, you get 80% less performance than a single card...

Fury X Master and 980 slave for ME
I think the article was recently updated because when I read it the other day it stated that it didn't matter which one was first; they both outperformed either single-card setup.
__________________
~B$
Gamertag: legisilverback | Steam Nickname: brandonjclark
...playing Rebel Galaxy
Old 10-27-2015, 01:02 PM   #5
Rommel
International Playboy
 
 
Join Date: Feb 2005
Location: Hong Kong
Posts: 10,204
AMD is already showing better DX12 performance. Pure luck on that, but true.
Old 10-27-2015, 01:02 PM   #6
screwyluie
Evil Dead
 
 
Join Date: Dec 2005
Location: Washington
Posts: 2,423
The difference between AMD/Nvidia and Nvidia/AMD was only a couple of FPS, not 80%: http://anandtech.com/show/9740/direc...mgpu-preview/4

The shitty thing about all this is that it's entirely up to the game devs to code it in, which means support could be sporadic to nonexistent... and if that's the case, no one is going to mix GPUs unless they just happen to have one lying around and happen to own one of the few games that support it.

They need to push to make this a standard... create a dead-simple tool/suite to add the code to any game so it's a no-brainer for any dev to support it... I dunno, they need to do something, because there's little incentive for a game studio to spend time/money on it as is.
Old 10-27-2015, 01:19 PM   #7
ElektroDragon
Evil Dead
 
 
Join Date: Feb 2007
Location: Seattle area
Posts: 10,551
"AMD and Nvidia cards are put to use at the same time "

And who the hell is going to buy both and put them in the same system?

That's like voting for Donald Trump AND Bernie Sanders.
__________________
Proud to be perma-banned 5 times on NeoGAF.
Old 10-27-2015, 01:34 PM   #8
Emabulator
The Voight-Kampff
 
 
Join Date: Feb 2005
Location: The Garden State
Posts: 31,192
Blog Entries: 58
Quote:
Originally Posted by brandonjclark View Post
Could DX12 spell the end of the GPU Fanboy?
No, because Radeon will still suck!
Old 10-27-2015, 01:48 PM   #9
Suicidal ShiZuru
Mutual Hatred
 
 
Join Date: May 2005
Location: Right on the Beach
Posts: 10,155
Blog Entries: 1
The results will change, and as screwyluie said, it's really up to the devs to support this. For me this means nothing, as I want my G-Sync and will never give it up. By the time DX12 actually rolls out, things may have changed drastically.
__________________
"This game raped my wife, slaughtered my son and left me for dead."
Old 10-27-2015, 04:46 PM   #10
screwyluie
Evil Dead
 
 
Join Date: Dec 2005
Location: Washington
Posts: 2,423
Quote:
Originally Posted by ElektroDragon View Post
"AMD and Nvidia cards are put to use at the same time "

And who the hell is going to buy both and put them in the same system?

That's like voting for Donald Trump AND Bernie Sanders.
If 99% of games have this feature, why not? What if, as these tests show, one card from each vendor gives the best performance? What if you buy an Nvidia card when you build the system, but later that year there's a killer deal on an AMD card?

The most likely use is an old card. These tests show that pairing an old card (perhaps from your previous build, or an upgrade) with a new one still improves performance. I know I have a few cards lying around that I would throw in a system if it meant more performance.
Old 10-27-2015, 04:46 PM   #11
Skookum
Reaper
 
Join Date: Feb 2005
Posts: 184
Quote:
Originally Posted by Suicidal ShiZuru View Post
The results will change and as screwyluie said it's really up to the devs to support this. For me this means nothing as I want my G Sync and will never give it up. By the time DX12 actually rolls out things may have changed drastically.
Yes, that's a good point. I have a G-Sync monitor as well; if I get a Radeon card to go with my 980, will DX12 let me use it?
Old 10-27-2015, 04:51 PM   #12
screwyluie
Evil Dead
 
 
Join Date: Dec 2005
Location: Washington
Posts: 2,423
Quote:
Originally Posted by Skookum View Post
Yes, that's a good point. I have a G-Sync monitor as well; if I get a Radeon card to go with my 980, will DX12 let me use it?
That is one of the many questions I want answered.
Old 10-27-2015, 04:55 PM   #13
Suicidal ShiZuru
Mutual Hatred
 
 
Join Date: May 2005
Location: Right on the Beach
Posts: 10,155
Blog Entries: 1
Quote:
Originally Posted by screwyluie View Post
that is one of the many questions I want answered.
As of right now it's not even a question, and the answer is no. G-Sync relies not only on its hardware but also on the proprietary drivers that Nvidia has locked away. These designs can't be "hacked" any further than possibly tricking an Nvidia driver into believing a FreeSync display is G-Sync capable. The point of that would be to unlock the "software G-Sync" that some people believe is all there is to it, which from what I understand is incorrect unless Nvidia is completely lying to the world. People seem to forget that the original G-Sync release was a hardware module that you installed yourself inside the Asus VG248QE monitor.
__________________
"This game raped my wife, slaughtered my son and left me for dead."

Last edited by Suicidal ShiZuru; 10-27-2015 at 05:12 PM..
Old 10-27-2015, 05:14 PM   #14
screwyluie
Evil Dead
 
 
Join Date: Dec 2005
Location: Washington
Posts: 2,423
You're tracking the wrong path; that is not at all where this topic was going. I'm well versed in how G-Sync works and why it's better all around than FreeSync.

However, this demo clearly shows that the primary GPU matters, which means there's a possibility the primary GPU could retain all its properties, like GameWorks and G-Sync, with the AMD card as the slave.

So yeah, it is a question. I want to see someone try it.
Old 10-27-2015, 06:32 PM   #15
Suicidal ShiZuru
Mutual Hatred
 
 
Join Date: May 2005
Location: Right on the Beach
Posts: 10,155
Blog Entries: 1
It's a tech demo, and this version of multi-adapter is the worst one; it relies more heavily on drivers instead of pulling straight from the hardware, as the better versions aim to do. As I already said, things may change drastically by the time DirectX 12 is actually being used... You can't view this as a real-world representation of anything, since this version of Ashes was made specifically for this demonstration. Realistically, this method can best help low-end hardware performance by combining integrated and discrete processing power.
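
For what it's worth, D3D12 even exposes a capability bit for that integrated + discrete case; how cheaply textures can be shared across adapters decides whether the pairing is worth it. A minimal sketch, assuming a device you've already created:

Code:
#include <d3d12.h>

// Given a created device, ask whether textures shared across adapters
// can stay row-major -- the bit that makes cheap cross-GPU copies
// (e.g. integrated + discrete) practical.
bool SupportsCheapCrossAdapterSharing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &options, sizeof(options));
    return options.CrossAdapterRowMajorTextureSupported != 0;
}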
__________________
"This game raped my wife, slaughtered my son and left me for dead."
Old 10-27-2015, 08:11 PM   #16
Skookum
Reaper
 
Join Date: Feb 2005
Posts: 184
Quote:
Originally Posted by Suicidal ShiZuru View Post
You can't view this as a real world representation of anything since this version of Ashes was made specifically for this demonstration.
That's also a good point. It's not like DX12 and a cheaper Radeon are going to make Witcher 3 play any better on my system (and admittedly, it's running pretty damn well right now). This will only be good for games that are made to take advantage of the second card. Reminds me of Glide vs. OpenGL.
Old 10-27-2015, 11:23 PM   #17
sai tyrus
Renegade Paragon
 
 
Join Date: May 2011
Location: all over
Posts: 8,859
Blog Entries: 81
Glad you posted this. Was going to after this stretch of work. This seems like magic, especially if developers take advantage of it. It would be fantastic from a consumer's perspective!
__________________
Steam: sai tyrus

"I don't know half of you half as well as I should like, and I like less than half of you half as well as you deserve!" - Bilbo Baggins
Old 10-28-2015, 01:13 AM   #18
92miata
Evil Dead
 
 
Join Date: Apr 2005
Location: Prescott, AZ
Posts: 1,283
https://www.youtube.com/watch?v=xKBttQmhDBw

Tags
amd, ashes of the singularity, dx12, mixed-gpu, nvidia
