  1. #1

    Combine an ATI and an Nvidia GPU?

    WUT? Sounds amazing!
    Currently 5 Boxing 5 Protection Paladins on Whisperwind Alliance
    The Power of Five!!! ( short video )

  2. #2
    Member Ughmahedhurtz's Avatar
    Join Date
    Jul 2007
    Location
    North of The Wall, South of The Line
    Posts
    7169


    Not sure why you would want to do this.
    Now playing: WoW (Garona)

  3. #3


    Frankly, I believe MSFT feels it necessary to jerk things around because they aren't a significant player in this field (God forbid they were one). It might give them a reason to introduce a compatibility layer and make hardware vendors re-certify their gear against some sort of moving target that will be deprecated before long.

    CrossFire/SLI without the memory mirroring (the same data duplicated on each card) and with more performance sounds like BS to me.

  4. #4


    Quote Originally Posted by Ughmahedhurtz View Post
    Not sure why you would want to do this.
    Maybe I misunderstood it, but it sounds like DX12 harnesses more of the power of multiple GPUs, and ALL of the combined RAM, than the current version does. The combining of multiple types of GPUs is a side thing to me.
    Currently 5 Boxing 5 Protection Paladins on Whisperwind Alliance
    The Power of Five!!! ( short video )

  5. #5
    Member Ughmahedhurtz's Avatar
    Join Date
    Jul 2007
    Location
    North of The Wall, South of The Line
    Posts
    7169


    Quote Originally Posted by Lyonheart View Post
    Maybe I misunderstood it, but it sounds like DX12 harnesses more of the power of multiple GPUs, and ALL of the combined RAM, than the current version does. The combining of multiple types of GPUs is a side thing to me.
    I suppose it depends on how they plan to optimize things. How would you put two GPUs that support different features into the same system? Least common denominator? How would they get the timing right so you got smooth frame rates? I've been out of the video-driver business for some years, but unless they're turning GPUs into GPGPUs, I don't see how this benefits a video card manufacturer in terms of standing out from the crowd. NVidia and ATI have their own "special sauce," e.g. PhysX, EyeFinity, CUDA, etc.

    It'll be interesting to see what the point is once we have some better clues, and what NVidia and ATI have to say about it.
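    To make the least-common-denominator problem concrete, here's a rough sketch (just an illustration against the standard DXGI/D3D12 entry points, not anything from the DX12 preview material; the helper name is mine) that probes each adapter's highest feature level. In a mixed-vendor box, an engine could only count on whatever the weaker card reports:

    Code:
    // Rough sketch only (nothing from the DX12 preview docs): probe each
    // adapter's highest Direct3D feature level. In a mixed AMD/Nvidia box,
    // an engine could only rely on features at or below the *lowest* level
    // either card reports -- the least-common-denominator problem.
    #include <windows.h>
    #include <d3d12.h>
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    static D3D_FEATURE_LEVEL MaxFeatureLevel(IDXGIAdapter1* adapter)
    {
        const D3D_FEATURE_LEVEL levels[] = {
            D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
            D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        };
        for (D3D_FEATURE_LEVEL level : levels)
        {
            // Passing nullptr for the output pointer only checks whether a
            // device *could* be created at this level.
            if (SUCCEEDED(D3D12CreateDevice(adapter, level,
                                            __uuidof(ID3D12Device), nullptr)))
                return level;
        }
        return static_cast<D3D_FEATURE_LEVEL>(0);  // no D3D12 support at all
    }

    int main()
    {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"%ls -> max feature level 0x%X\n", desc.Description,
                    static_cast<unsigned>(MaxFeatureLevel(adapter.Get())));
        }
        return 0;
    }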
    Now playing: WoW (Garona)

  6. #6
    Multiboxologist MiRai's Avatar
    Join Date
    Apr 2009
    Location
    Winter Is Coming
    Posts
    6815


    Quote Originally Posted by Lyonheart View Post
    Maybe I misunderstood it, but it sounds like DX12 harnesses more of the power of multiple GPUs, and ALL of the combined RAM, than the current version does. The combining of multiple types of GPUs is a side thing to me.
    Assuming it's not just marketing hype, developers are going to have to spend their own time implementing these things; it's not something that's going to "just work" when a game ships with DX12 support (which is likely going to be years from now anyway) -- we're talking about both multi-GPU across different vendors and adding the VRAM together. Having glanced at a few threads about all of this, I came across what is perhaps the perfect quote:

    You make it sound like having a multi-GPU setup's performance limited to whichever vendor's card is slowest for a given scene, featureset limited to the intersection of card features both could support reliably, on drivers never meant to work in concert with their competitors, with a PC installed with two proprietary driver packages with separate update methods and schedules, drivers that frequently have problems with conflicts with older versions of themselves, with different .NET or other environmental requirements, on hardware that was not designed/developed/tested in the presence of a competitor that frequently has problems with differing hardware from the same vendor, with low-level differences in execution behavior, little history of cooperative implementations, multiple render paths in an engine that were never designed/coded/tested to run simultaneously, in a platform that has at best problematically supported switching between one GPU or the other in mobile, between vendors who have every interest and an ongoing history of getting in each other's way could lead to undesirable outcomes.
    Do not send me a PM if what you want to talk about isn't absolutely private.
    Ask your questions on the forum where others can also benefit from the information.

    Author of the almost unknown and heavily neglected blog: Multiboxology

  7. #7
    Member Ughmahedhurtz's Avatar
    Join Date
    Jul 2007
    Location
    North of The Wall, South of The Line
    Posts
    7169


    I was trying not to come off as just straight negative, but that post pretty much sums up what I was thinking.
    Now playing: WoW (Garona)

  8. #8
    Multiboxologist MiRai's Avatar
    Join Date
    Apr 2009
    Location
    Winter Is Coming
    Posts
    6815


    Quote Originally Posted by Ughmahedhurtz View Post
    I was trying not to come off as just straight negative, but that post pretty much sums up what I was thinking.
    Haha, I didn't mean to sound like a dick, but I find it so odd that something like this is even being marketed with DX12; it's almost as if Microsoft was just pulling ideas from a hat labeled "things that just don't make any sense and that nobody's asking for."

    "I really wish I could buy an AMD GPU to run in SLI with my current GPU," said no nVidia GPU user ever. (the reverse is also 100% true for any AMD GPU user).
    Do not send me a PM if what you want to talk about isn't absolutely private.
    Ask your questions on the forum where others can also benefit from the information.

    Author of the almost unknown and heavily neglected blog: Multiboxology

  9. #9


    For me it was less about combining an AMD and an Nvidia GPU and more about DX12 being better than the current DX at utilizing multiple GPUs, and all of their RAM. IF it can utilize all the RAM of two different GPUs, then it can do it with two of the same, right? In other words, it sounds like DX12 will be an improvement to how multi-GPU setups work. I mean, right now if you have two GPUs with 3GB of RAM each in SLI or Crossfire, you still only have 3GB in total, right?
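    To put the question another way, here's a rough sketch (just the standard DXGI adapter query, nothing DX12-specific) that lists each card's own pool; if the mirroring works the way I think it does, two 3GB cards would still behave like one 3GB pool:

    Code:
    // Rough sketch only: list each GPU's dedicated VRAM through the standard
    // DXGI adapter query. Two 3GB cards show up as two separate 3GB pools;
    // with SLI/Crossfire mirroring the same data sits in both, so the usable
    // total stays ~3GB unless an engine addresses the pools separately.
    #include <windows.h>
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"Adapter %u: %ls, %llu MB dedicated VRAM\n", i,
                    desc.Description,
                    static_cast<unsigned long long>(desc.DedicatedVideoMemory >> 20));
        }
        return 0;
    }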

    I'm not sure anyone would use two different GPUs, even if they could. It sounded unbelievable, as you guys have pointed out, but it wasn't that part that impressed me; I should have made that clear in my OP.
    Currently 5 Boxing 5 Protection Paladins on Whisperwind Alliance
    The Power of Five!!! ( short video )

  10. #10
    Multiboxologist MiRai's Avatar
    Join Date
    Apr 2009
    Location
    Winter Is Coming
    Posts
    6815


    Quote Originally Posted by Lyonheart View Post
    For me it was less about combining an AMD and an Nvidia GPU and more about DX12 being better than the current DX at utilizing multiple GPUs, and all of their RAM. IF it can utilize all the RAM of two different GPUs, then it can do it with two of the same, right?
    Right, DX12 is supposedly going to allow VRAM to be added together, but only if the game company/publisher/devs want to support it. It's not just a switch they can flip on; they're going to have to designate assets in their game to be rendered on a particular GPU. So, the way I understand it, GPU1 would render things like trees, foliage, and shadows, while GPU2 rendered character models, particle effects, etc. But what happens when you add a third GPU? Or a fourth? What happens when one GPU becomes overloaded with whatever it's assigned? Does it offload that work to the other GPU(s), or does the frame rate tank?
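    In code, I picture it looking something like this (purely a sketch of the general idea, not anything Microsoft has published; the struct and function names are just for illustration): one D3D12 device and command queue per adapter, with the engine deciding which work gets recorded on which queue, and all of the scheduling, copying, and synchronization between them left to the developer:

    Code:
    // Sketch of the "explicit multi-adapter" idea as I understand it: one
    // device + command queue per GPU, so an engine could record, say,
    // terrain/foliage/shadow work on gpus[0] and characters/particles on
    // gpus[1]. Scheduling, cross-adapter copies, and sync -- the hard part --
    // are entirely the engine's problem and are omitted here.
    #include <windows.h>
    #include <d3d12.h>
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <vector>
    #pragma comment(lib, "d3d12.lib")
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    struct GpuContext {
        ComPtr<ID3D12Device>       device;
        ComPtr<ID3D12CommandQueue> queue;
    };

    std::vector<GpuContext> CreatePerAdapterContexts()
    {
        std::vector<GpuContext> gpus;

        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return gpus;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            GpuContext gpu;
            // Skip adapters that can't do D3D12 at all (e.g. the software one).
            if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                         IID_PPV_ARGS(&gpu.device))))
                continue;

            D3D12_COMMAND_QUEUE_DESC desc = {};
            desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
            if (FAILED(gpu.device->CreateCommandQueue(&desc,
                                                      IID_PPV_ARGS(&gpu.queue))))
                continue;

            gpus.push_back(gpu);
        }
        return gpus;  // e.g. gpus[0] = "environment" GPU, gpus[1] = "characters"
    }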

    Honestly, it sounds like it could easily end up being a nightmare to implement such a system in a game like World of Warcraft. Blizzard would not only have to upgrade WoW's engine to support DX12, which is likely years away, they'd also (presumably) have to directly support SLI in their game -- something they've never done -- but who knows, maybe it's not as difficult as it looks to my non-programmer mind.

    Don't get me wrong, I'm looking forward to seeing the next version of DirectX in action, but it sorta seems like Microsoft went into panic mode when AMD released Mantle and it began to gain traction, and now everything new we've heard about DX12 over the past few weeks is these crazy, in-a-perfect-fantasy-world features that don't really matter all that much. All I want is better performance, more efficiency, and fancier graphical features -- that's it.

    Quote Originally Posted by Lyonheart View Post
    In other words, it sounds like DX12 will be an improvement to how multi-GPU setups work. I mean, right now if you have two GPUs with 3GB of RAM each in SLI or Crossfire, you still only have 3GB in total, right?
    Correct, but the 900 series GPUs brought 4GB, and the generation after that (later this year, hopefully) is going to bring at least 4GB, if not more (not counting any Titan-like GPUs) -- GPU VRAM is only going to keep going up.
    Do not send me a PM if what you want to talk about isn't absolutely private.
    Ask your questions on the forum where others can also benefit from the information.

    Author of the almost unknown and heavily neglected blog: Multiboxology
