  1. #1

Going to get one of these when they come out:

    http://wccftech.com/nvidia-geforce-t...mples-shipped/

    6GB on one card and it supposedly outperforms a 690 by a sizable margin.

    Also interested in this stuff:

    http://www.tomshardware.com/news/G.S...ntX,20901.html

    but really it depends on what kind of performance increase I would see. I can't wait for the end of the month!

  2. #2
Multiboxologist MiRai

    Quote Originally Posted by Multibocks View Post
    http://wccftech.com/nvidia-geforce-t...mples-shipped/

    6GB on one card and it supposedly outperforms a 690 by a sizable margin.
    My money is on this:
    Exact specifications of the consumer graphics solutions based on Nvidia’s most powerful GPU ever are still unclear, but Nvidia is reportedly looking forward to deliver 85% of dual-chip GeForce GTX 690 performance with the novelty.
    NVidia isn't going to release a more powerful GPU for less money because they like to milk their customers for everything they've got. Just look at that $900 price tag. =\

    I read this post the other day on the Anandtech forums and I liked it so much that I bookmarked it, so I'll share it:

    Quote Originally Posted by RussianSensation
    Quote Originally Posted by tviceman
    hopefully a Geforce-based GK110 can get that 50-60% performance improvement and stay <= 250 watts. I think it's possible.

    Still though, $900 is outrageous under any circumstance,
Ya, I agree with your entire post. People bring up comparisons to high-end headphones and other hobbies, but GPUs are totally different because they have a limited shelf life. If you drop $1-2K on high-end headphones or an amp, they don't go "obsolete" in 3 years. They will still sound stellar every single day for 10 years. You can now get an HD7850 for $150-160 that, with a 10-minute overclock, will get you GTX580 levels of performance, and the GTX580 was $500 just 2 years ago. Comparing the Titan to the GTX690 misses the point, because GTX690's level of performance should be more affordable by now. Using this logic, the GTX680 could have cost 35% more than the GTX580 because it offered 35% more performance. If NV keeps doing this, soon they'll condition PC enthusiasts to believe that a high-end GPU's normal price is $900-1000. Even cards like the GTX590/HD6990 were $700-750.

    If we expect HD8970 to be 15-20% faster than HD7970GE, GK114 (GTX780) shouldn't be far behind. I think NV locked voltage control on GK104 so that GTX780's 20% increase looks good. Also, I expect GTX780's voltage control to be locked, because then NV could charge a large premium for the Titan and the 780 won't be able to touch it. Delaying GTX780 to June or later would allow NV to sell the Titan at the highest prices to excited PC enthusiasts who are ready to spend $900-$1,000 on flagship GPUs after NV conditioned them with the $1,000 GTX690 in 2012.

    I must say I am very impressed with NV's marketing. Rumors of GTX700 being postponed, but the Titan still on track, would allow the Titan to look much faster than it is, since it'll be compared to the 680, not the 780. Interesting marketing game NV is playing. It's like a copy of Apple's handbook 101. The best part is NV managed to disguise it all by somehow making us gamers think it was AMD that raised prices. All AMD did was bring HD7970 back to ATI's historical levels. That was a steep price increase from the HD4870/4890/5870/6970 days, but not unexpected given AMD's dire financial situation, struggling to maintain 15% gross margins.

    What NV is doing is shifting price levels entirely into the stratosphere, rumoured to be selling a 550mm2 die for $900 while reporting > 50% gross margin on its earnings calls. Essentially they are asking us to absorb higher 28nm wafer costs. No thanks. GPU tech is supposed to get cheaper for a given level of performance, or faster at a similar price level. The Titan at $900 does not deliver on either, which means it's overpriced on the general price/performance tech curve. I guess I am sitting still until 20nm Maxwell / the HD9000 series gives the Titan's performance for $499. :biggrin:
    It's probably a little over the top for this forum, but I really liked what he said about NVidia's marketing.

  3. #3


    So even if it is only 85% of a 690, that's still way faster than a single 680. I'm going to see how the results turn out, but all the reports I have read say 115% of a 690. Also, the rumor is that the 7xx series is delayed yet again (yes, it could just be nVidia starting those rumors themselves to make money) and won't come out until Q4 2013. With that rumor in mind, I'd rather buy the Titan now.
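    Taking those rumored percentages at face value, here's a rough sketch of what they would mean relative to a single 680. The ~1.75x SLI scaling factor for the 690 is an assumption for games that scale well, not a measured number:

    ```python
    # Rough conversion of "% of a GTX 690" into "times a single GTX 680".
    # Assumes a 690 behaves like two 680s with ~1.75x SLI scaling in games
    # that scale well -- an assumed figure, not a benchmark result.
    SLI_SCALING = 1.75

    for pct_of_690 in (0.85, 1.15):
        vs_single_680 = pct_of_690 * SLI_SCALING
        print(f"{pct_of_690:.0%} of a 690 is roughly {vs_single_680:.2f}x a single 680")

    # 85% of a 690  -> ~1.49x a single 680
    # 115% of a 690 -> ~2.01x a single 680
    ```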

    edit: you didn't comment on the G.Skill RAM though! 2800MHz!


    editX2: Jesus

    http://wccftech.com/asus-geforce-tit...unch-imminent/
    Last edited by Multibocks : 02-07-2013 at 11:41 PM

  4. #4
Multiboxologist MiRai

    Quote Originally Posted by Multibocks View Post
    edit: you didn't comment on the Gskill RAM though! 2800 Mhz!
    Alright, well it's probably going to cost an arm and a leg and be completely unnecessary.

    To run at 2800MHz you're first going to need a motherboard that can handle it, and then you're going to have to pump a lot of voltage into the memory controller for stability. Looking at the larger screenshot, it's most likely going to require a whopping 1.65v to reach those speeds, which can damage the memory controller (the link is for Ivy Bridge, but it holds true for Sandy Bridge as well).

    As for the timings, 11-13-13-35 aren't phenomenal, but the raw speed (MHz) is going to make it fast regardless. So, let's do some quick math as per this post on the Anandtech forums:
    Let's just do the math, the frequency is expressed in Hertz, which means "cycles per second". So, the DDR3 2133 will perform 2133 cycles a second while the DDR3 1600 will do, well, 1600. You, of course, know this.

    Now the CAS latency is given in cycles. So, a CAS8 DIMM will take 8 cycles to respond and the CAS11, 11 cycles.

    Now putting it all together - the DDR3 2133 CAS11 will take 11/2133 seconds, which is equal to 0.00516 seconds, to respond, while the DDR3 1600 CAS 8 will take 8/1600, which is equal to 0.005 seconds, to respond. Thus, the 1600 DIMM is faster. For your dilemma, you're contemplating 0.00516 versus 0.00422, a timing difference of 0.00094 seconds! How fast are your reflexes and how long are you willing to wait for your memory to respond? Yes, I'm being an ass. You'll never ever see, feel or sense a difference.

    You'll also want to carefully check the timings, since the higher the MHz the looser they are. I've seen 2133 DIMMs with 11-14-28-30, simply junk. As for voltage, avoid anything higher than 1.5V. The only way many manufacturers get a respectable CAS at high MHz is to crank the voltage to 1.65V, which is why the 1.35V Samsung DIMMs are so smashing. Also, fewer DIMMs generally means less load on the memory controller, so 2x8GB is better than 4x4GB.
    Applying that to this kit: 11/2800 = 0.00393 seconds

    Looking through some 32GB RAM kits on Newegg there isn't anything as fast (and that was expected because nothing comes close to 2800MHz), but while staying within the voltage standard of 1.5v, you can find these which are 7-8-8-24 @ 1600MHz (and have an access time of 0.00437 seconds (7/1600)) or these which are 9-9-9-24 @ 1866MHz (and have an access time of 0.00482 seconds (9/1866)). Both of these kits I've listed will work on 90% of motherboards on the market today without any funny business, while that new G.Skill kit may not.

    So, the 2800MHz kit will be 0.00045 "seconds" faster than the 1600MHz RAM which safely stays within Intel's specifications (and since these figures come from dividing cycles by MHz, they're really microseconds, so the real-world gap is roughly half a nanosecond). Are you going to notice that difference outside of synthetic benchmarks? Nope. Are you willing to shorten the lifespan of your CPU by pumping 1.65v into the memory controller for a fraction-of-a-nanosecond improvement in access time?
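    Here's a minimal sketch of that same cycles-divided-by-data-rate arithmetic, using the kits mentioned in this thread. Note that DDR3's actual clock is half the rated MT/s, so absolute latencies are about double these figures, but the ratios between kits (which is all that matters here) are unchanged:

    ```python
    # First-word access latency comparison using the same convention as the
    # quoted post: CAS cycles divided by the rated data rate. Dividing cycles
    # by a rate in MT/s (millions per second) yields microseconds, hence the
    # x1000 to display nanoseconds.
    kits = {
        "DDR3-2800 CL11": (11, 2800),
        "DDR3-1600 CL7": (7, 1600),
        "DDR3-1866 CL9": (9, 1866),
    }

    for name, (cas_cycles, rate_mts) in kits.items():
        latency_ns = cas_cycles / rate_mts * 1000  # microseconds -> nanoseconds
        print(f"{name}: ~{latency_ns:.2f} ns")

    # DDR3-2800 CL11: ~3.93 ns
    # DDR3-1600 CL7:  ~4.38 ns
    # DDR3-1866 CL9:  ~4.82 ns
    # The 2800MHz kit's edge over the 1600MHz CL7 kit is ~0.45 ns.
    ```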

    I'll close with a quote from a DDR3 memory scaling article on Anandtech:
    The results weren't very stimulating, were they? Just as expected, gaming with faster memory just doesn't make any notable difference. I could have potentially lowered the resolution and settings in an attempt to produce some sort of difference, but I felt that testing these games at the settings they're most likely to be played at was far more enlightening. If you want better gaming performance, the GPU is the best component to upgrade—no news there.
    EDIT: To add, here's a list on Newegg of the 32GB memory kits running at 2400MHz (all 1.65v). Looking at those prices, I would assume that G.Skill's new kit would cost close to $400 (if not more).
    Last edited by MiRai : 02-08-2013 at 08:31 AM

  5. #5


    That's really weird. I swear I saw you make a statement that faster RAM really helped out us boxers.

  6. #6
Multiboxologist MiRai

    Quote Originally Posted by Multibocks View Post
    That's really weird. I swear I saw you make a statement that faster RAM really helped out us boxers.
    Hmmm... I think the most recent statement I made about RAM speeds was that the faster the RAM is, the faster a RAM drive will perform.

    Quote Originally Posted by MiRai
    On another note, RAM frequency controls how fast your RAM drive is although I'm not sure the price you pay for higher frequency RAM is really worth the speed increase it brings:
    http://www.dual-boxing.com/threads/4...l=1#post367552
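    To illustrate why frequency is what matters for a RAM drive, here's a back-of-envelope sketch of theoretical peak bandwidth, which scales linearly with the data rate (dual-channel DDR3 assumed; a real RAM drive will land well below these numbers due to software overhead):

    ```python
    # Theoretical peak memory bandwidth: data rate (MT/s) x 8 bytes per
    # transfer per channel x number of channels. A RAM drive can't reach
    # this in practice, but its throughput scales roughly with it.
    CHANNELS = 2            # assumed dual-channel setup
    BYTES_PER_TRANSFER = 8  # 64-bit memory channel

    for rate_mts in (1600, 1866, 2400, 2800):
        peak_gb_s = rate_mts * 1e6 * BYTES_PER_TRANSFER * CHANNELS / 1e9
        print(f"DDR3-{rate_mts}: ~{peak_gb_s:.1f} GB/s theoretical peak")

    # DDR3-1600: ~25.6 GB/s ... DDR3-2800: ~44.8 GB/s
    ```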

  7. #7


    Ah ok that makes sense.

  8. #8

    ugh.

    AMD confirmed that the 8 series won't be out until Q4 at the earliest. Wonder if NVIDIA will let their launch slide as well.

  9. #9

    so.

    Exactly as you predicted, the Titan is about 90% of a 690. We will see for sure once reviews are released, but judging by the specs, that looks right. The question now is: if a Titan is 40% faster than a 680, will that translate well for us boxers, or will I just be CPU-limited anyway?
