These forums have been archived and are now read-only.

The new forums are live and can be found at https://forums.eveonline.com/

EVE General Discussion

 
  • Topic is locked indefinitely.
 

Incarna Video Card Performance Fix... for some at least!

Author
neamiah
The Scope
Gallente Federation
#1 - 2011-11-18 03:58:51 UTC  |  Edited by: neamiah
After an hour of searching forums and blogs about the noisy fans and hot video cards since Incarna's incarnation, I tooled around with every graphics setting on the menu, one by one. The only thing that brought the GPU fans down to a tolerable dB level was changing the Present Interval to Default. Anything lower would seem to drag.

I love how fast the frame rates are when using the mouse to change views in station (spinning) or in space during combat or mining ops. But Default is certainly tolerable, and it totally changed the noise and temp within about 30 seconds of changing the setting (it took about a minute for the fan noise to level out at about 40%).

I run the game at 1920x1200 on each monitor when running two clients, or at 3840x1200 across both monitors using EVEMover with a single client. It's a pain to get set up, but it works great and is worth the trouble. Just glad to be playing that way in a quieter room!

So here's my rig spec - minus the fluff - for reference:

  • Asus P6T Deluxe V2
  • Windows 7 Ultimate x64
  • Core i7 920 (OC @ 3.3GHz)
  • 12GB Corsair XMS3 @ 1600MHz
  • NVIDIA GeForce GTX 470 2x SLI
  • 120GB SSD RAID 0
  • Corsair Graphite Series 600T Case

Please ping back and tell me the results - better yet, post if you have another solution!!
    neamiah
    The Scope
    Gallente Federation
    #2 - 2011-11-18 04:23:41 UTC
    I realized that this should be in the Issues section. Feel free to remove this post if necessary.
    supersexysucker
    Uber Awesome Fantastico Awesomeness Group
    #3 - 2011-11-18 04:39:56 UTC
    It's called a custom fan profile, or better cooling. I have a 470... she never gets loud... and I have my fan profile set to keep it cooler than stock.

    I mean, of course capping the FPS is gonna lower the GPU usage and so the heat....
    Kietay Ayari
    Caldari State
    #4 - 2011-11-18 05:35:26 UTC
    Get an air conditioner for the room you are in

    Ferox #1

    Denidil
    Cascades Mountain Operatives
    #5 - 2011-11-18 05:52:25 UTC
    Stop buying silicon from companies that have been known to have severe TDP-management issues for more than a decade.

    Tedium and difficulty are not the same thing, if you don't realize this then STFU about game design.

    Hungry Eyes
    Imperial Academy
    Amarr Empire
    #6 - 2011-11-18 06:55:32 UTC
    neamiah wrote:
    After an hour of searching forums and blogs about the noisy fans and hot video cards since Incarna's incarnation, I tooled around with every graphics setting on the menu, one by one. The only thing that brought the GPU fans down to a tolerable dB level was changing the Present Interval to Default. Anything lower would seem to drag.

    Yea, I was happy when I figured this out too. Interval Immediate simply made my card overheat - capping your frames prevents overheating, at the expense of some really sweet fps (150-200).
    neamiah
    The Scope
    Gallente Federation
    #7 - 2011-11-18 16:38:44 UTC  |  Edited by: neamiah
    Hungry Eyes wrote:
    neamiah wrote:
    After an hour of searching forums and blogs about the noisy fans and hot video cards since Incarna's incarnation, I tooled around with every graphics setting on the menu, one by one. The only thing that brought the GPU fans down to a tolerable dB level was changing the Present Interval to Default. Anything lower would seem to drag.

    Yea, I was happy when I figured this out too. Interval Immediate simply made my card overheat - capping your frames prevents overheating, at the expense of some really sweet fps (150-200).



    I find it interesting that maybe one out of 4 or 5 replies in these forums is even useful. Go troll the Bieber forums where you belong... buncha nubbz!!

    Thanks at least for the intelligent reply, HE.

    The kids out there need to understand that just because "their" junk works OK in a given situation, it doesn't mean they have the solution to everything. Especially when they say stupid things like "get an air conditioner for the room". I wish I were 16 again... or 25 for that matter...
    Solstice Project
    Sebiestor Tribe
    Minmatar Republic
    #8 - 2011-11-18 16:51:16 UTC
    Just here to clarify, if somebody is interested.

    It's called vertical synchronization, and it's actually a relic from old CRT displays.

    Basically, the electron beam built the screen from the top left to the bottom right, going line by line ... and at the last line, the last pixel, it shifted back up to the top left.

    Waiting for this event to happen and then drawing the frame prevented tearing and some artifacts.

    Nowadays it's just "the moment between two frames", but it can be used for the same effect.

    You're not actually capping your framerate - you're synchronizing it with your display.


    JSYK.
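The practical effect described above - whether it comes from vsync or from a plain software cap - is that the render loop sleeps instead of spinning flat out, which is why the GPU cools down. A minimal frame-limiter sketch in Python (illustrative only; the client's Present Interval setting does this on the driver side, and the 60 fps target here is an assumption):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_frames(n_frames, render):
    """Render n_frames, sleeping away whatever is left of each frame's budget."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Without this sleep the loop redraws as fast as the hardware
            # allows (the "Interval Immediate" behaviour) and runs hot.
            time.sleep(FRAME_BUDGET - elapsed)

t0 = time.perf_counter()
run_frames(30, render=lambda: None)  # trivial stand-in for a real render call
wall = time.perf_counter() - t0      # ~0.5 s: 30 frames at 60 fps
```

The sleep is where the saved GPU (and CPU) time goes; with an uncapped loop, `elapsed` is all there is and the card works continuously.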
    Solstice Project
    Sebiestor Tribe
    Minmatar Republic
    #9 - 2011-11-18 16:59:01 UTC

    Another thing you can try is setting the CPU affinity of exefile.exe in your task manager to one CPU only ... best to use CPU 0, because on a hyperthreaded system setting it to one of the virtual CPUs gives a bad result. ^^

    EVE isn't really good at multithreading (read: none), and I've noticed an actual slowdown when NOT setting the affinity ... Windows task management has its issues here.

    So this puts a bit less strain on your CPU and results in slightly better performance, but I don't know if you'd notice anyway.
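On Windows the affinity change above is done through Task Manager's "Set affinity" dialog (or at launch via `start /affinity`). Just as a sketch of the same idea done programmatically, here is the Linux-only standard-library version (`os.sched_setaffinity` does not exist on Windows; there you would reach for something like psutil instead):

```python
import os

def pin_to_one_cpu():
    """Pin the current process to a single CPU - the lowest-numbered one
    it is currently allowed to run on (the 'CPU 0' advice above)."""
    allowed = os.sched_getaffinity(0)   # set of CPUs available to this process
    target = min(allowed)
    os.sched_setaffinity(0, {target})   # restrict scheduling to that one CPU
    return target

cpu = pin_to_one_cpu()
```

After the call, the scheduler will keep the process on that one core, which is the same effect as ticking a single box in Task Manager.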

    Btw ... your "default" sounds like you have vsync activated in your gfx settings; I believe the same result can be achieved by setting the interval to One? Just saying, in case something changes one day.

    What you can ALSO do is, in your gfx settings, use custom settings and override the application ones ... and then set them to your specific needs, like a cheap AA mode or something. At your resolution, for example, I believe 4x is already sufficient, maybe even 2x ... still much less work to do.

    Please note that you have to have AA enabled in the escape menu in EVE even when you override application settings in your gfx driver, or else it won't work at all or will give you borked results.

    Hope it helps. Turning down AA would be my number one option, considering your resolution.

    Lord Ryan
    True Xero
    #10 - 2011-11-18 17:03:59 UTC
    Kietay Ayari wrote:
    Get an air conditioner for the room you are in


    ^My solution^

    I have central air, but the office is on the 2nd floor with a large window facing the sun most of the day. I bought a window unit, but after a couple of weeks I got a letter from the homeowners association telling me to remove it. So I bought this. Around $300.

    *Disclaimer* Not current setup.

    Do not assume anything above this line was typed by me. Nerf the Truth, it's inconvenient.

    Opertone
    State War Academy
    Caldari State
    #11 - 2011-11-18 17:06:08 UTC
    Dump Nvidia, mkay? Regardless of the advertising, their cards are still crap.

    Or keep on, and play on Xbox and PlayStation - they have 'marvelous' video chips. Next-gen Tetris handhelds.

    Fan issues - poor card design, bad casing, bad overall cooling, bad drivers, motherboard conflicts. I've been through all of that myself. My Nvidia monster card back in the day had a useless fan: it got stuffed with dust, the dust burnt, the fan failed, and the card failed and short-circuited. The card design is to blame - but hey, at least you get to buy new ones more often!

    This post sums up why the 'best' work with DCM inc.

    WARP DRIVE makes eve boring

    really - add warping align time 300% on gun aggression and eve becomes great again

    Lord Ryan
    True Xero
    #12 - 2011-11-18 17:06:59 UTC
    Lord Ryan wrote:
    Kietay Ayari wrote:
    Get an air conditioner for the room you are in


    ^My solution^

    I have central air, but the office is on the 2nd floor with a large window facing the sun most of the day. I bought a window unit, but after a couple of weeks I got a letter from the homeowners association telling me to remove it. So I bought this. Around $300.

    *Disclaimer* Not current setup.



    *Edit* Search for "LG Electronics 9,000 BTU Portable Air Conditioner with Dehumidifier and Remote", since spacebook doesn't like links.


    Opertone
    State War Academy
    Caldari State
    #13 - 2011-11-18 17:12:49 UTC
    Lord Ryan wrote:
    Kietay Ayari wrote:
    Get an air conditioner for the room you are in


    ^My solution^

    I have central air, but the office is on the 2nd floor with a large window facing the sun most of the day. I bought a window unit, but after a couple of weeks I got a letter from the homeowners association telling me to remove it. So I bought this. Around $300.

    *Disclaimer* Not current setup.



    Sweet setup - it feels like it's dedicated to gaming.


    RaTTuS
    BIG
    #14 - 2011-11-18 17:17:58 UTC
    Syncing the frame rate is the way to go - EVE does not need >60 fps; it's limited not by your twitch reflexes but by the server.
    I've had great success setting the CPU affinity to the last CPU in the system.
    Nvidia make great GPUs for gamers.
    ATI make great GPUs for other uses.
    YMMV

    http://eveboard.com/ub/419190933-134.png http://i.imgur.com/kYLoKrM.png

    Denidil
    Cascades Mountain Operatives
    #15 - 2011-11-18 17:49:39 UTC  |  Edited by: Denidil
    RaTTuS wrote:
    Syncing the frame rate is the way to go - EVE does not need >60 fps; it's limited not by your twitch reflexes but by the server.
    I've had great success setting the CPU affinity to the last CPU in the system.
    Nvidia make great GPUs for gamers.
    ATI make great GPUs for other uses.
    YMMV


    As someone who has written graphics engines before... LOL

    nVidia silicon is utter and complete garbage. I got sick and tired of coding around their inability to properly implement OpenGL/Direct3D in software, or various algorithms correctly in silicon - not to mention their complete inability to manage the thermal design of their cards. They've NEVER known their posterior from a hole in the ground when it comes to TDP management.

    ATI cards, on the other hand, have had a pure horsepower advantage since the tail end of the Rage series, and have always properly managed their thermal output (with one notable exception: the HD 2xxx series, which they admitted straight out they f'ed up). Now, the reason you see nvidia cards sporting 3-5 more FPS is that ATI spends the extra computing power on image quality at default settings instead of "MOAR FPS! BETTAR FPS!".*


    Let's go with the highest single-chip designs from the last few generations, at factory clock rates.

    I'm clustering them by generation, backwards from current:

    Radeon HD6970 - 2703 GFLOPS
    GeForce GTX 580 - 1581.1 GFLOPS

    Radeon HD5870 - 2720 GFLOPS (some of that processing power is wasted by the VLIW5 design)
    GeForce GTX 480 - 1088.64 GFLOPS

    Radeon HD4890 - 1360 GFLOPS (VLIW5 again)
    GeForce GTX 285 - 1062.720 GFLOPS


    The new Radeon HD7970 is rumored to have 4096 GFLOPS of processing power at only 190W TDP - that's 21.5 GFLOPS/watt: double the 6970's efficiency, or 4x the thermal efficiency of the GTX 580.
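A quick sanity check of those perf-per-watt numbers in Python. The GFLOPS figures and the 190 W HD 7970 TDP are from the list above; the ~250 W and ~244 W TDPs assumed for the HD 6970 and GTX 580 are the commonly cited board ratings, not from this post:

```python
# GFLOPS per watt for the cards discussed above.
cards = {
    "Radeon HD 7970 (rumored)": (4096.0, 190.0),
    "Radeon HD 6970": (2703.0, 250.0),   # assumed 250 W board TDP
    "GeForce GTX 580": (1581.1, 244.0),  # assumed 244 W board TDP
}

def gflops_per_watt(gflops, tdp_watts):
    return gflops / tdp_watts

for name, (gflops, tdp) in cards.items():
    print(f"{name}: {gflops_per_watt(gflops, tdp):.1f} GFLOPS/W")
```

That gives roughly 21.6, 10.8 and 6.5 GFLOPS/W - consistent with "double the 6970's efficiency", though under these assumed TDPs the gap to the GTX 580 works out closer to 3.3x than a full 4x.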


    [edit]
    * I would point out that past a certain point, differences in FPS matter less to actual perception (i.e. it's hard for the eye to distinguish between 24 and 60 fps - it can, with difficulty - and beyond 60 it really cannot). If your physics engine uses FPS as its tick rate, then you can get some difference from that, because physics engines act like a Riemann sum with FPS = n. The human eye is incredibly good at picking out errors in arcs - about 1 arcsecond of error perception - but very few things in most games will trigger that.
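The Riemann-sum point can be sketched with a toy fixed-timestep integrator - a generic Python illustration, not anything from EVE's actual engine. The higher the tick rate (the "FPS" n), the closer the simulated trajectory gets to the exact one:

```python
def integrate_fall(total_time, n_steps, g=9.81):
    """Explicit-Euler free fall: a left Riemann sum over velocity,
    with n_steps playing the role of the physics tick rate."""
    dt = total_time / n_steps
    pos, vel = 0.0, 0.0
    for _ in range(n_steps):
        pos += vel * dt   # velocity sampled at the start of each step
        vel += g * dt
    return pos

exact = 0.5 * 9.81 * 2.0 ** 2        # closed form: 19.62 m fallen after 2 s
coarse = integrate_fall(2.0, 24)     # 24-tick "physics fps"
fine = integrate_fall(2.0, 240)      # 240-tick "physics fps"
```

The 240-tick run lands within about half a percent of the exact answer, while the 24-tick run is off by several percent - the same convergence behaviour a Riemann sum shows as n grows.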


    Solstice Project
    Sebiestor Tribe
    Minmatar Republic
    #16 - 2011-11-18 17:52:53 UTC
    RaTTuS wrote:
    I've had great success at setting the affinity of the CPU to the last one in the system



    Thanks for letting me know.

    Sad part is, it has to be done anew each and every time, but I believe there are programs that do that automagically too.
    Solstice Project
    Sebiestor Tribe
    Minmatar Republic
    #17 - 2011-11-18 17:54:16 UTC
    Denidil wrote:
    Radeon HD6970 - 2703 GFLOPS
    Geforce GTX 580 - 1581.1 GFLOPS

    Radeon HD5870 - 2720 GFLOPS (some of that processing power wasted in VLIW5 design)
    GeForce GTX 480 - 1088.64 GFLOPS

    Radeon HD4890 - 1360 GFLOPS (VLIW5 again)
    Geforce GTX 285 - 1062.720 GFLOPS


    The new Radeon HD7970 is rumored to have 4096 GFLOPS of processing power at only 190W TDP - that's 21.5 GFLOPS/watt: double the 6970's efficiency, or 4x the thermal efficiency of the GTX 580.


    No offense, but ... ATI fanboy spotted. ^^
    Denidil
    Cascades Mountain Operatives
    #18 - 2011-11-18 17:56:41 UTC
    Solstice Project wrote:


    No offense, but ... ATI fanboy spotted. ^^


    You mean "software engineer who has worked on games spotted" - one who hasn't accepted money from nVidia, anyway.


    Grukni
    Caldari Provisions
    Caldari State
    #19 - 2011-11-18 18:11:25 UTC
    Denidil wrote:
    RaTTuS wrote:
    Syncing the frame rate is the way to go - EVE does not need >60 fps; it's limited not by your twitch reflexes but by the server.
    I've had great success setting the CPU affinity to the last CPU in the system.
    Nvidia make great GPUs for gamers.
    ATI make great GPUs for other uses.
    YMMV


    As someone who has written graphics engines before... LOL

    nVidia silicon is utter and complete garbage. I got sick and tired of coding around their inability to properly implement OpenGL/Direct3D in software, or various algorithms correctly in silicon - not to mention their complete inability to manage the thermal design of their cards. They've NEVER known their posterior from a hole in the ground when it comes to TDP management.

    ATI cards, on the other hand, have had a pure horsepower advantage since the tail end of the Rage series, and have always properly managed their thermal output (with one notable exception: the HD 2xxx series, which they admitted straight out they f'ed up). Now, the reason you see nvidia cards sporting 3-5 more FPS is that ATI spends the extra computing power on image quality at default settings instead of "MOAR FPS! BETTAR FPS!".*


    Let's go with the highest single-chip designs from the last few generations, at factory clock rates.

    I'm clustering them by generation, backwards from current:

    Radeon HD6970 - 2703 GFLOPS
    GeForce GTX 580 - 1581.1 GFLOPS

    Radeon HD5870 - 2720 GFLOPS (some of that processing power is wasted by the VLIW5 design)
    GeForce GTX 480 - 1088.64 GFLOPS

    Radeon HD4890 - 1360 GFLOPS (VLIW5 again)
    GeForce GTX 285 - 1062.720 GFLOPS


    The new Radeon HD7970 is rumored to have 4096 GFLOPS of processing power at only 190W TDP - that's 21.5 GFLOPS/watt: double the 6970's efficiency, or 4x the thermal efficiency of the GTX 580.


    [edit]
    * I would point out that past a certain point, differences in FPS matter less to actual perception (i.e. it's hard for the eye to distinguish between 24 and 60 fps - it can, with difficulty - and beyond 60 it really cannot). If your physics engine uses FPS as its tick rate, then you can get some difference from that, because physics engines act like a Riemann sum with FPS = n. The human eye is incredibly good at picking out errors in arcs - about 1 arcsecond of error perception - but very few things in most games will trigger that.


    Too bad CCP codes for Nvidia. They even implement things that don't work on ATI cards.
    Karadion
    Caldari Provisions
    Caldari State
    #20 - 2011-11-18 18:14:20 UTC
    Here's a simple fix.

    Turn that crap off.