These forums have been archived and are now read-only.

The new forums are live and can be found at https://forums.eveonline.com/

EVE General Discussion

 
  • Topic is locked indefinitely.
 

Incarna Video Card Performance Fix... for some at least!

Denidil
Cascades Mountain Operatives
#21 - 2011-11-18 18:25:31 UTC
Grukni wrote:


Too bad CCP codes for Nvidia. They even implement things that don't work on ATI cards.


Um, they still use the standard APIs, even if they've accepted nVidia bribes. What have they coded that doesn't work on both?

Tedium and difficulty are not the same thing, if you don't realize this then STFU about game design.

neamiah
The Scope
Gallente Federation
#22 - 2011-11-18 19:32:49 UTC
Solstice Project wrote:

Another thing you can try is setting the CPU affinity of exefile.exe in your task manager
to one CPU only ... best to use CPU 0, because in a hyperthreaded system setting it
to one of the virtual CPUs gives a bad result. ^^

EvE isn't really good at multithreading (read: none), and I've noticed an actual
slowdown when NOT setting the affinity ... Windows task management has its issues here.

So this puts a bit less strain on your CPU and results in slightly better performance,
but I don't know if you'd notice anyway.
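For what it's worth, the task-manager trick above boils down to setting a process affinity bitmask. A minimal sketch of the mask math (on Windows the mask would actually be applied via something like `SetProcessAffinityMask` through ctypes; the CPU indices below are illustrative):

```python
# Sketch of the affinity bitmask the task-manager trick sets under the hood.
# On Windows it would be applied roughly like:
#   ctypes.windll.kernel32.SetProcessAffinityMask(process_handle, mask)
# Here we only build the mask itself.

def affinity_mask(cpus):
    """Build a bitmask pinning a process to the given logical CPU indices."""
    mask = 0
    for cpu in cpus:
        mask |= 1 << cpu
    return mask

# Pinning EVE's exefile.exe to CPU 0 only, as suggested above:
print(hex(affinity_mask([0])))          # -> 0x1
# On many hyperthreaded systems the physical cores map to the
# even-numbered logical CPUs, so a mask avoiding the virtual ones:
print(hex(affinity_mask([0, 2, 4, 6])))  # -> 0x55
```

Each set bit allows the scheduler to run the process on that logical CPU; a mask of `0x1` is exactly the "CPU 0 only" setting from the task manager.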

btw ... your "default" sounds like you have vsync activated in your gfx settings;
I believe the same result can be achieved by setting the presentation interval to one?
Just saying, in case something changes one day.

What you ALSO can do, in your gfx settings, is use custom settings to override the application ones ...
and then set them to specific needs, like a specific low-cost AA or something.
At your resolution, for example, I believe 4x is already sufficient, maybe even 2x ... still much less work to do.

Please note that you have to have AA enabled in EvE's escape menu,
even when you override application settings in your gfx driver,
or else it won't work at all or will give you borked results.

Hope it helps. Turning down AA would be my number one option,
considering your resolution.



All good thoughts above.... I will make some adjustments based on your recommendations. I looked at the global nVidia settings, and the setting specific to EvE and vsync is not forced. I actually thought that I may have forced it because I used to in the past. I went ahead and set the global setting to default as well, just for troubleshooting for now.

Just to clear some things up here, though I do live in FL, the room temp is around 72F and the front of the computer sits right under the vent. This is the only game that seems to cause such a problem with GPU heat. Crysis 2 engine causes the fans to kick up about 60%. I know it's a different engine but still... since Incarna, some of us with newer systems seem to have a combination of hardware that the EvE graphics engine is not happy with.

This has nothing to do with nVidia vs. ATi/AMD... period. Developing at this level is a whole different ball game, and you can't please every single hardware combination/configuration. I have been building and configuring servers, workstations, and gaming PCs for almost 12 years, and this is just part of the game.

So with that said, thanks to those who are offering a solution to the computing problem here.
Opertone
State War Academy
Caldari State
#23 - 2011-11-18 19:45:41 UTC
Nuff said

Article article

ATI can support 3 displays, and more; Nvidia can not do it. Weaker, hotter, less power optimization. More hype and ads.

This post sums up why the 'best' work with DCM inc.

WARP DRIVE makes eve boring

really - add warping align time 300% on gun aggression and eve becomes great again

Opertone
State War Academy
Caldari State
#24 - 2011-11-18 19:48:09 UTC
neamiah wrote:


This has nothing to do with nVidia vs. ATi/AMD... period. Developing at this level is a whole different ball game, and you can't please every single hardware combination/configuration. I have been building and configuring servers, workstations, and gaming PCs for almost 12 years, and this is just part of the game.



No. You silly bunny. It's the card design. And your poor cooling. Don't object to the laws of reality with 'I am an ex-marine, I'm the karate kid, I build computers'.


Razin
The Scope
#25 - 2011-11-18 19:52:16 UTC  |  Edited by: Razin
Denidil wrote:


I'm clustering them by generation - backwards from current

Radeon HD6970 - 2703 GFLOPS
Geforce GTX 580 - 1581.1 GFLOPS


That's nice. Could you link a real world benchmark (i.e. a game) that shows how important those numbers are?
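Razin has a point: the numbers Denidil quotes are theoretical peaks from a spec-sheet formula, not game results. A quick sketch of where such figures come from (shader counts and clocks assumed from public spec sheets, not from this thread):

```python
def peak_gflops(shader_units, clock_ghz, flops_per_cycle=2):
    """Theoretical single-precision peak: units x clock x FLOPs/cycle.
    FLOPs/cycle is 2 because each unit can issue a fused multiply-add
    (one multiply plus one add) per clock."""
    return shader_units * clock_ghz * flops_per_cycle

# Radeon HD 6970: 1536 shaders at 880 MHz -> ~2703 GFLOPS
print(round(peak_gflops(1536, 0.880)))
# GeForce GTX 580: 512 CUDA cores at a 1544 MHz shader clock -> ~1581 GFLOPS
print(round(peak_gflops(512, 1.544)))
```

Peak FLOPS only bounds what the shader ALUs could do if perfectly fed; memory bandwidth, drivers, and the game engine decide actual frame rates, which is why the two architectures trade blows in real benchmarks despite the gap here.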
neamiah
The Scope
Gallente Federation
#26 - 2011-11-18 20:09:57 UTC
Opertone wrote:
neamiah wrote:


This has nothing to do with nVidia vs. ATi/AMD... period. Developing at this level is a whole different ball game, and you can't please every single hardware combination/configuration. I have been building and configuring servers, workstations, and gaming PCs for almost 12 years, and this is just part of the game.



No. You silly bunny. It's the card design. And your poor cooling. Don't object to the laws of reality with 'I am an ex-marine, I'm the karate kid, I build computers'.



Yet another fanboy goof with creative put-downs! Good grief... at least it's not as bad as WoW players.
Opertone
State War Academy
Caldari State
#27 - 2011-11-18 21:22:13 UTC
I've owned 4 ATI cards and 3 nVidia cards.

2 nVidia cards burnt out - can't add anything to that.

Go ride the HYPE train and pay more for the BAND wagon front seat.

Heat issues - cooling. Fan too loud - crappy fan :P, not enough cooling.


neamiah
The Scope
Gallente Federation
#28 - 2011-11-18 22:56:29 UTC
Opertone wrote:
I've owned 4 ATI cards and 3 nVidia cards.

2 nVidia cards burnt out - can't add anything to that.

Go ride the HYPE train and pay more for the BAND wagon front seat.

Heat issues - cooling. Fan too loud - crappy fan :P, not enough cooling.



Thanks for the free bump! :D
Vagilicious
Center for Advanced Studies
Gallente Federation
#29 - 2011-11-18 23:05:43 UTC
Karadion wrote:
Here's a simple fix.

Turn that crap off.


*yawn*

Go back to last month, troll. You and your ilk have had their say. I'm tired of hearing it.
Krixtal Icefluxor
INLAND EMPIRE Galactic
#30 - 2011-11-18 23:05:49 UTC  |  Edited by: Krixtal Icefluxor
I am NOT going to comment on any of the technical aspects of what is going on here.

It is fantastic that CCP is actually 'fixing' parts of the 1,000 Papercuts, and all that, but now the technical side is being UTTERLY ignored.

YES ... HEAT is a HUGE issue!

Work on it CCP. For God's sake PLEASE.

It's STUPID!! And unprofessional.

Learn to code....................

"He has mounted his hind-legs, and blown crass vapidities through the bowel of his neck."  - Ambrose Bierce on Oscar Wilde's Lecture in San Francisco 1882

Zagdul
Federal Navy Academy
Gallente Federation
#31 - 2011-11-19 07:50:42 UTC  |  Edited by: Zagdul
Opertone wrote:
Nuff said

Article article

ATI can support 3 displays, and more; Nvidia can not do it. Weaker, hotter, less power optimization. More hype and ads.



uh...

that article says clearly:
Quote:
"Neither card can boast overwhelming advantage over the other. "



In the conclusion the reviewer further goes on to say:
Quote:

"It looks like neither of the two new dual-processor graphics cards can claim absolute superiority in games. The Radeon HD 6990 and the GeForce GTX 590 won the same number of my tests and delivered the same performance in the rest of them. We have this equilibrium for two simple but important reasons."


Aaannnd

Quote:
Comparing the new products from other aspects, the GeForce GTX 590 looks somewhat preferable to the Radeon HD 6990 because it is smaller, has a lower temperature of the GPUs (by 10-12°C) and a quieter cooler. But “quieter” doesn’t mean that it’s really quiet because each of these cards is rather noisy in 3D mode. I could not measure the power consumption of the cards for this review but this issue will be covered in our upcoming articles. Stay tuned.


Same price for the two cards; the ATI one takes up more room, so if you get it, make sure it fits. Not to mention that the ATI card requires a larger PSU to run properly.

That's the difference bud.

Actually, if you read it, nVidia clocked the new GPU down in order to help with heat. If you were to throw some water cooling on that thing and clock it up to 580 speeds, it'd rip that AMD card apart.

Just sayin': nVidia played it safe and is still competing.

Dual Pane idea: Click!

CCP Please Implement

Kietay Ayari
Caldari State
#32 - 2011-11-19 08:13:37 UTC
Video cards do not break themselves. Either it is a bad card or you do not take care of your computer. A game cannot tell a video card to do something that will break it. It is just impossible. D: please don't start a fire.

No possible way this can be CCP's fault :D

As for the lag, I am not sure what people are talking about :| I was using a 9800 GTX before and could run CQ fine. That card is like 3 years old now, so o_O

Ferox #1

yer mammy
Federal Defense Union
Gallente Federation
#33 - 2011-11-19 08:48:04 UTC
Sadly, whatever advantage ATI cards have in hardware is always held back by their horrible software.
Zagdul
Federal Navy Academy
Gallente Federation
#34 - 2011-11-19 09:58:05 UTC
yer mammy wrote:
sadly whatever advantage ati cards have in hardware is always held back by their horrible software.


These days it doesn't matter much anyway.

AMD/Intel/ATI/Nvidia - you can get a comparable CPU/GPU from either that does the same thing. Intel is usually stronger per core; AMD is usually better at multitasking. One will gain you 4 extra FPS in Crysis at max res with super duper ultra quality for $300 more.



Moore's law has just about run its course.


Chelone
Outside The Asylum
#35 - 2011-11-19 13:53:09 UTC
Pretty sure the only point of this thread was so the OP would have an excuse to stick his system specs in our face.
neamiah
The Scope
Gallente Federation
#36 - 2011-11-20 01:34:15 UTC  |  Edited by: neamiah
Chelone wrote:
Pretty sure the only point of this thread was so the OP would have an excuse to stick his system specs in our face.



Um... wow... seriously?

It's just too bad that only 2 people in this thread had any truly useful information, as opposed to belittling someone in the thread. I'm sorry some of you are so lonely and self-centered.

Anyway, I figured out that what I posted above - setting the nVidia control panel settings to default - did the trick.

*** Except for the fact that CQ still runs the card super hot. That will take some engine adjustments in the game. ***

Hopefully, CCP will have mercy on us on that issue... Has anyone managed to solve the problem with CQ??
Tanya Powers
Doomheim
#37 - 2011-11-20 01:44:56 UTC
Whoosh ... hmm, not sure my answer will ever help you, but please take some time to think about it.

First of all, you live in a fracking world where everyone is the best when they have the best of... UNTRUE!!

Think about upgrading your stuff - drivers, memory, fan cleaning, or adding a fan for better heat dissipation. Your graphics card can go a very long road, and you will hug your little fracking planet by extending your graphics card's life; better than buying another hard-core, best-of, newest fracking latest thing on the market that will be obsolete the moment you buy it.

Choose something known for long-term investment - improvements through drivers and whatnot - over something "RIGHT NAO".

Choose something that respects the value of "salvaged" old stuff, respects the environment, your health, and energy consumption, and you'll have something really interesting, offering awesome effects for nothing really expensive.

Be responsible; everything you get by acting like this will be good.

I'm out Lol
Corina Jarr
en Welle Shipping Inc.
#38 - 2011-11-20 01:57:42 UTC
Don't know if it is an issue for the OP, but it may help others who have heating and fan noise issues.
Nvidia drivers have had a problem since v180 where the automatic fan control sometimes doesn't kick in until 70-80°C, at which point the fan spins up really high to try to cool the card down.

Try using EVGA Precision, as it can override the driver's control of the fan (and actually works, unlike Nvidia's control panel).
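The late-kicking automatic control described above is essentially a bad fan curve; what tools like EVGA Precision (or MSI Afterburner) let you set is a custom one. A minimal sketch of how such a curve maps temperature to fan duty (the threshold points here are made-up illustrative values, not Nvidia's defaults):

```python
def fan_duty(temp_c, curve=((40, 30), (60, 50), (80, 100))):
    """Map a GPU temperature (°C) to a fan duty (%) by linear
    interpolation over (temperature, duty) points, clamping below the
    first point and above the last one."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(70))   # halfway between 60°C/50% and 80°C/100% -> 75.0
print(fan_duty(85))   # past the last point -> pegged at 100
```

Starting the ramp at a lower temperature like this keeps the card from sitting near 80°C before the fan reacts, which is exactly the failure mode being described.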
neamiah
The Scope
Gallente Federation
#39 - 2011-11-20 02:17:38 UTC
Cool.... thanks for the last two thoughtful comments. Hard to come by around here...

Just because the rig has some nice stuff in it doesn't make it the latest and greatest. The GTX 470s have been out for a while now, over a year I believe. Again, this thing runs everything I throw at it the way it's configured, and runs it fairly quietly, especially since the 285.62 driver was released. I just think CCP are still tweaking some things, and certainly nVidia and ATi couldn't care less.

CJ, I use MSI Afterburner for that very reason, and for the fact that the profile feature actually works. But you may be on to something in the area of fan control via software. I'm gonna mess with that and see what I come up with.

Thanks for that!

Keep em coming...