|
Post by Kokusho the Evening Star on Sept 8, 2010 3:04:18 GMT -5
Since I didn't study computer programming in college, my knowledge of this is limited, so I'd better not tweak things and just be happy with what I've got. Anyway, I have to be careful which options I choose, because going all the way toward performance costs some image quality; in particular, the Left 4 Dead 2 movie-poster images become badly pixelated.
|
|
|
Post by The Italian Dragon on Sept 8, 2010 7:58:06 GMT -5
Well Kokusho, I don't know much about it either, but I taught myself and it gives some quite nice results.
|
|
Clone
Maturing Dragon
That one dragon with no name
Posts: 2,243
|
Post by Clone on Sept 8, 2010 12:29:45 GMT -5
Well, a good rule of thumb is that the lower the textures, the worse a game looks, but the polygon count (model detail level) can be dropped to minimum without losing much visual quality. For example, Crysis with max textures but minimum polygons will run much better than with both on max, and it won't lose as much visual quality as if you had lowered the textures instead. On my laptop I run everything at minimum but leave the textures up high, and I get some really good results.
The best way is just to mess with the settings and see if you like the results. (It's how I taught myself most of this; I only look up something like 20% of my knowledge.)
|
|
|
Post by Kokusho the Evening Star on Sept 8, 2010 13:29:36 GMT -5
Hmm, good point about using fewer polygons. That's nice, but I thought more polygons meant smoother models. And yes, I remember StarCraft II has a Model Quality option; it may not say anything about polygons, but is that one of them? I wish both of you would at least give StarCraft II a try; that game revived the RTS genre while the others could not. And even if you don't play it, at least you could help me explore the Settings section.
|
|
|
Post by The Italian Dragon on Sept 8, 2010 17:44:39 GMT -5
The polygon option can be a good idea for slower systems.
|
|
|
Post by Kokusho the Evening Star on Sept 9, 2010 6:00:29 GMT -5
Mine should be a sophisticated machine, but oh well; too bad a graphics card normally isn't as powerful in a laptop as it is in a desktop.
|
|
Clone
Maturing Dragon
That one dragon with no name
Posts: 2,243
|
Post by Clone on Sept 9, 2010 8:29:41 GMT -5
Yes, that's the correct option. The thing with visuals is to think of it like this: textures first, particles second, polygons third, and filtering fourth.
The first thing to go if you're on a slow system is filtering (AA, AF). The second, if your speed still isn't good, is polygon density (model detail, draw distance). Still not fast enough? Drop particles (you normally don't need to go really low on these; this is just if slowdowns still occur during action). Still not working? (You need a better card.) Drop the textures; this results in the worst, largest decrease in visual quality, which is why it comes last.
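The drop-order described above can be sketched as a tiny loop. This is purely illustrative; the setting names, FPS numbers, and per-notch costs below are made up, not from any real game, but it shows the idea of lowering the cheapest-looking settings first and touching textures last:

```python
# Sketch of the tuning order described above: lower settings one notch
# at a time, cheapest visual loss first, until the frame rate is OK.
# All names and numbers here are illustrative, not a real game's API.

DROP_ORDER = ["filtering", "polygon_detail", "particles", "textures"]

def tune(settings, measure_fps, target_fps=30):
    """Lower settings in DROP_ORDER until measure_fps(settings) >= target_fps."""
    for knob in DROP_ORDER:
        while settings[knob] > 0 and measure_fps(settings) < target_fps:
            settings[knob] -= 1          # one notch lower
    return settings

# Toy benchmark: pretend each notch saved on each setting buys some FPS.
COST = {"filtering": 4, "polygon_detail": 3, "particles": 2, "textures": 5}

def fake_fps(settings):
    base = 18                            # fps with everything maxed (level 3)
    saved = sum(COST[k] * (3 - v) for k, v in settings.items())
    return base + saved

result = tune({k: 3 for k in DROP_ORDER}, fake_fps)
print(result, fake_fps(result))
```

In this toy run, dropping filtering alone reaches the target, so polygons, particles, and textures stay maxed, which is exactly the outcome the advice above aims for.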
Also, from a processing standpoint, the average GPU can process far more texture samples than it can polygons (shaders fall in between textures and polygons). It's only with the DX11 generation of GPUs that polygon processing power has actually made a jump forward. (The ATI 5xxx series has more shader and texture power than the Nvidia 4xx cards, but its polygon throughput is something like 40% less than the 4xx Nvidia cards; why Nvidia decided to drop some texture units I will never know.)
|
|
|
Post by Kokusho the Evening Star on Sept 9, 2010 14:27:10 GMT -5
I also remember the last time one of you mentioned that the GeForce 8800 or 9800 could beat most modern ATI cards. Here's a little graphics scenario: at the end of Left 4 Dead 2's The Parish, when the bridge explodes, those two cards work like a charm and the explosion plays smoothly. I'd bet those cards were in desktops. Mine, on the other hand, is the HD 5470; it's supposed to play smoothly as well, but unfortunately it still lags ONLY in that scene, and yes, it's in a laptop instead of a desktop. So let me ask some simple math: if you compare my HD 5470 side by side with the same model number, one made for laptops and one for desktops, which is more powerful? The obvious answer is the desktop, but I'd like to know how much performance the laptop version loses.

Desktop version: 100% full performance
Laptop version: ??% performance lost

PS: Speaking of Left 4 Dead 2, my settings are all maxed except the resolution, which I toned down from 1920x1080 to 1600x900; it works fine except at the bridge explosion, where it lags. (Left 4 Dead is at all-max settings with the resolution at 1280x720; gameplay is super smooth, and it even works well with triple-buffered VSync and Multicore Rendering set to Enabled.) In L4D2, on the other hand, I had everything high at 1600x900; I'm now tweaking everything down to match the first game to see whether I can replicate the result, and if not, I might have to turn off VSync and Multicore Rendering.

EDIT #2: OK, it looks like the bridge explosion is getting smoother, with only a little lag, after I set VSync to "Disabled" and turned off "Multicore Rendering". By the way, have you considered playing around with "Paged Pool Memory Available"? My current setting for that one is "High". So what are your recommendations?

My other settings that might also eat into smoothness are anti-aliasing at the maximum, "4X MSAA", and "Anisotropic 16X" filtering. Don't forget that my new resolution is now 1280x720; this smaller resolution will hopefully keep the lag at bay. Well? If you need extra guidance on each of the game's settings, here's the guide I found from Garena. But I get the feeling most of the lag happens when the situation is really intense, most obviously in The Parish's bridge finale.
gaming-extreme.info/index.php?/topic/2-left-4-dead-2-nonsteam-guide-updates-and-patches/
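For what it's worth, the resolution steps being discussed translate directly into pixel counts, which is roughly what the GPU's fill rate has to cover each frame. A quick back-of-the-envelope comparison (pixel ratio only; it ignores geometry, CPU load, and memory effects):

```python
# Pixels per frame for the resolutions discussed in this thread.
# Fill-rate load scales roughly with pixel count, so this is only
# a ballpark for how much cheaper each step down is.
resolutions = [(1920, 1080), (1600, 900), (1280, 720)]

base = 1920 * 1080
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.0%} of 1080p)")
```

The drop from 1080p to 720p cuts the shaded pixel count by more than half, which is why it helps so much on a fill-rate-limited mobile card.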
|
|
Clone
Maturing Dragon
That one dragon with no name
Posts: 2,243
|
Post by Clone on Sept 9, 2010 19:45:33 GMT -5
First off, just disable AA, crank your resolution up to native (1080), and keep Vsync off. (Vsync will decrease your frame rate: if it can't hit 60 FPS, it drops your rate to 30 FPS, and so on.) AA is unnecessary at 1080p; while it's nice to play with it on, it's not really going to improve your experience the way it would on a 480p or even a 720p monitor/TV.

Keep the paged pool at maximum; that tells the game it has lots of room to work with and will give the best performance. Smaller resolutions only help with lag if you're AA'ing the image OR running out of video memory (like I was on my 8800GT with Crysis). So if you disable AA (which I suggest), you should lose most of the lag (if not all of it) and even be able to raise the resolution you play at. (There's nothing like running a game at your screen's native resolution.)

Corrections: according to the performance charts I've read for your GPU, which normally comes with GDDR3, your L4D performance should be around 40 FPS at that resolution without AA and AF. (You can turn AF on anyway, because it's not too taxing on the GPU.) Take my resolution suggestions lightly, though, because I don't know the speed or quantity of your VRAM. If you can tell me those, I can give you better suggestions. (Use GPU-Z if you don't know.)
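The warning above about Vsync dropping you from 60 straight to 30 follows from how double-buffered Vsync presents frames only on refresh boundaries. A rough model (an idealized sketch; real drivers, and triple buffering in particular, behave differently):

```python
import math

def vsync_fps(raw_fps, refresh_hz=60):
    """Idealized double-buffered Vsync: each finished frame waits for
    the next screen refresh, so the displayed rate snaps down to
    refresh/1, refresh/2, refresh/3, ... (60, 30, 20, ... at 60 Hz).
    Real drivers and triple buffering behave differently."""
    if raw_fps <= 0:
        return 0.0
    refreshes_per_frame = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / refreshes_per_frame

print(vsync_fps(75))   # capped at the 60 Hz refresh
print(vsync_fps(55))   # just misses 60, so it snaps down to 30
print(vsync_fps(29))   # just misses 30, so it snaps down to 20
```

So a card rendering in the 40s or 50s gets shown at 30 FPS with Vsync on, which matches the behavior described in this thread.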
|
|
|
Post by Kokusho the Evening Star on Sept 10, 2010 3:43:01 GMT -5
I have trouble understanding how a graphics card works; I know processors better. Anyway, is there a big difference with AA on versus off? From 3D work, I know that leaving it off makes your models look more pixelated and jagged around the edges, so I keep it on just to look pretty, though I'm not really sure. Anyway, my L4D settings can stay the way they are since I'm running without any lag problems; all I need to do is wait for The Sacrifice DLC and see whether it lags my L4D, and if it does, I'll turn off VSync. It makes me wonder why on earth people created Vertical Sync when it actually does more harm than good. People think turning it on will solve their problems, but it actually creates more, so true gamers usually turn it off.
|
|
Clone
Maturing Dragon
That one dragon with no name
Posts: 2,243
|
Post by Clone on Sept 10, 2010 10:59:38 GMT -5
Just turn off Vsync anyway; it's inhibiting your gameplay. It should be locking you to just 30 FPS if you're only rendering something in the 40s range. I never used Vsync on my 8800GT (Halo and Halo 2 were exceptions), because with it on, my gameplay would sometimes just die in heavy situations. With my new card, I average something like 100 FPS and my monitor has problems with that many frames (I get tearing); that's the only reason I turned it on. With your card, leave Vsync off and you should have a much smoother gameplay experience (give it a try and let me know if you dip below 30 FPS). And, as my suggestions always go, turn off AA and go to your native resolution; the image will actually look better. Please download GPU-Z and just upload an image of what it reads out so that I can make better suggestions. LINK
|
|
|
Post by Kokusho the Evening Star on Sept 10, 2010 15:41:40 GMT -5
Very well, here you go:
|
|
Clone
Maturing Dragon
That one dragon with no name
Posts: 2,243
|
Post by Clone on Sept 11, 2010 1:20:09 GMT -5
Well, sadly, your gigapixel speed and bandwidth are lower... (I suspected so.)
You may want to turn off AA entirely. You'll get a better gaming experience from a smooth frame rate than from an AA'ed slide show. To be frank, even the 8800GT, with 10.4 GP/s and a 60.8 GB/s bus, had some frame-rate glitches in L4D with AA enabled. (The frame rate would occasionally drop below 30, but I suspect that was a result of either CPU or game-engine problems at the time.)
The good news, though, is that your GPU is well balanced from the standpoint of its parts' performance. My laptop is very bandwidth-choked, and the GTX 4xx cards are choked in DX9 and DX10 games by their reduced texturing units compared to the previous generation. (In polygon performance, however, the GTX 4xx cards are stupidly fast compared to the 2xx cards.)
I'd assume optimal would be no AA, max textures and post-processing, and a resolution around the 720p standard (and obviously no Vsync). However, you could try running at higher resolutions and see what happens. (The lower resolution is a consequence of your memory's lower bandwidth, and no AA a consequence of the core's lower processing power.)
In the end, be glad you've got GDDR3; according to the specifications, your card could have been outfitted with DDR2, and that's not even GDDR2-grade! Slow GDDR2 VRAM is what killed my laptop's performance, so just think about how limited you'd be if that were the case for you! (Low, low resolution, anyone?)
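The bandwidth figures being compared here come from a simple formula: bus width in bytes times the effective transfer rate. A sketch (the 60.8 GB/s line matches the 8800GT number quoted above, assuming a 256-bit bus at 1900 MT/s effective; the 64-bit mobile clocks are illustrative examples, not exact HD 5470 specs):

```python
def bandwidth_gb_s(bus_bits, effective_mt_s):
    """Peak memory bandwidth: bus width (bytes) x effective rate (MT/s)."""
    return bus_bits / 8 * effective_mt_s / 1000  # -> GB/s

# 8800GT-class desktop card: wide 256-bit bus with fast GDDR3.
print(bandwidth_gb_s(256, 1900))  # 60.8
# A 64-bit mobile bus with GDDR3 vs DDR2 (example clocks): the narrow
# bus alone costs a lot, and slow DDR2 makes it far worse.
print(bandwidth_gb_s(64, 1600))
print(bandwidth_gb_s(64, 800))
```

That kind of gap is why a desktop 8800GT can push resolutions that a low-end mobile part, especially a DDR2-equipped one, simply can't feed.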
|
|
|
Post by Kokusho the Evening Star on Sept 11, 2010 8:51:58 GMT -5
Ah, I see. Well, at least things are going fine; hopefully this forum doesn't die, since we may come back here for technical discussions.
Anyway, normally your 8800GT shouldn't be able to handle such high-end stuff compared to more modern models. Say it's your 8800GT versus a 9000-series card: normally the 9000-series card should be the higher-end one, or even my card should be way better than yours, with everything at max settings being no problem.
Yeah, whatever; sometimes I just don't fully understand graphics cards.
|
|
Clone
Maturing Dragon
That one dragon with no name
Posts: 2,243
|
Post by Clone on Sept 11, 2010 12:46:30 GMT -5
Yes, normally a newer card is faster; however, the better standard to measure performance by is actual in-game frame rates, or the raw numbers. For example, despite its older core, my 8800GT can still process more information per clock than yours and is thus faster; however, if you test how much AF and AA bog a card down, your 5xxx-class card will be the winner. Still, the huge bandwidth advantage my 8800GT has over yours lets it run most games at any resolution it desires, whereas you're skimming the line at 720p in some games. Notebookcheck has some numbers on your card that should interest you: without AA, AF, and Vsync, they got a stable frame rate over 30 FPS in L4D. www.notebookcheck.net/ATI-Mobility-Radeon-HD-5470.23698.0.html

I've been doing some testing on my laptop, but so far the Counter-Strike data is inconclusive on how much resolution changes affect frame rates. Some games I've run on it can only manage low resolutions, while others show no hindrance running at different resolutions. (CS shows -20 FPS under identical settings with just a change in resolution, but even then the benchmark system is still giving me 40+ FPS, where in-game I'm skimming 30 with all settings on low.)

----new----
Well, this test is even more inconclusive than the last. This one was run using the Crysis Demo (64-bit) on my desktop, and it actually showed performance increasing as the resolution went up rather than down. I can only assume this is a result of 64-bit processing limitations. (AA=1, AF=1, all settings max quality, including drivers.)

Resolution-----Min-----Max-----Average
800x600:--------0------38------24
1360x768:-------9------62------32
1920x1080:------0------89------35

(The zeros are from loading pauses.) Odd, isn't it? I'm going to have to run further tests... (but some time later)
|
|