LtCmdr_Tsusai wrote:
Vinia wrote:
Well, that leads to a serious new point. It seems to be DirectX 10, or the way the new generation of graphics cards deals with it, which doesn't bode well for people's hopes of a fix for the subsequent series... I thought DirectX was supposed to be backwards compatible? If it is backwards compatible, then the structure of the cards is preventing it... or something (note: not a tech head...).

It really has nothing to do with DX10 itself. I run a DX10.1 ATi card (HD3850) in my portable game computer, and indoors runs just fine. As for the 2600XT, beats me; you might want to driver-clean your Nvidia stuff off *shrug*. For those who want to say "Oh! You're running 10.1!" - it doesn't matter:

"The spec revision basically makes a number of things that are optional in DX10 compulsory under the new standard - such as 32-bit floating point filtering, as opposed to the 16-bit current. 4xAA is a compulsory standard to support in 10.1, whereas graphics vendors can pick and choose their anti-aliasing support currently." http://www.theinquirer.net/en/inqui...ith-101-upgrade

Remember: DX9 in Vista is not DX9 in XP. Different files and the like. To even take advantage of the 10.1 features, you have to develop the application for Vista using 10.1 API calls. It's still just the 8-series cards and their XP driver.

Are you running Vista or XP SP2, Tsusai? I ask because I ran a preliminary test with an HD3850 on XP SP2 and experienced some of the same issues as with the 8xxx-series Nvidia cards.
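To make Tsusai's point about 10.1 concrete: an application only gets Direct3D 10.1 behaviour if it explicitly asks the runtime for that feature level. Here is a minimal sketch of what that request looks like - standard D3D10.1 API calls, not MxO code:

// Minimal sketch, not MxO code: explicitly requesting a Direct3D 10.1 device.
// D3D10CreateDevice1 and the 10.1 feature levels only exist with the Vista-era
// 10.1 runtime, which is why an app has to be built against these calls to get
// any 10.1 behaviour at all.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* CreateDevice()
{
    ID3D10Device1* device = nullptr;

    // Ask for feature level 10.1 first (compulsory 32-bit FP filtering, 4xAA, ...).
    HRESULT hr = D3D10CreateDevice1(
        nullptr,                          // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,
        nullptr,                          // no software rasterizer module
        0,                                // no creation flags
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_1_SDK_VERSION,
        &device);

    // Fall back to plain 10.0 if the card or driver only supports DX10.
    if (FAILED(hr))
        hr = D3D10CreateDevice1(
            nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
            D3D10_FEATURE_LEVEL_10_0, D3D10_1_SDK_VERSION, &device);

    return SUCCEEDED(hr) ? device : nullptr;
}

Neither of those calls exists on XP at all - the XP runtime tops out at Direct3D 9 - which is why the discussion keeps coming back to the 8-series cards and their XP driver rather than to DX10 or 10.1.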
OK devs... what changes between inside and outside when it comes to rendering rules? Are "outside" walls different from "inside" walls? When an area is "indoors", does the engine attempt to render/load all of the textures/polygons in that indoor area? Obviously there is something different about inside areas compared to outside ones. I don't mean to sound harsh, but it's pretty obvious this is where the rendering engine is having an issue - not with DX10 cards as a whole, or even with the 8800 series as a whole. The bug only shows up under specific conditions, so isn't it most likely inside the code and not inside the drivers?
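The MxO engine's internals aren't public, so as a purely hypothetical illustration of the kind of indoor/outdoor split being asked about: engines of that era commonly switch visibility strategies at the doorway, culling a coarse world grid outdoors but walking a cell/portal graph indoors. The Cell/portal names below are made up for the sketch.

// Purely illustrative sketch - not the MxO engine - of an indoor/outdoor split.
// The two paths submit geometry in different patterns, and a driver that
// mishandles only one of them would produce exactly this kind of indoors-only bug.
#include <vector>

struct Mesh { /* vertex/index buffers would live here */ };

struct Cell {                       // one interior room/volume
    std::vector<Mesh> geometry;
    std::vector<Cell*> portals;     // rooms visible through doorways
};

void Submit(const Mesh&) { /* issue the draw call */ }

// Outdoor path: walk the coarse world grid (frustum culling omitted for brevity).
void RenderOutdoors(const std::vector<Mesh>& worldGrid)
{
    for (const Mesh& m : worldGrid)
        Submit(m);
}

// Indoor path: recurse through portals from the cell the camera stands in.
void RenderIndoors(const Cell& cell, int depth = 0)
{
    if (depth > 4) return;          // stop runaway recursion through portal loops
    for (const Mesh& m : cell.geometry)
        Submit(m);
    for (const Cell* next : cell.portals)
        RenderIndoors(*next, depth + 1);
}

Even if the bug only appears on the indoor path, that alone doesn't say whether the fault sits in that code or in how the 8800 XP driver handles the state and batching that path happens to use - which is why both the devs and Nvidia end up involved.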
Well, I'm getting a Radeon IceQ3 3870, so it's all ending for me!
You guys are looking at it as if the devs haven't done anything about this. A lot of dev time has already been spent trying to find a setting that could get MxO working with the 8000 series, without success. Unfortunately, none of our devs know the graphics engine's internals, and they aren't trained to fix that kind of problem either. They've exhausted all the possibilities on their end, and we have to hope Nvidia can find something on theirs.