> My new Rig, Xmas and reward money come together to make something beautiful
Roani52
post Feb 1 2009, 12:36 PM
Post #31


Artillery
Group Icon

Group: Members
Posts: 987
Joined: 22-October 06
From: Netherlands
Member No.: 90
Alliance: GDI
Favorite game: Tiberian Sun



lol... 206BW, but the maximum resolution is still the same. And I've already played Crysis tongue.gif
Of course there's the 'coolness' factor, and it helps for future games. I've done 1.5 years with this 8600GTS, and well... it's not good enough for many games (I want EVERYTHING at high).
I think the GTX295 will do for at least a few years.


--------------------

/\ TY Bittah! /\

If you've X-Fire... Just add me If you join 'WhatPulse', also join 'Alpha Squad'!!! (My Profile: 'Here')


"Computers crash, people die and relationships fall apart. The only thing you can do is taking a deep breath and reboot"
Sir Modsalot
post Feb 1 2009, 08:57 PM
Post #32


Wolverine
Group Icon

Group: Members
Posts: 280
Joined: 21-October 06
From: Everywhere and nowhere!
Member No.: 19
Alliance: GDI
Favorite game: Tiberium Wars



QUOTE (Roani52 @ Jan 31 2009, 02:32 AM) *
It doesn't have an 8-pin PCIe connector, which is required for all high-end GPUs (which suck up power).


Not really. The only graphics cards that require an 8-pin are the GTX 295, GTX 280, and Radeon HD4870X2. The 55nm GeForce GTX285 and GTX260 only need two 6-pin connectors; same with the Radeon HD4850X2. Even then, graphics power requirements are quite exaggerated these days. My 400 watt Liberty could run a 9800GTX+ and stay stable if I wanted; it provides 40 amps combined and has the necessary dual PCIe power connectors, something no other 400 watt PSU has. That is, unless I also wanted to step up to a 90+ watt quad core.
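The headroom reasoning above can be sketched as a quick power-budget check. All the wattage and amperage figures here are illustrative assumptions, not official specs for any particular card or PSU:

```python
# Rough PSU headroom check: does the estimated system load fit within
# the supply's total wattage and its combined 12V rail amperage?

def psu_can_power(psu_watts, rail_amps_12v, gpu_watts, cpu_watts, other_watts=75):
    """Return True if the estimated load fits the PSU's budget.

    Assumes the GPU and CPU draw almost entirely from the 12V rails,
    plus a flat allowance for drives, fans, and the motherboard.
    """
    total_load = gpu_watts + cpu_watts + other_watts
    amps_needed = total_load / 12.0
    return total_load <= psu_watts and amps_needed <= rail_amps_12v

# ~400W PSU with 40A combined on 12V, a ~141W GPU, a ~65W dual core:
print(psu_can_power(400, 40, 141, 65))   # True: plenty of margin
# Step up to a ~236W GPU and a ~95W quad core and the budget breaks:
print(psu_can_power(400, 40, 236, 95))   # False
```

The flat 75W allowance and the per-part wattages are guesses; real builds should check the rail labels on the PSU itself.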

This post has been edited by Sir Modsalot: Feb 1 2009, 09:02 PM


--------------------




QUOTE("DCoder at PPM")
There is no sanity left in this thread.


Time is an illusion. The only reason we accept it is to keep everything organized. The only real time is party time.
ChielScape
post Feb 2 2009, 07:33 PM
Post #33


SSM Launcher
Group Icon

Group: Members
Posts: 799
Joined: 21-October 06
From: In ur BIOS, steeln ur Megahurtz!
Member No.: 33
Alliance: GDI
Favorite game: Tiberian Sun



sir, make sure to get the very latest firmware for that 1.5TB. the release prior to the latest instantly bricks it, and the one before that bricks it over a few weeks to months.


--------------------
Admin Note: Please refer to signature rules.
Sir Modsalot
post Feb 2 2009, 07:39 PM
Post #34


Wolverine
Group Icon

Group: Members
Posts: 280
Joined: 21-October 06
From: Everywhere and nowhere!
Member No.: 19
Alliance: GDI
Favorite game: Tiberium Wars



You act like I pay no attention to OCN's news posts; of course I know to update the firmware for it.


The DvD
post Feb 3 2009, 07:26 PM
Post #35


Webmaster
Group Icon

Group: Root Admin
Posts: 740
Joined: 27-May 06
From: The Netherlands
Member No.: 2
Alliance: Nod
Favorite game: Tiberian Sun



Roani, I still think the GTX295 might be a pretty bad investment, especially for future games. On ultra-high settings it is already losing to the six-month-older Radeon 4870X2; the smaller amount of video memory and nVidia's inferior memory management seem to be to blame. Also, history has shown that dual-GPU solutions tend to be outperformed by newer and usually cheaper single-GPU solutions as time passes and newer games are released. I think your best option at the moment is a Radeon 4870 with 1 GB. Or if you can find a good deal on a GTX280, that would be fine too; their price has fallen significantly since the introduction of the slightly faster GTX285.


--------------------
Please contact me on msn if you need me for anything, thanks.
Roani52
post Feb 3 2009, 08:54 PM
Post #36


Artillery
Group Icon

Group: Members
Posts: 987
Joined: 22-October 06
From: Netherlands
Member No.: 90
Alliance: GDI
Favorite game: Tiberian Sun



First of all, I need to convince my dad that I actually DO need (well... want, but that doesn't matter) a new GFX.
I'm not feeling much for Chiel's method:
"1: disconnect VGA fan
2: play crysis at high
3: ???
4: PROFIT!"


ChielScape
post Feb 4 2009, 08:04 PM
Post #37


SSM Launcher
Group Icon

Group: Members
Posts: 799
Joined: 21-October 06
From: In ur BIOS, steeln ur Megahurtz!
Member No.: 33
Alliance: GDI
Favorite game: Tiberian Sun



still think it's a good method. you could also try pulling the power cable and show it doesn't boot tongue.gif


Sir Modsalot
post Feb 6 2009, 08:26 PM
Post #38


Wolverine
Group Icon

Group: Members
Posts: 280
Joined: 21-October 06
From: Everywhere and nowhere!
Member No.: 19
Alliance: GDI
Favorite game: Tiberium Wars



QUOTE (The DvD @ Feb 3 2009, 12:26 PM) *
On ultra-high settings it is already losing to the six-month-older Radeon 4870X2; the smaller amount of video memory and nVidia's inferior memory management seem to be to blame.


Only in Fallout 3 and high-res 3DMark Vantage tests have I seen the GTX295 lose to the Radeon 4870X2. I might also mention that the frame buffer on the 4870X2 can only be accessed by one GPU at a time, so it's simply mirrored between them; this means you can only effectively use 1GB of the X2's frame buffer. nVidia doesn't have inferior memory management, they simply realize that GDDR3 still has more than enough bandwidth to handle any of today's apps/games. Maybe in the new GTX3** series nVidia will move to GDDR5, but until they do, GDDR3 will do just fine. Also remember that driver updates can mean a 2-5 frame per second difference in some games. My rule of thumb is that official video card reviews test with drivers at least two versions older than the newest ones available when the review is published, which means anywhere from a 3-10 frame per second margin of error in the reviews versus picking up a card and immediately updating it to the newest driver set.
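That rule of thumb can be written out as simple arithmetic. The per-version FPS gain range here is an assumption for illustration, not a measured figure:

```python
# Estimate what a review's FPS number might look like today, given that
# the review was run on drivers N versions behind the current ones.

def review_fps_range(review_fps, versions_behind, gain_per_version=(1.5, 2.5)):
    """Return a (low, high) FPS estimate after N driver updates,
    assuming each update adds gain_per_version[0]..[1] FPS."""
    lo, hi = gain_per_version
    return (review_fps + lo * versions_behind,
            review_fps + hi * versions_behind)

# A review showing 45 FPS, tested two driver versions back:
print(review_fps_range(45, 2))  # (48.0, 50.0)
```

In practice driver gains are game-specific and sometimes zero, so this is only a rough sanity band for reading old benchmarks.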

This post has been edited by Sir Modsalot: Feb 6 2009, 08:31 PM


ChielScape
post Feb 6 2009, 09:23 PM
Post #39


SSM Launcher
Group Icon

Group: Members
Posts: 799
Joined: 21-October 06
From: In ur BIOS, steeln ur Megahurtz!
Member No.: 33
Alliance: GDI
Favorite game: Tiberian Sun



QUOTE (Sir Modsalot @ Feb 6 2009, 09:26 PM) *
I might also mention that the frame buffer on the 4870X2 can only be accessed by one GPU at a time, so it's simply mirrored between them.

this happens to NV's SLI-on-a-stick just as well as ATI's X-fire-on-a-stick. the 295 still has just half of what it's got actually available, and this will be the case until either of them uses Hydra on their dual-GPU cards.
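The "half of what it's got" point boils down to one division: with a mirrored frame buffer, each GPU holds its own copy of the same data, so only one GPU's share is effectively usable. A minimal sketch (the GTX 295's 2 x 896 MB layout is used as the example):

```python
# Effective frame buffer on a dual-GPU card whose memory is mirrored
# per GPU (classic SLI/CrossFire-on-a-stick behavior).

def effective_vram(total_mb, num_gpus, mirrored=True):
    """Usable frame buffer in MB; mirrored memory divides by GPU count."""
    return total_mb // num_gpus if mirrored else total_mb

# GTX 295: 1792 MB on the box, 896 MB effectively usable:
print(effective_vram(1792, 2))                    # 896
# A hypothetical shared-memory design would keep the full amount:
print(effective_vram(1792, 2, mirrored=False))    # 1792
```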


Roani52
post Feb 6 2009, 09:55 PM
Post #40


Artillery
Group Icon

Group: Members
Posts: 987
Joined: 22-October 06
From: Netherlands
Member No.: 90
Alliance: GDI
Favorite game: Tiberian Sun



QUOTE (ChielScape @ Feb 4 2009, 09:04 PM) *
still think its a good method. you could also try the power cable and show it doesnt boot tongue.gif
I assume you mean the PCIe cable?

Oh, and another problem with the GTX295 is that it is in fact SLI, and SLI combined with multiple monitors doesn't work in XP (probably for the same reason DX10 won't work in XP).
A friend of mine has this problem, so I might be saying something that's not completely true; if so, correct me.

This post has been edited by Roani52: Feb 6 2009, 09:57 PM


ChielScape
post Feb 7 2009, 06:27 PM
Post #41


SSM Launcher
Group Icon

Group: Members
Posts: 799
Joined: 21-October 06
From: In ur BIOS, steeln ur Megahurtz!
Member No.: 33
Alliance: GDI
Favorite game: Tiberian Sun



QUOTE (Roani52 @ Feb 6 2009, 10:55 PM) *
I assume you mean the PCIe cable?

Oh, and another problem with the GTX295 is that it is in fact SLI, and SLI combined with multiple monitors doesn't work in XP (probably for the same reason DX10 won't work in XP).
A friend of mine has this problem, so I might be saying something that's not completely true; if so, correct me.

hurrrrr durrrrrrr, use vista >_>


The DvD
post Feb 8 2009, 09:07 PM
Post #42


Webmaster
Group Icon

Group: Root Admin
Posts: 740
Joined: 27-May 06
From: The Netherlands
Member No.: 2
Alliance: Nod
Favorite game: Tiberian Sun



QUOTE (Sir Modsalot @ Feb 6 2009, 09:26 PM) *
nVidia doesn't have inferior memory management, they simply realize that GDDR3 still has more than enough power to handle any of today's apps/games.


They do. The memory type has nothing to do with memory management; that's a driver thing. At the moment, nVidia's drivers just need a bit more video memory to draw the same scenes as ATi's drivers. There's also a difference in swapping behavior, I believe. Anyway, it's not really important; both options are useless for Roani.

QUOTE (Roani52 @ Feb 6 2009, 10:55 PM) *
Oh, and another problem with the GTX295 is that it is in fact SLI, and SLI and multi monitor doesn't work in XP (probably because of the same reason why DX10 won't work in XP).
A friend of me has this problem. So, I might say something not completely true, if so, correct me.


SLI works just fine in XP, but nVidia might not pay much attention to their XP drivers anymore, since the benchmark sites all use Vista. So for the newest games, SLI support might come to XP later than to Vista.


Roani52
post Feb 9 2009, 01:58 PM
Post #43


Artillery
Group Icon

Group: Members
Posts: 987
Joined: 22-October 06
From: Netherlands
Member No.: 90
Alliance: GDI
Favorite game: Tiberian Sun



Maybe I should give installing Vista another try...
(The black taskbar is cool; I use WindowBlinds to give my XP a Vista skin. Downside: performance suffers under WindowBlinds, so it's time to find a Vista skin for StyleXP.)


