NVIDIA GeForce 8600-Series Details Unveiled


Dark Star

Innovator
NVIDIA's GeForce 8600GTS and 8600GT are G84-based GPUs, while the GeForce 8500GT is G86-based. The budget-priced trio features full support for DirectX 10, including pixel and vertex shader model 4.0, as well as NVIDIA SLI and PureVideo technologies. NVIDIA touts three dedicated video engines on the G84- and G86-based graphics cards, providing MPEG-2 high-definition and WMV HD video playback at resolutions up to 1080p. G84 and G86 support hardware-accelerated decoding of H.264 video as well (no mention of VC-1 decoding). G84 and G86 also feature advanced video post-processing algorithms, including spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal / 5-tap vertical video scaling.
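
NVIDIA doesn't publish the actual filter kernels, so here is a minimal sketch of what "4-tap horizontal scaling" means in practice. The Catmull-Rom weights are an assumption on my part, just one common 4-tap choice:

```python
# Minimal sketch of 4-tap horizontal video scaling on one grayscale row.
# The Catmull-Rom kernel is an assumption -- NVIDIA does not publish the
# actual filter weights used by PureVideo.

def catmull_rom(t):
    """Catmull-Rom weight for a sample at distance t from the output position."""
    t = abs(t)
    if t < 1.0:
        return 1.5 * t**3 - 2.5 * t**2 + 1.0
    if t < 2.0:
        return -0.5 * t**3 + 2.5 * t**2 - 4.0 * t + 2.0
    return 0.0

def scale_row_4tap(row, out_width):
    """Resample one row of pixel values to out_width using 4 input taps."""
    in_width = len(row)
    step = in_width / out_width
    out = []
    for x in range(out_width):
        src = (x + 0.5) * step - 0.5            # source position of this output pixel
        base = int(src)                         # sample index nearest the position
        acc = wsum = 0.0
        for tap in range(base - 1, base + 3):   # the 4 taps around the position
            clamped = min(max(tap, 0), in_width - 1)
            w = catmull_rom(src - tap)
            acc += w * row[clamped]
            wsum += w
        out.append(acc / wsum)                  # normalise so the weights sum to 1
    return out

print(scale_row_4tap([0, 64, 128, 192, 255], 9))
```

The 5-tap vertical case is the same idea run down columns, with one extra tap.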

GeForce 8600GTS PCIe x16

* Eight-layer PCB, measures 7.2" x 4.376"
* 675 MHz GPU and a 128-bit bus
* 256MB of GDDR3 memory clocked at 1000 MHz
* Requires external PCIe power; estimated total board power consumption: ~71 watts
* Video outputs include dual dual-link DVI, VGA, SDTV and HDTV; analog video inputs are also supported



GeForce 8600GT PCIe x16

* Six-layer PCB, measures 6.9" x 4.376"
* 540 MHz GPU and a 128-bit bus
* 128MB or 256MB of GDDR3 memory clocked at 700 MHz
* External PCIe power not required; maximum board power consumption: ~43 watts
* Supports the same video outputs as the 8600GTS, but has no video input features


G84-based GPUs do not support a native HDMI output, but manufacturers can adapt one of the DVI outputs for HDMI. HDCP support is optional and up to NVIDIA's add-in board partners. NVIDIA has revealed very little about the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well. Expect NVIDIA to pull the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter, in time to take on AMD's upcoming RV630 and RV610.
News source: DailyTech
 
Well, the 256MB on the 8600GTS will be a major bottleneck in future games.

But hell, where are the future games?

This is amazing power for the targeted price ($200 GT and $250 GTS, I guess?)
 
NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus.

The GeForce 8600GT's memory configuration is more flexible, letting manufacturers choose between 128MB and 256MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz, and the 8600GT shares the same 128-bit memory interface as the 8600GTS.

wtf?! 128 bit???

And no news about the 8600UL still..
 
Yeah, if you wanna game at 1600-class resolutions then 256MB will be a bottleneck, but it should do fine at 1024 resolutions...

And the 128-bit is obvious: building a 256-bit bus increases cost steeply, and bus width and RAM are the two most expensive items on a graphics PCB...
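
Rough framebuffer arithmetic backs that up. A back-of-the-envelope sketch; the buffer layout is simplified and the 4xAA/double-buffer assumptions are mine:

```python
# Back-of-the-envelope VRAM use: 32-bit color + depth buffers, double-buffered,
# with 4x multisampling. Simplified layout; real drivers allocate more.

def framebuffer_mb(width, height, msaa=4, buffers=2, bytes_pp=4):
    color = width * height * bytes_pp * msaa * buffers
    depth = width * height * bytes_pp * msaa
    return (color + depth) / 2**20

for w, h in [(1024, 768), (1600, 1200)]:
    fb = framebuffer_mb(w, h)
    print(f"{w}x{h} 4xAA: ~{fb:.0f} MB framebuffer, "
          f"~{256 - fb:.0f} MB of a 256 MB card left for textures")
```

Roughly 36 MB at 1024x768 versus roughly 88 MB at 1600x1200, before a single texture is loaded.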
 
Well, high-end cards are moving to a 512-bit bus, so it's logical for mainstream to move to 256-bit... It's been 5+ years since we first started seeing a 256-bit bus...
It should be mainstream soon, IMHO...
 
1000 MHz x2 for the GTS, so that's 675 core / 2 GHz effective RAM. That's pretty impressive; I'll need to get a couple and see how far they OC.

It looks like the 7600GT might be more powerful than the 8600GT with its higher core clock; that's a little disappointing.
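
For anyone who wants that arithmetic spelled out: DDR moves data on both clock edges, so effective rate = 2x the memory clock, and bandwidth = effective rate x bus width / 8. The 8800GTX line uses its published 384-bit / 900 MHz specs, which aren't quoted in this thread:

```python
# Bandwidth (GB/s) = effective data rate (MHz) * bus width (bits) / 8 / 1000.

def bandwidth_gbs(mem_clock_mhz, bus_bits):
    effective = 2 * mem_clock_mhz           # GDDR3 is double data rate
    return effective * bus_bits / 8 / 1000

for name, clock, bus in [("8600GTS", 1000, 128),
                         ("8600GT",   700, 128),
                         ("8800GTX",  900, 384)]:
    print(f"{name}: {2 * clock} MHz effective, {bandwidth_gbs(clock, bus):.1f} GB/s")
```

So the GTS gets 32 GB/s; the faster memory clock claws back some of what the narrow bus gives up.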
 
Nvidia's pride and joy for $200 will be based on 256MB of GDDR3 memory, either from Samsung or Qimonda. The clock for the GPU is set at 675MHz, while the memory works at a cool 1000MHz DDR - that's 2 GHz in marketing lingo, 200 MHz faster than on the 8800GTX. This is possible because the G84 GPU uses a 128-bit memory controller and does not load the memory hard, so memory errors are much less likely to occur and partners will be able to clock the chips even faster than the current default clock.

You can see that this part eats a lot more power than the previous 7600GT, so external power has to be connected via a 6-pin PCIe connector, and the PCB has been changed from the classic 7600 design to a new, heavier eight-layer one.

The biggest SNAFU that Graphzilla has right now is the situation with HDCP, or the lack of it. G86 supports HDCP natively; G84 has issues. Thus, partners will have to buy additional crypto-key EEPROM chips in order for the GF8600GTS to support practically any protected HD content out there.

GeForce 8600GT

The GeForce 8600GT is your typical 150 Dollar/Euro solution, but it offers quite a decent performance boost over the 7600GT, let alone the 7600GS and similar variants. This one is based on the typical 7600 PCB, so you can expect affordable pricing thanks to the low-cost six-layer board. Handing partners a well-known PCB design lets those companies be flexible with clock speeds, so you will see a lot of "Golden Samples" coming in the late April/May timeframe.

The memory clock has been cut down to 700 MHz DDR (1.4 GHz effective) and the memory type is, of course, GDDR3. The GPU itself is clocked at 540 MHz, but since it is the same chip as the one inside the 8600GTS, your overclocking headroom is limited only by the power available from the PCIe x16 slot (75W max). The board is designed around the 43W mark, so the GPU should be able to hit 600 MHz or more.
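
A crude sketch of that headroom argument, assuming (naively) that board power scales linearly with GPU clock at fixed voltage; real overclocks usually need extra voltage, which hurts quadratically, so treat the ceiling as an optimistic upper bound:

```python
# Naive headroom estimate: assume board power scales linearly with GPU clock.
# Optimistic -- dynamic power actually goes as V^2 * f, and overclocks often
# need more voltage. Purely illustrative.

STOCK_CLOCK_MHZ = 540   # 8600GT GPU clock
STOCK_POWER_W   = 43    # quoted board power
SLOT_LIMIT_W    = 75    # PCIe x16 slot, no external connector

ceiling = STOCK_CLOCK_MHZ * SLOT_LIMIT_W / STOCK_POWER_W
print(f"Naive linear ceiling: ~{ceiling:.0f} MHz before the 75 W slot limit")
```

Even with much more pessimistic scaling, 600 MHz sits comfortably inside that budget.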

GeForce 8500GT

The new entry-level model comes with 128MB or 256MB of memory, and partners say it uses a 128-bit bus... which conflicts with some information we had from before.

The GeForce 8500GT is based on the G86 chip, which made it all the way to the final A2 revision. Even though nV likes to keep the details of this entry-level GPU under wraps, we managed to find a lot of interesting details. The memory controller should fix the error the company made with the 7300 chips (in this area, G86 holds firm ground against RV610), and that wider bus will be paired with either GDDR3 or DDR2 memory (expect very few GDDR3 ones). Pricewise, the company wants to sell 256MB models for around 100 Dollar/Euro, while 128MB should retail for 79.

INQ strips down Nvidia 8600GT/GTS, 8500GT
 
Talked with a graphics card dealer here and he said that the 8600 will come by the first week of May and will start at 14K...
 
Some initial scores ...

All numbers were measured on an Intel X6800 machine running the latest 3DMark.

The raw scores are 5500 3DMark06 for the GTS and 4800 for the GT, blurred slightly to protect the guilty. That is the good part. The bad part is when you start using heavy textures, performance drops off notably, and I mean notably. Think cliff.

This is most likely because of two things, drivers or bus width. NV drivers are still pretty badly broken, and upcoming cards are probably less of a priority than getting the ones they released almost 6 months ago functional.

The other thing it could be is the narrow memory bus simply choking on all the data. If this is the case, don't look for improvements; this card will always be a benchmark special.
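
A rough sanity check of the bus-choking theory; everything here except the ~32 GB/s supply figure worked out earlier in the thread is an illustrative assumption:

```python
# Rough texture-bandwidth demand vs. the 8600GTS's ~32 GB/s supply.
# The workload figures are illustrative assumptions, not measurements.

BANDWIDTH_GBS = 32.0    # 128-bit bus @ 2 GHz effective

def demand_gbs(width, height, fps, overdraw, bytes_per_pixel_traffic):
    """Bytes per second the memory bus must move for this workload."""
    pixels = width * height * fps * overdraw
    return pixels * bytes_per_pixel_traffic / 1e9

light = demand_gbs(1280, 1024, 60, 3, 24)    # modest color/z/texture traffic
heavy = demand_gbs(1280, 1024, 60, 3, 160)   # many fat, poorly-cached textures

for label, d in [("light textures", light), ("heavy textures", heavy)]:
    verdict = "fits" if d < BANDWIDTH_GBS else "exceeds"
    print(f"{label}: ~{d:.1f} GB/s needed, {verdict} {BANDWIDTH_GBS:.0f} GB/s")
```

If the per-pixel traffic balloons like that under heavy textures, a performance cliff is exactly what you would expect.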

8600GTS/GT benchmarked
 
NGOHQ drivers fixed a lot of problems with the Forceware drivers, but Nvidia has been giving them problems for a while now... They certainly stopped my 7600s from crashing all the time, and I think they gave me a good 5-10 fps increase in SLI. Omega drivers are good too.

I don't know why Nvidia hasn't switched to a 256-bit bus, but it will be interesting to see how many pipelines they can fit on the 8600GT/GTS. If it's 12-16 I'll be pissed. I think they will need an 8600 version (GTX?) in the 320-512MB range to really attract attention for these cards at the price. Maybe they'll make one once they realize that.
 
There are no longer pixel pipes; talk about stream processors and shaders :P

There won't be a 320MB version for cards other than the 8800..

I think it'll be better than the 7900GS and close to the 7800GTX..
 
As far as I know, overclocked versions will start flooding the market after ATI is done with its launch. The Inq did mention this, I guess. But whatever gives... 1280x1024 seems OK with the 8600GTS.. let's see if SLI with these babies can be as good as an 8800GTX.
 
SLI = effectively a 256-bit interface & 512MB of RAM...

The higher clock speed should help greatly with performance.

A collection of floating-point processors making up the GPU... interesting.
So the more flexible stream processors are individually less capable than dedicated units (pixel shaders, vertex shaders), but the sheer number and higher clock can more than make up for it. Cool. How many on the 8600s? If it's 32-48 I'll be pissed.
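
That trade-off is just throughput arithmetic. A sketch using the SP counts floated in the next post; the 2 flops/SP/clock figure (one multiply-add) and the use of the core clock are my assumptions, since G80-class SPs actually run on a separate, faster shader clock:

```python
# Peak shader throughput = SPs * clock * flops per SP per clock.
# SP counts are the rumoured figures from this thread; flops/clock = 2
# (one multiply-add) and the core clock are assumptions.

def peak_gflops(stream_processors, clock_mhz, flops_per_clock=2):
    return stream_processors * clock_mhz * flops_per_clock / 1000

print(f"8600GTS (64 SP @ 675 MHz): ~{peak_gflops(64, 675):.0f} GFLOPS")
print(f"8600GT  (48 SP @ 540 MHz): ~{peak_gflops(48, 540):.0f} GFLOPS")
```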
 
You should be pissed then; last time I heard it's 64 for the GTS (or 54?) and 48 for the GT..

SLI... I don't have a high opinion of it. IMO it's for people who buy Alienware..
Now, where are the Go versions..
 
WE GOT UPDATES on the possible delay of G84 and G86. It looks like the April 17th date is on, but if they are going to launch with parts available on that day, you might be better off avoiding them.

The problem as we understand it is that the 2D modes are not clocked down to where they should be. NV has a 2D clock that is a lot lower than the 3D clock, which saves battery power and, in general, makes things run cooler and quieter.

The bug, we are told, prevents them from clocking the 2D mode down below the 3D level, increasing power draw substantially. Basically, when the parts should be clocked down in 2D mode, where they spend most of their life, they instead run flat out.
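
That matches what the standard CMOS dynamic-power relation, P ∝ C·V²·f, predicts; the clocks and voltages below are purely illustrative, not measured 8600 figures:

```python
# Dynamic power scales as P = C * V^2 * f: halving the clock alone halves
# dynamic power, and dropping voltage helps quadratically on top of that.
# All numbers are illustrative.

def relative_power(freq_mhz, volts, base_freq=675.0, base_volts=1.2):
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

print(f"3D, 675 MHz @ 1.2 V: {relative_power(675, 1.2):.2f}x")
print(f"2D, 300 MHz @ 1.2 V: {relative_power(300, 1.2):.2f}x (clock drop only)")
print(f"2D, 300 MHz @ 1.0 V: {relative_power(300, 1.0):.2f}x (clock + voltage drop)")
```

A part stuck at its 3D clock in 2D mode forfeits all of that saving, which is the whole problem.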

Before you get one of these in a laptop or SFF machine, make sure this is not happening in the ones you buy.

Essentially it is game on, but beware Gen 1 parts. Now you know the why, not just the when.

G84 and G86 problems explained
 