Questions & Answers
-= Next-generation video card topic =- Part 1
- I'd like to collect all the [i:53caed98d7]"next-generation video card"[/i:53caed98d7] rumors here.
Here's already some news about Nvidia's NV30, ATI's R300, 3Dlabs' P10 and other upcoming cards.
[quote:53caed98d7]Since some sites are poking out info regarding the NV30, I might as well spill what I know. You might think this is just rumors, but this is straight from Nvidia, who held a seminar for some top-end developers. So if these end up being wrong, blame Nvidia, as they are the ones who held the seminar. Here is the info:
[i:53caed98d7]I caught a couple of NV30 specs for you. First the RAM will be running at 900Mhz. Secondly, they are claiming at this point 200 million polys per second.[/i:53caed98d7]
Will Nvidia be using the DDR II memory that is on Samsung's product description pages? We will have to wait for that info, but it's apparent that Nvidia is banking on mass-produced 900MHz modules from Samsung. The 200 million polys per second is almost double that of the GeForce 4. I believe the GeForce 4 is what? 124? or something around that. Here is some info as well about the R300 from an OpenGL developer:
[i:53caed98d7]I've also recently been chatting to one of the persons at ATi. He got his first R300 about 3 and a half weeks ago. He won't give me any performance numbers (yet). He's claiming that based on the performance he's been getting, it's going to destroy everything else that's coming out.[/i:53caed98d7]
Like I said from an ATi employee that's a given about the performance… [/quote:53caed98d7]
3dchipset.com is known for NOT joining in on the rumors that are constantly being thrown around the web, so the fact that this is on their site tells me there's a fair amount of truth in it.
900MHz memory should be possible with DDR-II technology.
My System |
[i:53caed98d7][b:53caed98d7]User formally known as GeForce Mike[/b:53caed98d7][/i:53caed98d7]
<font size=-1>[ This message was edited by: Red Dragon on 2002-05-09 23:30 ]</font>
- Yes, an incredible amount of rumors is being poured over our heads again, but there's often some truth in them. Exciting times once more.
ATI claims the R300 will be the fastest card on the market, Matrox says it's coming with a card that will be twice as fast as the GF4 Ti4600, and 3Dlabs is coming with a GPU that sounds promising.
These are still broad claims, because they can post certain theoretical specs as a reference that aren't decisive in real games.
For now, Nvidia is the only one at the top that actually delivers and has godlike drivers. I'm very curious about the 14th, though there doesn't seem to be a benchmark party coming that day.
[ This message was edited by: SkinnerEd on 2002-05-10 02:45 ]
- Well, it remains guesswork, but there's a good chance we'll already see the first benchmarks on the 14th. They're not announcing it this big for nothing :smile:
On 10-05-2002 13:49 FlvanSon wrote:
It is indeed "formerly": previously, in earlier days, before.
- David Kirk on the NV30
[quote:ca6dd0dc18]I can't share any technical details on the NV30 memory architecture, but I would answer by pointing out that a wider data bus gives more bandwidth, but at a cost: a 256-bit wide architecture is very complex and expensive too. High-end boards are very expensive. Asking customers to pay a higher price to integrate a 256-bit wide bus is overkill at the moment[/quote:ca6dd0dc18]
- And so we cheerfully continue with the rumors:
[quote:2f4c6e96f9][b:2f4c6e96f9]Nvidia and ATI to showcase new NV18 and RV250 chips at Computex Taipei in June[/b:2f4c6e96f9]
War in the graphics chip market is expected to erupt again after mid-2002, as industry heavyweights Nvidia and ATI Technologies are set to introduce their new-generation cores, the NV18 and RV250, respectively, in late July.
According to the companies’ roadmaps, both Nvidia and ATI said that they will showcase NV18 and RV250-based chips at Computex Taipei early next month, and the chips will replace the current GeForce4 MX440 and Radeon 7500 as the companies’ mainstream products. Their top-end products, the DirectX 9-supporting NV30 and R300 graphics cores, will not be launched until the end of the third quarter at the earliest.
ATI claimed that its RV250-core chips will be on market shelves in late July, and both its RV250 and R300-based products will hit the market earlier than Nvidia’s NV18 and NV30 chips. However, sources from downstream companies revealed that the two designers’ product launch schedules are not likely to be very different.
Industry sources said that with product transition becoming faster, the chips’ ASP (average selling price) has experienced declines at a much quicker pace. As a result, an intensive rollout schedule is now regarded as a profit-maintaining tool to many chip designers. With the profits received during the initial stage of the new product launch, the companies may be better able to face the fierce price competition later.
Besides Nvidia and ATI, graphics chip designer Matrox Graphics also plans to introduce its new-generation G1000 series. The Montreal, Canada-based designer will launch two products, the Millennium G1000 and Millennium G1000+, for the desktop segment and two for the workstation segment.[/quote:2f4c6e96f9]
Here's the NV18 core once again:
- I wanted to spend my [b:a603df1869]3000th[/b:a603df1869] post on the NV30 rumors :grin: :grin:
as read on http://www.3dchipset.com:
[quote:a603df1869]Noticed this over at the x3dfx message board and felt like commenting on it. At first glance, the biggest thing I see wrong here is the transistor count. The Ti500 had 57M, the Ti4600 had 63M and only added an extra shader.. 73 million transistors seems way too low for this list of specs. However, with the T&L unit no longer in the GPU, that would lower the amount of transistors. Next.. the 900MHz memory I had and have a tough time swallowing; this 750 seems a little more reasonable, although I was going for 800MHz as my first thought for the NV30. Glide???.. who knows, lol, that's just silly if you ask me, but it would be easy enough to implement AND it would bring more of the last few 3dfx'ers into the NVIDIA domain. However, the name NvBlur?.. That seems a little lame, eh? hah. Although the majority of 3dfx users that I know are already planning on the NV30 because of the 3dfx tech to be in it.
Anyway, this is actually old and regurgitated rumor mill stuff, but I thought you might find it interesting or at least entertaining. Hey when you have nothing to lose and nothing to gain, why not make a post that has no known fact to it? lol
-AGP 4x and 8x
-Copper Core GPU Processor, 0.13µm, default clock 450 MHz
-73 million transistors
-128 MB DDR-SDRAM @ 375 MHz x2 = 750MHz and 256-bit (yes, 256-bit)
-DirectX 9.x compatible
-Quad Vertex Shaders
-Dual Pixel Shaders
-Lightspeed Memory Architecture III
-Quad Cache memory caching subsystem
-Dual Z-buffers for greater lossless compression in the visibility subsystem
-Advanced Hardware Anisotropic Filtering - A new 12nvx mode that delivers improved subpixel coverage, texture placement and quality.
-12nvx mode should deliver 80% better subpixel coverage and placement than previous modes
-4 Textures/Pixel in a single clock cycle, loop back can also apply an additional 4 textures per rendering pass
-A separate second copper processor, also running at 450MHz on 0.13µm - handles all T&L, now called TT&L (True Time and Lighting)
-NVAutoShaper, a new precached and precacheable system
-Programmers can also add shapes to be temporarily stored in precache to speed things up
-Backward-compatible with Glide games thanks to NvBlur (Yes, a new version of Glide!)
-NView Multi-display Technology
Rumors running wild because theres nothing better to do…[/quote:a603df1869]
So that was it again… a lot of nonsense of course; we won't see the real specs for another few months, I think.
- I have different specs here :lol:
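For what it's worth, the memory figures in the quote above can at least be sanity-checked with simple arithmetic. This is a minimal sketch (the helper names are mine, not from the post); it assumes the usual DDR convention that the effective rate is twice the base clock, and counts 1 GB/s as 10^9 bytes/s:

```python
# Peak memory bandwidth implied by the quoted NV30 rumor figures.
# Assumption (not from the post): DDR transfers data on both clock
# edges, so the effective rate is 2x the base clock.

def effective_clock_mhz(base_mhz: int, pumping: int = 2) -> int:
    """Effective transfer rate in MHz for double-pumped (DDR) memory."""
    return base_mhz * pumping

def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s = bytes per transfer * transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

eff = effective_clock_mhz(375)      # 375 MHz x2 = 750 MHz, as the post says
print(eff)                          # 750
print(bandwidth_gbs(256, eff))      # a 256-bit bus at 750 MHz -> 24.0 GB/s
```

So if both the 256-bit bus and the 750MHz figure were true, the card would have about 24 GB/s of peak bandwidth, well above the GeForce4 Ti4600's roughly 10 GB/s.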
NV30:
400MHz GPU - 8 pipeline engine - .13Micron
800~1000MHz DDR / QDR memory
LMA3 on 128MB~256MB Video Memory.
DirectX9 - OpenGL 1.3
NV35:
500MHz GPU - 8 pipeline engine - .13Micron
1000~1200MHz DDR / QDR memory
LMA3 on 128MB~256MB Video Memory.
DirectX 9.1 - OpenGL 2.0
I'll go through these GPUs in a bit of detail with everyone,
-First, the NV30 is a brand new architecture. That means no more GeForce title, and no more internal guts which were repeated all the way through the GeForce architecture.
-The GeForce 4 was 256-bit, while the NV30 is 512-bit.
-The GeForce 4 was built on .15Micron technology while the NV30 is built on .13Micron Technology.
-GeForce4 is limited to 4096x4096 texture sizes while the NV30 has no fixed maximum texture size, meaning 16384x16384 and higher can now be possible.
-NV30 also supports future resolutions of up to 2560x1920 with 64-bit colour (These are 100% rumors, so don't be let down if the final product doesn't support 64-bit colour). The whole rumor of 64-bit started when John Carmack, almost two years ago, opened his mouth about Doom3's 64-bit support through the help of next-generation Nvidia architecture.
-GeForce 4 features AGP4X while NV30 has full AGP8X support.
-NV30 supports memory bandwidth up to 16GB/s (1GHz).
-PCI-X support (1GB/s), the advanced PCI interface that's an alternative to AGP8X (2GB/s).
-This product is expected to be announced in August 2002 and is expected to hit store shelves by late September, early October.
As the CEO announces this wonderful new revolutionary GPU advancement, consumers continue to purchase inferior products such as the GeForce4.
Now, I'll go through the key new features that NV35 will bring;
-NV35 will hit the 500MHz barrier on .13Micron technology. To do this, Nvidia is removing the T&L unit from the GPU and putting it externally. This allows for a larger heat spreader and better heat transfer from the GPU.
-By removing the T&L unit, Nvidia is building on a new technology called TT&L which runs at 3/2 the GPU speed. (EX: If your GPU ran at 200MHz, the T&L unit would run at 300MHz.) In the NV35's case, the dedicated T&L unit will run at 750MHz.
-Expected memory bandwidth of the NV35 is between 20~24GB/s.
-3DFX SLI technology will be implemented in higher models (A.K.A Workstation models)
-This product is expected to be announced in January 2003, shipping by March.
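The bandwidth and TT&L clock claims above can be cross-checked with the same simple formula (a sketch with hypothetical function names; 1 GB/s taken as 10^9 bytes/s):

```python
# Cross-check the bandwidth and TT&L clock claims in the list above.

def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s = bytes per transfer * transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

def ttl_clock_mhz(gpu_mhz: float) -> float:
    """The post claims the external TT&L unit runs at 3/2 the GPU clock."""
    return gpu_mhz * 3 / 2

print(bandwidth_gbs(128, 1000))   # "16GB/s (1GHz)" matches a 128-bit bus: 16.0
print(bandwidth_gbs(256, 1000))   # a 256-bit bus at 1 GHz would give 32.0
print(bandwidth_gbs(256, 750))    # NV35's 20~24 GB/s fits 256 bits at ~750 MHz: 24.0
print(ttl_clock_mhz(500))         # 3/2 x a 500 MHz GPU -> 750.0
```

Note that the claimed 16 GB/s at 1 GHz only works out for a 128-bit bus, not the 256-bit bus claimed elsewhere, which is one more reason to treat these spec lists as rumor.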
- More nonsense (though I wouldn't mind having 1200 MHz RAM on my video card :grin: )
- Nice, even more bullshit.. especially those NV35 specs are fun :grin: :grin:
One says a 256-bit memory bus, another says 1GHz DDR.
I find the 256-bit bus more realistic..
Bring on those nonsense specs; I'm enjoying them.
- NV18 goes DirectX 8:
[quote:1e73c39a12]GRAPHICS CHIP GIANT Nvidia is already working on its NV18 and will time its introduction to hit ATI with its RV250, which is being deliberately postponed for maximum impact, the INQUIRER has learned.
Here's some background to this. We've asked Nvidia quite a few questions about Geforce 4MX support and future support for some important up-and-coming games.
NV17, we suggested, should be a DirectX 8 part because quite a few of these games will require that kind of support for a good end user experience.
One senior Nvidia executive told us that we can expect the nextgen MX products to be DirectX8 compatible.
This seems quite a logical step to us and we're speculating that a new MX could have more than two pipelines, with perhaps four the likely option, as NV30 would appear to be shooting for eight.
The MX is always a cut-down version of Nvidia chips, and that makes it cheap and cheerful, but at the same time the MX family is the breadwinner, not the extremely expensive Geforce 4 TI 4600 at the high end.
DirectX 8 hardware is about to become a necessary component because of the long awaited games coming up, so we think Nvidia may well introduce this new MX part by the end of the year. It would also make sense to assume that this chip will be launched side by side with the NV30, that we expect to see being introduced in late summer or early autumn.
As the SIS Xabre cards are already DirectX 8 compatible and almost ready for market we can clearly say that the time of affordable DirectX 8 hardware has come. But we will also see DirectX 9 products and hardware from Matrox in the shape of the Parhelia, the NV30, and RV250, we think. µ[/quote:1e73c39a12]
- So nVidia has become a victim of its own hype for DirectX 8 hardware…..
I hadn't expected the GF4 MX to be replaced this soon. Unless the GF4 MX is going to become the absolute low end.
- Maybe the GeForce 4 MX will replace the current GeForce 2 MX line.. but they have to: the SiS Xabre also delivers DirectX 8, and the RV250 will probably be a budget DirectX 8 card too, not a successor to the R8500.
- Found some more about the NV30… SOURCE
[quote:8f5eaea0ef]By Fuad Abazovic, 10/04/2002 11:39:11 BST
IF NVIDIA IS still taking advice on NV30, we have a little contribution which it might like to bear in mind.
When the firm introduced Geforce 3 and then the TI series of cards we learned that some Nvidia staffers should try harder to provide better picture quality, particularly for 2D quality for run-of-the-mill desktop applications.
With GeForce 4, or rather when NV25 was still being discussed, it took some outside feedback on board and the cards have better picture quality for 2D than on previous introductions, but we feel it is still lagging a little behind its competition.
Nvidia itself told us that it had performed some tests with outside people in focus groups, and when they compared Nvidia and ATI cards on similar displays, side by side, many reacted by saying the GeForce 4 picture was either better or just as good.
You can tell that GeForce 4 has better picture quality than Nvidia's previous cards, and especially its highly successful GeForce 2 MX.
It's possible to tell whether a system is using GeForce 2MX without knowing what's inside a PC.
We think that although Nvidia holds the performance crown with its Geforce 4 Ti cards, Radeons still have a better picture although the Gainward Golden Sample 750 has a noticeably better picture than other GeForce 4 implementations we've seen.
The NV30 is a chip where we should be able to see some important impact by ex-3DFX engineers. These cards were renowned for good picture quality.
ATI, on the other hand, needs to work on its 3D performance, as its 2D picture quality is pretty good, but we'll be interested to see what the up and coming R300 brings to the party.[/quote:8f5eaea0ef]
According to this information, you'd think they'll put more energy into 2D performance when designing the NV30. I'd love to hear your thoughts on this…
- I find the image quality of the GF4 quite good. I've been working with a Leadtek GF4 Ti4600 for a while now and there's absolutely nothing to complain about. An ATI is definitely not better. I do find the colors of the Matrox G550 better than those of the Leadtek.
For me it's Matrox first, then the rest.
On the other hand, I think the whole "3dfx tech" phenomenon is a slogan invented by the various hardware sites. As if Nvidia were at a dead end and couldn't come up with anything anymore…
The NV30 will be a wicked card, and I have yet to see whether ATI is really that much better… Let ATI first give an answer to the GF4 Ti before talking about the NV30. After all, they haven't exactly proven reliable given the promises that were made. Making hardware is one thing; suitable, working software is part two. Both are needed to make a product successful.
- On Reactor Critical I found the following news submission:
Apparently NVidia is going to introduce the new versions of its GeForce4-family graphics processors at Computex, Taipei. According to unofficial information we learned from Hardware.Fr, the new GPUs will have only one difference compared to their predecessors: AGP 3.0 support.
SiS has been demonstrating its SiS648 since CeBIT (see this news-story for details) and is possibly going to officially announce it at Computex. VIA Technologies has recently announced the Apollo P4X333 supporting DDR333 and AGP 8X. Therefore, motherboards that have AGP 8X in their specs will definitely hit the stores sometime after Computex. In fact, SiS Xabre is the only GPU that has AGP 8X declared (see this news-story for details), so it is quite logical for NVidia to make a formal announcement of AGP 3.0 GPUs, for instance the NV18 and NV28, even if they are simply a GeForce4 MX and a GeForce4 Ti respectively. We believe it is also quite possible that the new solutions will have some other improvements as well.
In fact, AGP 3.0 is going to be massive this year: the Matrox Parhelia-512, 3Dlabs P10, Nvidia NV30 and ATI R300 will definitely need higher AGP bandwidth to perform well. Keeping in mind Taiwanese core-logic developers, we can also note that AMD, NVidia and Intel are likely to launch their chipsets with AGP 8X support this Fall or in early 2003.
Coming back to Computex, we should admit that it will be quite interesting this year: NVidia is going to showcase its NV18/NV28 GPUs, and ATI Technologies is likely to announce its RV250 and possibly also the R300, or at least show the latter behind closed doors (see this news-story for details).
The article speaks for itself; the PR machine is running at full speed again to convince us of the necessity of AGP 8x.
Apparently we'll have to wait another generation before AGP 8x proves useful.
- Forget the GF4. Save up for an NV35!!!!
- This will probably be old news, but:
NV30 This August
[i:284355d5e9]Speaking today at a chip conference in San Francisco, NVIDIA President and CEO Jen-Hsun Huang reiterated that a new graphics chip from NVIDIA is slated to arrive in August.
[b:284355d5e9]The new chip will be manufactured on Taiwan Semiconductor Manufacturing Co.'s latest 0.13-micron manufacturing process, Huang said. Huang did not reveal the name or specific features of the chip, but did say it was a fundamentally new architecture from the GeForce 4 Titanium introduced earlier this year.[/b:284355d5e9]
Huang also commented on the arbitration with Microsoft over the pricing of Xbox components, issues with the Securities and Exchange Commission, the Macintosh market, and the follow-on to the nForce chipset. Thanks for the link, Ryan.[/i:284355d5e9]
Source: Pinkeltje goes Hollywood
Answer this question
This is an archived page. Replying is no longer possible.