Question & Answer

Video cards and monitors

-= Next-generation videocard Topic =- Part 1

335 replies
  • [img:0a6ac569d4]http://www.plaudersmilies.de/elaine.gif[/img:0a6ac569d4]
  • LOL :D
  • I've already pre-ordered it :D :lol: :D :) 8) :P
  • Forget the GF4. Saving up for an NV35!!!! :D
  • Surely it must be? I can't imagine ATI coming with a DX9 card and nV then showing up with a GF4 on steroids… :???:
  • They're busy touting D3D now, but with Doom 3 it's all about OpenGL; hopefully the NV30 is good at that :-?
  • I'd like to collect all the [i:53caed98d7]"next-generation videocard"[/i:53caed98d7] rumours in here

    Here's already some news about nVidia's NV30, ATi's R300, 3Dlabs' P10 and other upcoming cards

    [quote:53caed98d7]Since some sites are poking out info regarding the NV30, I might as well spill what I know. You might think this is just rumour, but this is straight from Nvidia, who held a seminar for some top-end developers. So if these end up being wrong, blame Nvidia, as they are the ones who held the seminar. Here is the info:

    [i:53caed98d7]I caught a couple of NV30 specs for you. First, the RAM will be running at 900MHz. Secondly, they are claiming at this point 200 million polys per second.[/i:53caed98d7]

    Will Nvidia be using the DDR II memory that is on Samsung's product description pages? We will have to wait for that info, but it's apparent that Nvidia is banking on mass-produced 900MHz modules from Samsung. The 200 million polys per second is almost double that of the GeForce 4. I believe the GeForce 4 is what? 124? or something around that. Here is some info as well about the R300 from an OpenGL developer:

    [i:53caed98d7]I've also recently been chatting to one of the people at ATi. He got his first R300 about three and a half weeks ago. He won't give me any performance numbers (yet). He's claiming that, based on the performance he's been getting, it's going to destroy everything else that's coming out.[/i:53caed98d7]

    Like I said, coming from an ATi employee, that's a given about the performance… [/quote:53caed98d7]


    3dchipset.com is known for NOT joining in on the rumours that are constantly thrown around the web, so the fact that this is on their site tells me there's a fair amount of truth in it.

    900MHz memory should be possible with DDR II tech.

    [i:53caed98d7][b:53caed98d7]User formally known as GeForce Mike[/b:53caed98d7][/i:53caed98d7]

    <font size=-1>[ This message was edited by: Red Dragon on 2002-05-09 23:30 ]</font>
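A quick way to sanity-check the rumoured "900MHz" Samsung modules (a minimal sketch with assumed values, nothing official): DDR-style memory transfers data on both clock edges, so a 900MHz effective rate corresponds to a 450MHz base clock.

```python
# Sanity check of the rumoured memory figures (assumed values, not specs):
# DDR transfers data on both clock edges, so the "900MHz" figure would be
# an effective rate on a 450MHz base clock.

def effective_rate_mhz(base_clock_mhz: float, transfers_per_clock: int = 2) -> float:
    """Effective data rate of a double-data-rate memory interface."""
    return base_clock_mhz * transfers_per_clock

def peak_bandwidth_gb_s(rate_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 10**9 bytes)."""
    return rate_mhz * 1e6 * (bus_width_bits // 8) / 1e9

print(effective_rate_mhz(450))        # 900 effective from a 450MHz base clock
print(peak_bandwidth_gb_s(900, 128))  # 14.4 GB/s on an assumed 128-bit bus
```

The 128-bit bus width here is an assumption for illustration; the quote gives no bus width.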
  • Yes, an unbelievable amount of rumours is being poured over our heads again, but there's often some truth in them; exciting times once more.

    ATI claims it will put the fastest card on the market with the R300, Matrox says it's coming with a card that will be twice as fast as the GF4 Ti4600, and 3Dlabs is coming with a GPU that sounds promising

    These are still broad claims, because as a reference they can post certain theoretical specs that aren't decisive in real games.

    For now nVidia is the only one at the top that actually delivers and has GODlike drivers. I'm very curious about the 14th, although there doesn't seem to be a benchmark party coming then just yet.

    [ This message was edited by: SkinnerEd on 2002-05-10 02:45 ]
  • Well, it remains guesswork, but there's a good chance we'll see the first benches as early as the 14th. They're not announcing it this big for nothing :smile:
  • [offtopic]
  • [quote:6e8b055cc1]
    On 10-05-2002 13:49, FlvanSon wrote:

    thnx :grin:

    it is indeed "formerly": previously, in earlier times, before[/quote:6e8b055cc1]
  • David Kirk on the NV30

    [quote:ca6dd0dc18]I can't share any technical details on the NV30 memory architecture, but I would answer by pointing out that a wider data bus gives more bandwidth, but at a cost: a 256-bit wide architecture is very complex and expensive too. High-end boards are very expensive. Asking customers to pay a higher price to integrate a 256-bit wide bus is overkill at the moment[/quote:ca6dd0dc18]

    Source: http://www.nvnews.net/
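Kirk's trade-off is easy to put in numbers. A minimal sketch (hypothetical clocks and widths, not NV30 specs): peak bandwidth scales linearly with bus width, so a 256-bit bus doubles throughput at the same memory clock, at the cost of a more complex and expensive board.

```python
# Illustration of Kirk's trade-off (hypothetical numbers, not NV30 specs):
# bandwidth = bytes per transfer * transfer rate, so doubling the bus width
# buys the same bandwidth as doubling the memory clock.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfer rate."""
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

narrow = peak_bandwidth_gb_s(128, 900)  # narrow bus, fast (expensive) memory
wide = peak_bandwidth_gb_s(256, 450)    # wide bus, slower memory, pricier board
print(narrow, wide)                     # two routes to the same 14.4 GB/s
```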
  • Offered on eBay:


    there are even people who have actually bid on it :roll:

  • And so we merrily carry on with the rumours:

    [quote:2f4c6e96f9][b:2f4c6e96f9]Nvidia and ATI to showcase new NV18 and RV250 chips at Computex Taipei in June[/b:2f4c6e96f9]
    War in the graphics chip market is expected to erupt again after mid-2002, as industry heavyweights Nvidia and ATI Technologies are set to introduce their new-generation cores, the NV18 and RV250, respectively, in late July.

    According to the companies’ roadmaps, both Nvidia and ATI said that they will showcase NV18 and RV250-based chips at Computex Taipei early next month, and the chips will replace the current GeForce4 MX440 and Radeon 7500 as the companies’ mainstream products. Their top-end products, the DirectX 9-supporting NV30 and R300 graphics cores, will not be launched until the end of the third quarter at the earliest.

    ATI claimed that its RV250-core chips will be on market shelves in late July, and both its RV250 and R300-based products will hit the market earlier than Nvidia’s NV18 and NV30 chips. However, sources from downstream companies revealed that the two designers’ product launch schedules are not likely to be very different.

    Industry sources said that with product transition becoming faster, the chips’ ASP (average selling price) has experienced declines at a much quicker pace. As a result, an intensive rollout schedule is now regarded as a profit-maintaining tool to many chip designers. With the profits received during the initial stage of the new product launch, the companies may be better able to face the fierce price competition later.

    Besides Nvidia and ATI, graphics chip designer Matrox Graphics also plans to introduce its new-generation G1000 series. The Montreal, Canada-based designer will launch two products, the Millennium G1000 and Millennium G1000+, for the desktop segment and two for the workstation segment.[/quote:2f4c6e96f9]


    Here's the NV18 core once more:

  • I wanted to spend my [b:a603df1869]3000th[/b:a603df1869] post on the NV30 rumours :grin: :grin:

    read on http://www.3dchipset.com:

    [quote:a603df1869]Noticed this over at the x3dfx message board and felt like commenting on it. At first glance, the biggest thing I see wrong here is the transistor count. The Ti500 had 57M, the Ti4600 had 63M and only added an extra shader.. 73 million transistors seems way too low for this list of specs. However, with the T&L unit no longer in the GPU, that would lower the amount of transistors. Next.. the 900MHz memory I had, and have, a tough time swallowing; this 750 seems a little more reasonable, although I was going for 800MHz as my first thought for the NV30. Glide???.. who knows, lol, that's just silly if you ask me, but it would be easy enough to implement AND it would bring more of the last few 3dfx'ers into the NVIDIA domain. However, the name NvBlur?.. That seems a little lame, eh? hah. Although the majority of 3dfx users that I know are already planning on the NV30 because of the 3dfx tech to be in it.

    Anyway, this is actually old and regurgitated rumor mill stuff, but I thought you might find it interesting or at least entertaining. Hey, when you have nothing to lose and nothing to gain, why not make a post that has no known fact to it? lol

    Nvidia NV30:
    -AGP 4x and 8x
    -Copper Core GPU Processor, 0.13µm, default clock 450MHz
    -73 million transistors
    -8 pipes
    -128 MB DDR-SDRAM @ 375MHz x2 = 750MHz, and 256-bit - yes, 256-bit
    -370Mhz RAMDAC
    -DirectX 9.x compatible
    -Quad Vertex Shaders
    -Dual Pixel Shaders
    -Lightspeed Memory Architecture III
    -Quad Cache memory caching subsystem
    -Dual Z-buffers for greater compression without losses in the visibility subsystem
    -Advanced Hardware Anisotropic Filtering - A new 12nvx mode that delivers improved subpixel coverage, texture placement and quality.
    -12nvx mode should deliver 80% better subpixel coverage and placement than previous modes
    -Hardware bumpmapping
    -4 Textures/Pixel in a single clock cycle, loop back can also apply an additional 4 textures per rendering pass
    -A separate second copper processor, also running at 450MHz and 0.13µm - handles all T&L, now called TT&L (True Time and Lighting)
    -NVAutoShaper, a new precached and precacheable system
    -Programmers can also add shapes to be temporarily stored in precache to speed things up
    -Latest OpenGL
    -Backward glide game capable thanks to NvBlur (Yes a new version of Glide!)
    -NView Multi-display Technology

    Rumors running wild because there's nothing better to do…[/quote:a603df1869]

    well, that was it again… a load of nonsense of course; I think we'll only get to see the real specs in a few months.

  • I've got different specs here :lol:

    400MHz GPU - 8 pipeline engine - .13Micron
    512-bit architecture
    800~1000MHz DDR / QDR memory
    400MHz Ramdac
    LMA3 on 128MB~256MB Video Memory.
    DirectX9 - OpenGL 1.3

    500MHz GPU - 8 pipeline engine - .13Micron
    512-bit architecture
    1000~1200MHz DDR / QDR memory
    400MHz Ramdac
    LMA3 on 128MB~256MB Video Memory.
    TT&L Technology
    DirectX 9.1 - OpenGL 2.0

    I'll go through these GPUs in a bit of detail with everyone,

    -First, the NV30 is a brand new architecture. That means no more GeForce title, and no more internal guts which were repeated all the way through the GeForce architecture.
    -The GeForce 4 was 256-bit, while the NV30 is 512-bit.
    -The GeForce 4 was built on .15Micron technology while the NV30 is built on .13Micron Technology.
    -GeForce4 is limited to 4096x4096 texture sizes while the NV30 has an infinite maximum texture size meaning 16384x16384 and higher can now be possible.
    -NV30 also supports future resolutions of up to 2560x1920 with 64-bit colour (these are 100% rumors, so don't be let down if the final product doesn't support 64-bit colour). The whole rumor of 64-bit started when John Carmack, almost two years ago, opened his mouth about Doom3's 64-bit support through the help of next-generation Nvidia architecture.
    -GeForce 4 features AGP4X while NV30 has full AGP8X support.
    -NV30 supports memory bandwidth up to 16GB/s (1GHz).
    -PCI-X support (1GB/s), the advanced PCI interface that's an alternative to AGP8X (2GB/s).
    -This product is expected to be announced in August 2002 and is expected to hit store shelves by late September or early October.

    As the CEO announces this wonderful new revolutionary GPU advancement, consumers continue to purchase inferior products such as the GeForce4.

    Now, I'll go through the key new features that NV35 will bring;
    -NV35 will hit the 500MHz barrier on .13Micron technology. To do this, Nvidia is removing the T&L unit from the GPU and putting it external. This allows for more heat spreader and better heat transfer from the GPU.
    -By removing the T&L unit, Nvidia is building on a new technology called TT&L, which runs at 3/2 the GPU speed. (EX: if your GPU ran at 200MHz, the T&L unit would run at 300MHz.) In the NV35's case, the T&L unit will now run dedicated at 750MHz.
    -Expected memory bandwidth of the NV35 is between 20~24GB/s.
    -3DFX SLI technology will be implemented in higher models (A.K.A Workstation models)
    -This product is expected to be announced in January 2003, shipping by March.
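The rumoured numbers above can at least be checked for internal consistency. A minimal sketch (all figures assumed from the rumour, nothing official): the "16GB/s (1GHz)" claim implies a 128-bit bus, and the 3/2 TT&L ratio puts a 500MHz NV35's external T&L clock at 750MHz.

```python
# Arithmetic check of the rumoured NV30/NV35 figures (assumed, not official).

def implied_bus_width_bits(bandwidth_gb_s: float, effective_rate_mhz: float) -> int:
    """Bus width (in bits) implied by a peak-bandwidth claim."""
    return round(bandwidth_gb_s * 1e9 / (effective_rate_mhz * 1e6) * 8)

def ttl_clock_mhz(gpu_clock_mhz: float, ratio: float = 1.5) -> float:
    """Rumoured external TT&L unit clock: 3/2 of the GPU clock."""
    return gpu_clock_mhz * ratio

print(implied_bus_width_bits(16, 1000))  # 128: the 16GB/s claim fits a 128-bit bus
print(ttl_clock_mhz(500))                # 750.0: 3/2 of a 500MHz GPU clock
```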
  • More nonsense (though I wouldn't mind having 1200MHz RAM on my video card :grin: )
  • Nice, even more bullshit.. especially those NV35 specs are fun :grin: :grin:

    One says a 256-bit memory bus, the other 1GHz DDR.
    I find the 256-bit bus more realistic..

    Keep those nonsense specs coming; I'm enjoying them.

  • NV18 goes DirectX 8:

    [quote:1e73c39a12]GRAPHICS CHIP GIANT Nvidia is already working on its NV18 and will time its introduction to hit ATI with its RV250, which is being deliberately postponed for maximum impact, the INQUIRER has learned.
    Here's some background to this. We've asked Nvidia quite a few questions about Geforce 4MX support and future support for some important up-and-coming games.

    NV17, we suggested, should be a DirectX 8 part because quite a few of these games will require that kind of support for a good end user experience.

    One senior Nvidia executive told us that we can expect the nextgen MX products to be DirectX8 compatible.

    This seems quite a logical step to us and we're speculating that a new MX could have more than two pipelines, with perhaps four the likely option, as NV30 would appear to be shooting for eight.

    The MX is always a cut down version of Nvidia chips, and that makes it cheap and cheerful, but at the same time the MX family is the bread earner, not the extremely expensive Geforce 4 TI 4600 at the high end.

    DirectX 8 hardware is about to become a necessary component because of the long awaited games coming up, so we think Nvidia may well introduce this new MX part by the end of the year. It would also make sense to assume that this chip will be launched side by side with the NV30, that we expect to see being introduced in late summer or early autumn.

    As the SIS Xabre cards are already DirectX 8 compatible and almost ready for market we can clearly say that the time of affordable DirectX 8 hardware has come. But we will also see DirectX 9 products and hardware from Matrox in the shape of the Parhelia, the NV30, and RV250, we think. µ[/quote:1e73c39a12]

  • So nV has become a victim of its own hype for DirectX 8 hardware…..
    I hadn't expected the GF4MX to be replaced this soon. Unless the GF4MX is to become the absolute low-end.

Answer this question

This is an archived page. Replying is no longer possible.