GeForce 256

  • 03-12-1999 1:34pm
    #1
    Moderators, Science, Health & Environment Moderators Posts: 9,035 Mod ✭✭✭✭


    Read some reviews that were mentioned on Blues. Seriously good card. Think I'll live without it though, until I get my Y2K £10,000 loyalty bonus next April smile.gif Woohoo.

    M


Comments

  • Closed Accounts Posts: 124 ✭✭Creeper


    The GeForce will not give you better performance in today's games than, say, a Voodoo 3 3500 or a TNT2 Ultra.
    The GeForce has a Transformation and Lighting engine built in (similar to AMD's 3DNow! and Intel's new MMX in the PIII). Software has to be specially written to take advantage of this. DirectX 7 supports hardware T&L, so most new games should have some kind of support.
    So don't expect amazing performance at the moment (although it's nearly as fast as a TNT2). Also, make sure you get a DDR RAM version; the other one hits the memory bottleneck at a low fps.


  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    That, my friend, is the biggest heap of uninformed sh1te that has been posted on these boards since... well... urr, did anyone ever post a picture of that mound of Triceratops dung from Jurassic Park?

    The T&L engine in the GeForce (also appearing in the Savage2000) will not be utilised by games which already do their own T&L rather than depending on DirectX for it. This includes Quake3 and UT and the like, but a lot of other games do rely on DX for their T&L and thus will experience speedups - considerable ones - from the inclusion of the T&L engine, which DX hooks into automatically.

    However, even ignoring the T&L engine, the GeForce256 is a considerably faster card in terms of sheer poly-pushing and texture handling than anything on the market. I don't have exact figures to hand, but it is significantly faster than a TNT2 Ultra, and that's with the T&L disabled.
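    For a rough sense of the gap, here's a quick C sum using the commonly quoted fill-rate specs (GeForce 256: four pixel pipelines at 120 MHz; TNT2 Ultra: two at 150 MHz - figures assumed here for illustration, not taken from this thread):

    /* Back-of-the-envelope fill-rate comparison; the clock and pipeline
     * counts are assumptions added purely for illustration. */
    #include <stdio.h>

    int main(void)
    {
        /* Mpixels/s = core clock in MHz * number of pixel pipelines */
        double geforce256 = 120.0 * 4;   /* 4 pixel pipes at 120 MHz */
        double tnt2_ultra = 150.0 * 2;   /* 2 pixel pipes at 150 MHz */

        printf("GeForce 256 fill rate: %.0f Mpixels/s\n", geforce256);
        printf("TNT2 Ultra  fill rate: %.0f Mpixels/s\n", tnt2_ultra);
        return 0;
    }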

    When running through Creative's Universal GLIDE system, as I already stated, it gets better performance in Q3 or UT than a V3-3000 AGP. GLIDE has no T&L components (being a cut-down version of the OpenGL spec in essence) and therefore this has no effect on that benchmark.

    Engage your brain before posting next time, and don't make stoopid assumptions about things. I should also point out that you blatantly don't have the first clue what you're on about, given that you seem to think that 3DNow and the new MMX spec are similar to a hardware T&L engine...


  • Registered Users, Registered Users 2 Posts: 11,446 ✭✭✭✭amp


    Rob, you know far too much smile.gif

    [Insert cool quote here]
    Play GLminesweeper!
    http://www.iol.ie/~adamj/dl/mineswp.jpg



  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    lol, I wish! smile.gif

    imo, I know far too little, but what I do know I'm paid to know and therefore I know it well smile.gif


  • Registered Users, Registered Users 2 Posts: 2,010 ✭✭✭Dr_Teeth


    Hmm, looks like someone shat in Rob's porridge this morning. smile.gif

    Those vids rock btw, I'm about half-way through the first one atm.

    Teeth.


  • Registered Users, Registered Users 2 Posts: 1,134 ✭✭✭Chaos


    i see why peewee slags most of you off now.
    Goin to the scifi convention this week? tongue.gif

    www.gibworld.com/chaos
    Ps 0wns U



  • Closed Accounts Posts: 2,972 ✭✭✭SheroN


    There's Klingons on the starboard bow there lads... warp speed 7... engage.


  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    <grin> thought you'd like those vids Daire smile.gif

    And nobody shat in my porridge, I just didn't have any and had a wild row with my bank manager instead smile.gif


  • Closed Accounts Posts: 2 Flea


    Just pointing out a few inaccuracies in the above:

    - To make use of the T&L engine a game will need to use the OpenGL/Direct3D transformation pipeline (3 matrices - world, view & projection; see the sketch below)
    - All DX 7+ and any OpenGL games (including every release of Quake) that do the above will use the T&L engine

    Fact.
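
    For anyone wondering what "using the pipeline" means in practice, here's a minimal fixed-function OpenGL sketch in C (GLUT is used purely to get a window up; all values are invented for illustration). Vertices go in untransformed and the API's matrices do the work, which is exactly the path a hardware T&L driver can pick up. Note that OpenGL folds the world and view matrices into a single modelview matrix:

    #include <GL/gl.h>
    #include <GL/glu.h>
    #include <GL/glut.h>

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        /* Projection matrix: handled by the API/driver, not by the app. */
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(60.0, 4.0 / 3.0, 0.1, 100.0);

        /* Modelview matrix: "view" (camera) and "world" (object) combined. */
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        gluLookAt(0.0, 0.0, 5.0,   /* eye position */
                  0.0, 0.0, 0.0,   /* look-at point */
                  0.0, 1.0, 0.0);  /* up vector */
        glRotatef(30.0f, 0.0f, 1.0f, 0.0f);  /* per-object "world" rotation */

        /* Vertices are submitted in object space; the pipeline transforms them. */
        glBegin(GL_TRIANGLES);
            glVertex3f(-1.0f, -1.0f, 0.0f);
            glVertex3f( 1.0f, -1.0f, 0.0f);
            glVertex3f( 0.0f,  1.0f, 0.0f);
        glEnd();

        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutCreateWindow("T&L pipeline sketch");
        glEnable(GL_DEPTH_TEST);
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }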

    [This message has been edited by Flea (edited 03-12-1999).]


  • Registered Users, Registered Users 2 Posts: 2,894 ✭✭✭TinCool


    I know this has absolutely nothing to do with Quake, but it's this board which gets most of the replies. Are these new graphics cards any good? I've read the specs and they certainly sound a bit groovy. Would there be much of a performance improvement between a V3 2000 PCI and one of these bad boys? There'd bloody well want to be, because the price tag in Dixons (I know they're inflated over, say, the likes of Game) was about £230 if I remember correctly. Anyone got any opinions on this?


  • Registered Users, Registered Users 2 Posts: 488 ✭✭Sonic


    The 3D Prophet DDR-DV1 (GeForce 256, 32MB DDR memory) works out at £275 punts, but it roxors seriously (not been reviewed yet, but going on the comparison of DDR to SDRAM).

    Failing that,

    the WinFast GeForce 256 32MB SDRAM AGP works out at £200 punts - serious clocking potential on this baby: http://www6.tomshardware.com/graphic/99q4/991125/index.html


    [This message has been edited by Sonic (edited 03-12-1999).]


  • Closed Accounts Posts: 6,275 ✭✭✭Shinji


    Last figures I saw said that if you run the Creative Universal GLIDE drivers on a GeForce256, you get better performance than a V3-3000 AGP running GLIDE natively.

    So imagine the native DirectX/OpenGL performance...

    A decent machine with one of these babies outspecs an SGI O2 workstation on most tasks.

    Ja,
    Rob


  • Closed Accounts Posts: 496 ✭✭Bunny



    hahahah such ******... do you people even have a clue what you are talking about??? I sympathise that you really try hard to understand the most remedial, basic fork-and-spoon workings of these 'complimacated' components, but for the love of god, you don't actually have to tell us your little madey-up special explanations of how you 'think' this product works. I apologise profusely to any normal people who've been forced to read through Tomy's 'My First Video Card' spam here.


  • Registered Users, Registered Users 2 Posts: 3,280 ✭✭✭regi


    There was no gear at home then, Dave?


  • Closed Accounts Posts: 2 Flea


    From the AskHook column on VoodooExtreme:

    September 15, 1999 - (5:00am MST)

    Brian,

    In light of the recent announcements by Nvidia and S3, and the many articles floating around regarding hardware T&L, I noticed that a lot of people say that "any game which uses OpenGL will automatically benefit from hardware T&L, but games which use DirectX need to use DirectX's transform and lighting engine." In a reply you recently made you said "a lot of programmers still insist on writing their own transformation routines". So are you saying that this applies to OpenGL as well as DirectX, meaning some OpenGL games may use a developer's own transformation routines and hence receive no benefit from onboard T&L processors?

    Rees




    Rees,

    OpenGL provides a full featured 3D transformation, lighting, and rasterization pipeline. If this entire pipeline is taken advantage of then an application will, in fact, automatically make use of hardware T&L. However, it is possible for some applications to put OpenGL into a "rasterization-only" mode where the transformations and lighting are effectively disabled (by setting the modelview matrix to identity and the projection matrix to ortho). In this mode of operation the programmer sends screen space triangles (a la Glide) directly to OpenGL, in which case there will be no hardware accelerated transformation and lighting.

    So it really depends on how the programmer wrote to OpenGL. If they opted to take advantage of OpenGL's transformation and lighting capabilities then they will automatically be sped up on hardware T&L. If they bypassed this mechanism then hardware T&L will not offer any benefit.
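
    As a companion sketch to the above (C again; window and context setup omitted, the function name and coordinates invented purely for illustration), this is roughly what the "rasterization-only" path looks like - identity modelview, ortho projection, screen-space vertices - leaving nothing for a hardware T&L unit to accelerate:

    #include <GL/gl.h>

    void draw_screen_space_triangle(int screen_w, int screen_h)
    {
        /* Projection: a plain orthographic mapping of pixel coordinates. */
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0.0, (double)screen_w, (double)screen_h, 0.0, -1.0, 1.0);

        /* Modelview: identity, i.e. no world/view transform at all. */
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();

        glDisable(GL_LIGHTING);  /* lighting, if any, is done by the app itself */

        /* Vertices are already in screen space, Glide-style. */
        glBegin(GL_TRIANGLES);
            glVertex2f(100.0f, 100.0f);
            glVertex2f(300.0f, 120.0f);
            glVertex2f(200.0f, 400.0f);
        glEnd();
    }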

    August 11, 1999 - (6:30am MDT)

    Dear Brian,

    What's the deal with T&L that's done onboard, and how is it used by APIs? i.e. for a game to make use of onboard T&L extensions in OpenGL, does it need to be specifically written to use those OpenGL extensions? Or does OpenGL figure that out already, and the 3D card manufacturer just has to include it in their OpenGL driver?

    If you could fill us all in with a little more on onboard T&L info that would kick ass.

    Aaron

    Aaron,

    Hardware transformation and lighting does not need OpenGL extensions if an application was written to use OpenGL's native transformation and lighting capabilities. Good old GLQuake supports hardware T&L on, say, Intergraph workstations with hardware T&L. No special code needed to be written; just the regular API was supported.

    The same is somewhat true of D3D, although the engineers at Microsoft have had some "problems" getting a hardware T&L driver spec sorted out for the past few years.


  • Registered Users, Registered Users 2 Posts: 3,280 ✭✭✭regi


    niiiii


  • Registered Users, Registered Users 2 Posts: 166,026 ✭✭✭✭LegacyUser


    Oh my god! Things are kind of getting out of hand here. I mean, TinCool only asked for opinions on the PRICING of a card which obviously (being newer and all) is more powerful than a TNT2 or V3, and at that price it deffo should be better.
    So give the man yer opinions on THE PRICE vs PRODUCT and NOT on how the FU<KIN HARDWARE WORKS, in some vain attempt to look like ye know what it actually does bar run GLQuake!


    SAD SAD

    [This message has been edited by OmEnBoY (edited 05-12-1999).]


  • Registered Users, Registered Users 2 Posts: 4,484 ✭✭✭Gerry


    I can get the DDR version (that's the Creative GeForce Pro) for 200 punts if anyone's interested. I know I am.


  • Closed Accounts Posts: 392 ✭✭Skyclad


    Forgive me for not reading the reams of stuff that's gone before, but what exactly is the difference between DDR and SDRAM please? One-line answer much appreciated.


  • Registered Users, Registered Users 2 Posts: 3,537 ✭✭✭SickBoy


    SDRAM = Super Duper Random Access Memory
    DDRAM = Dynamic Duper Random Access Memory

    ok no,

    SDRAM = Static Dynamic Random Access Memory and DDR = Double Data RAM or something like that.

    Jimmy...


  • Registered Users, Registered Users 2 Posts: 20,099 ✭✭✭✭WhiteWashMan



    i presume....

    sdram = synchronous dynamic random access memory

    ddram = duel dynamic random access memory

    actually i made the duel bit up, but it sounds good smile.gif

    better than super duper anyway wink.gif


  • Registered Users, Registered Users 2 Posts: 3,537 ✭✭✭SickBoy


    I'm 99.99999999999999999999% sure that SDRAM means Static Dynamic Random Access Memory as I said above. Someone correct me please.

    Jimmy...


  • Closed Accounts Posts: 2,972 ✭✭✭SheroN


    you're correct jimmy.......i know...coz im the bestest


  • Registered Users, Registered Users 2 Posts: 3,537 ✭✭✭SickBoy


    Haha WWM smile.gif take it u b!t(h smile.gif
    Now an explanation of DDR:

    Q: What is DDR memory? How does it help increase graphics performance?
    A: DDR (Double Data Rate) memory is built on memory technology that doubles the rate of data fetching. Traditionally, each memory cycle is triggered by either the positive or the negative edge of a clock. DDR technology lets both the positive and negative edges of a clock initiate a memory cycle, making the memory data bandwidth approximately twice as high.
    By utilizing DDR memory in a graphics subsystem, the speed of buffering and texturing can be greatly increased, relieving the bottleneck between the graphics processor and its memory. In an advanced graphics subsystem such as the nVidia GeForce 256, massive graphics computation takes place on the on-board GPU (Graphics Processing Unit) and its memory buffer instead of on the PC's central processor, so relieving that bottleneck becomes particularly important for achieving higher performance.
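
    As a back-of-the-envelope illustration (C), using the commonly quoted memory clocks for the two GeForce 256 boards - 166 MHz SDR versus 150 MHz DDR, 128-bit bus in both cases; figures assumed here, not taken from this thread:

    #include <stdio.h>

    /* MHz * bytes per transfer * transfers per clock -> bytes/s -> GB/s */
    static double bandwidth_gbs(double clock_mhz, int bus_bits, int transfers_per_clock)
    {
        return clock_mhz * 1e6 * (bus_bits / 8.0) * transfers_per_clock / 1e9;
    }

    int main(void)
    {
        printf("SDR @ 166 MHz, 128-bit: %.1f GB/s\n", bandwidth_gbs(166.0, 128, 1));
        printf("DDR @ 150 MHz, 128-bit: %.1f GB/s\n", bandwidth_gbs(150.0, 128, 2));
        return 0;
    }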

    There you go monkeys smile.gif

    Jimmy...

    [This message has been edited by SickBoy (edited 06-12-1999).]


  • Closed Accounts Posts: 2,313 ✭✭✭Paladin


    You are 99.9999999999999999999% wrong about SDRAM there, SB :-)
    SDRAM = Synchronous Dynamic RAM.
    How could it be both static and dynamic????
    Static means it needn't be refreshed all the time, which makes it faster.
    Dynamic means it is refreshed all the time, because it uses internal capacitors which constantly discharge.
    SRAM is static RAM though,
    DRAM is dynamic RAM,
    and SSRAM is synchronous static RAM :-)
    Does that clear up the RAM issue? :-)


  • Registered Users, Registered Users 2 Posts: 3,537 ✭✭✭SickBoy


    I'm lodging an appeal sunglasses.gif

    Jimmy...


  • Registered Users, Registered Users 2 Posts: 20,099 ✭✭✭✭WhiteWashMan



    hahahahahaha

    now.. wheres me prize?


  • Registered Users, Registered Users 2 Posts: 488 ✭✭Sonic


    gerry im interested, i will have ur babies if u can get me one for that price


  • Registered Users, Registered Users 2 Posts: 3,537 ✭✭✭SickBoy


    Yes, me too. In my madness I forgot to mention that £200 has me hooked.

    Jimmy...


  • Registered Users, Registered Users 2 Posts: 4,484 ✭✭✭Gerry


    Ok well give me a day or two and I will have them. Email me at newfoundwealth@hotmail.com to sort out the details.


  • Registered Users, Registered Users 2 Posts: 2,021 ✭✭✭ChRoMe


    Round-up of all the GeForce 256 cards:
    http://www.ixbt-labs.com/video/geforce256-roundup.shtml

    I think these are the ones with the **** memory

    ChRoMe


  • Registered Users, Registered Users 2 Posts: 20,099 ✭✭✭✭WhiteWashMan



    i thought you 'retired'?

    bugger off

