
Using Your GPU For Something Else?

  • 02-04-2007 1:09am
    #1
    Registered Users, Registered Users 2 Posts: 15,046 ✭✭✭✭


    Hey Folks,

    I remember a while back that ATI/nVidia realised the untapped potential graphics card processors had. Powerful processors that were only really being used when a game was being played, lying idle most of the rest of the time.
    I remember hearing that people were starting to develop software/programs that the GPU could work on to take some of the load off the CPU. Has anyone any information on this? Has anything happened? I've googled it and the information returned has been patchy at best or way too technical for my simple mind to comprehend. I had a look on Wikipedia and it stated that both ATI and nVidia had realised the potential of GPU computing, but a search of ATI's website at least has returned very little.

    Does anyone have more information, interesting links to read, or know of any applications that have been released? It's an interesting idea and something that is sure to become commonplace in the not-so-distant future.


Comments

  • Registered Users, Registered Users 2 Posts: 3,969 ✭✭✭christophicus


    ATI are working on using a 3rd ATI graphics card as a physics processor, or at least they were before being taken over by AMD; I assume they still are. I think nVidia are as well.


  • Closed Accounts Posts: 7,145 ✭✭✭DonkeyStyle \o/


    Folding@Home have a GPU client (ATI only)... crunching numbers for medical science.
    From what I've read on their forums though, GPUs are great at only a certain kind of computing and aren't versatile enough to replace normal CPUs.


  • Registered Users, Registered Users 2 Posts: 15,046 ✭✭✭✭Kintarō Hattori


    christophicus wrote:
    ATI are working on using a 3rd ATI graphics card as a physics processor, or at least they were before being taken over by AMD; I assume they still are. I think nVidia are as well.

    Hmnnn I think I saw something like that alright, but that's not really the kind of computing I was thinking about. A 3rd ATI card just to crunch some in-game physics seems a bit excessive.

    DonkeyStyle \o/ wrote:
    Folding@Home have a GPU client (ATI only)... crunching numbers for medical science.
    From what I've read on their forums though, GPUs are great at only a certain kind of computing and aren't versatile enough to replace normal CPUs.

    This is definitely more along the lines of what I was thinking; it definitely seems more practical and useful. I wouldn't have expected them to replace traditional processors, but when you think about it GPUs have so much 'down time' it's shocking. They are in so many cases more expensive than the CPU, and they are increasing in power so much and so quickly.
    I had heard about the whole GPU processing thing about two years ago, I really thought it would have been at a more advanced stage than it seems to be.


  • Closed Accounts Posts: 4,757 ✭✭✭8T8


    To boil it down very simply, GPUs are highly specialized pieces of hardware dedicated to executing specific actions very, very fast, and are inherently parallel in nature, not at all like CPUs.

    This means harnessing the power of a GPU is no trivial task, as you have to bypass a lot of junk (e.g. you can't use APIs like Direct3D or OpenGL), which means getting close to the metal. On top of that, the parallel nature of the hardware makes it unsuitable for many tasks and suitable for some, so it won't be very good at certain types of data processing.
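
    A very rough sketch of that parallel model in plain Python (the names here are made up for illustration, not any real GPU API): on a GPU you write one small "kernel" that computes a single output element, and the hardware runs thousands of copies of it at once. Each invocation has to be independent of all the others, which is exactly why some tasks fit and others don't.

```python
def saxpy_kernel(i, a, x, y):
    # One "thread": computes a single output element with no
    # loops and no dependence on any other element.
    return a * x[i] + y[i]

def launch(kernel, n, *args):
    # Stand-in for a GPU launch: on real hardware all n
    # invocations would run in parallel; here we just map.
    return [kernel(i, *args) for i in range(n)]

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
print(launch(saxpy_kernel, 3, 2.0, x, y))  # [12.0, 24.0, 36.0]
```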

    For example, with H.264 video decoding only parts of the process are actually done on the GPU, as other parts of the decoding process are not suited to it, so the CPU is still in fact doing those chunks.

    To combat the API problem, both ATI & NVIDIA have their own systems, called CTM for ATI and CUDA for NVIDIA, but these are both tied to their respective architectures, which means you can't just port one program over to the other (not without a lot of work); you have a vendor tie-in.

    Also, GP-GPU (General-Purpose computation on Graphics Processing Units), the term given to this field, is still relatively new, and only the latest generation of hardware is starting to actually make things easier for developers. Future hardware post GeForce 8 and Radeon X2K will get a lot better at GP-GPU tasks, and I predict Microsoft will eventually step in with its own standardized API.

    The Folding@Home GPU client is one of the first high-profile programs, and it's tailored for ATI's products because only the X1K series has proper hardware to do it effectively; the GeForce 6/7 doesn't have that, and there are bugs NVIDIA needs to fix to make it work with the client, which also apply to the GeForce 8, though that series should be very good at it.

    Apple also make use of the GPU in their pro photo software Aperture for processing images.

    The physics thing was a bit overblown. Both ATI & NV wanted to stop PhysX gaining any ground, hence the marketing offensive. It does actually work, but at the moment the only software to show it off are tech demos; there are no products that actually make use of GPU physics.

    The biggest demand for GP-GPU will be in the industrial/medical/engineering sector, as that's where it will be of use; not too much use in the home sector as far as I can see.


  • Registered Users, Registered Users 2 Posts: 15,046 ✭✭✭✭Kintarō Hattori


    Thanks 8T8, some interesting stuff in there. I'll look into folding @ home for the moment I think.


  • Closed Accounts Posts: 7,145 ✭✭✭DonkeyStyle \o/


    eo980 wrote:
    I'll look into folding @ home for the moment I think.
    Don't forget about the boards.ie folding team!
    http://www.boards.ie/vbulletin/forumdisplay.php?f=851


  • Closed Accounts Posts: 7,563 ✭✭✭leeroybrown


    The GP-GPU direction is one that is making inroads in High Performance Computing (HPC). The potential for GPUs to deliver an order of magnitude more FLOPS per watt for specific HPC applications is driving development in this direction, with NVIDIA's CUDA, ATI/AMD's CTM, and also third-party proprietary stream computing offerings from companies like PeakStream and RapidMind. Essentially the idea is to use the GPU as an application accelerator and offload onto it whatever runs well on its very specific cores.
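
    That accelerator/offload split can be sketched in a few lines of Python (purely illustrative, not any real API): the independent per-element work is the part you would ship to the accelerator, while anything with a sequential dependency stays on the host CPU.

```python
def offloadable(data):
    # Each element is computed independently of the others:
    # the kind of work that maps well onto an accelerator.
    return [v * v for v in data]

def host_side(values):
    # A running sum carries a sequential dependency, so it
    # stays on the general-purpose CPU.
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

print(host_side(offloadable([1, 2, 3, 4])))  # [1, 5, 14, 30]
```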

    For general PC user purposes I think this question is probably being asked 12-24 months too early, as with some notable exceptions this technology is still emergent.

    One good example of how this type of acceleration is being harnessed right now is the PS3 where the Cell BE processor uses one general purpose core and eight Synergistic Processor Elements (SPEs) that are used as accelerators.

    The best resource for more information is probably www.gpgpu.org.


  • Registered Users, Registered Users 2 Posts: 5,112 ✭✭✭Blowfish


    leeroybrown wrote:
    One good example of how this type of acceleration is being harnessed right now is the PS3 where the Cell BE processor uses one general purpose core and eight Synergistic Processor Elements (SPEs) that are used as accelerators.
    It's actually only 7 SPEs used on the PS3.

    As has been said, both the Cell and GPUs are fantastic for HPC. Have a look here, it shows you the impact both of them have had on Folding@Home. The Cell is producing far more per processor than the Intel/AMD chips, but even so the GPUs simply destroy them.


  • Closed Accounts Posts: 7,563 ✭✭✭leeroybrown


    Blowfish wrote:
    It's actually only 7 SPEs used on the PS3.
    I had forgotten that the PS3 had one disabled. On top of that, on the PS3 a software developer only has access to six, as the seventh is retained by the console for system purposes.


  • Registered Users, Registered Users 2 Posts: 3,969 ✭✭✭christophicus


    What? Sony disabled one? So the Cells in the PS3s are in fact rejects that weren't made properly?


  • Registered Users, Registered Users 2 Posts: 5,112 ✭✭✭Blowfish


    christophicus wrote:
    What? Sony disabled one? So the Cells in the PS3s are in fact rejects that weren't made properly?
    It's not that they were rejected or not made properly; Sony just decided to keep one spare in case any of the others gave up.


  • Registered Users, Registered Users 2 Posts: 3,969 ✭✭✭christophicus


    Oh right, so it is fully working??


  • Registered Users, Registered Users 2 Posts: 5,112 ✭✭✭Blowfish


    christophicus wrote:
    Oh right, so it is fully working??
    Yep, I haven't seen any details on how/if/when it kicks in though.


  • Closed Accounts Posts: 17,163 ✭✭✭✭Boston


    8T8 wrote:
    To boil it down very simply, GPUs are highly specialized pieces of hardware dedicated to executing specific actions very, very fast, and are inherently parallel in nature, not at all like CPUs.

    CPUs are also inherently parallel. There's nothing about the architecture of a GPU that makes it more suitable for parallel operations than a CPU. However, because a GPU's design is so specific, you get a lot of functionality implemented in hardware on the chip, and with anything implemented in that fashion you can take greater advantage of the parallelism in your algorithms; with a general-purpose processor, functionality must be achieved at a higher level, and the higher you go, the less parallelism you can exploit.
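
    One way to picture that point (a toy Python sketch, not real GPU code): the parallelism lives in the algorithm. Summing a list one element at a time gives parallel hardware nothing to do simultaneously, whereas a tree-shaped sum computes the same result through independent pairwise adds that many hardware units could execute at once.

```python
def tree_sum(xs):
    # Each pass halves the list; all the pairwise adds within
    # a pass are independent of each other, so parallel
    # hardware could do every add in a pass simultaneously.
    while len(xs) > 1:
        if len(xs) % 2:
            xs = xs + [0]        # pad odd-length lists
        xs = [xs[i] + xs[i + 1] for i in range(0, len(xs), 2)]
    return xs[0]

print(tree_sum([3, 1, 4, 1, 5, 9, 2, 6]))  # 31
```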

