
Do you think computer technology will level out?

  • 28-04-2006 11:25pm
    #1
    Moderators, Society & Culture Moderators Posts: 10,247 Mod ✭✭✭✭


    Ever since personal computers came into existence, the technology involved has moved at lightning speed; you can be sure that any machine you purchase is not really top of the range (even if it is good value). Even people who custom-build machines find that there is always something a little bit better out there for their build, and unless you're stinking rich, for every component you upgrade another has become outdated (while not obsolete, of course).

    Anyway, does anyone else think that we'll soon see a day when computer technology will level out? Most other technologies have. Take television, for example, where each development since colour has been pretty small at the best of times; while HD is of course a huge leap, every other progression has been slow (stuff like teletext, remote control etc. had nothing to do with the actual quality of the TV display). Is it likely that one day manufacturers will realise they just can't squeeze any more gigs into their hard drives, can't drag any more power from their processors etc., and that any further progress would involve the creation of bigger and bigger machines? On the other hand, are we ever likely to reach a point where customers, personal and professional, no longer need more storage space, better graphics cards etc.? Some believe that the gaming industry, for example, is on a collision course with itself and that a brick wall in realism is sure to be hit at some point (or that any game that is more vast or realistic would be far too expensive to create).

    Obviously technology of any description will always be progressing, but does anyone think that it will slow down in the computer world, where each year only brings one or two real improvements in machinery rather than the hundreds that currently occur?


Comments

  • Closed Accounts Posts: 3,494 ✭✭✭ronbyrne2005


    Obviously processors and hard drives will be limited by the physical properties of the materials used until quantum computers are a reality. People seem to want more the more they get, though! I have a basic Dell PC now with only 512MB, 80GB and 2.6GHz, but I never could have envisaged needing it 5 years ago; now I think I need much more, but this one does the job for now.

    There's enough in most machines to satisfy most consumers' day-to-day needs, but there's always gonna be requirements for more as new technologies/gadgets arise.


  • Registered Users, Registered Users 2 Posts: 5,112 ✭✭✭Blowfish


    I'd say that it is more likely that they will come up with more and more sophisticated ways of getting around the physical limitations, so that it won't be limited in that way. At a rough guess, though, I would say that technology is more likely to stagnate through lack of need for some parts. Hard drives will simply end up large enough to hold everything they're needed to hold. In reality, for most 'normal' users they are already far larger than they actually need to be. The only home users who would use that kind of space would generally be gamers, warez monkeys, or those with some specialised interest. RAM will probably go the same way, and hopefully flash memory will end up large enough and quick enough that hard drives can be done away with.

    Processors will keep improving for a lot longer, mainly because they are still crap at doing certain things that the human brain does with no effort whatsoever (e.g. image recognition).

    There are still some areas which probably have yet to be invented. Graphics cards are a relatively new concept, and physics cards have only just begun.


  • Closed Accounts Posts: 38 OptimusMime


    It already has levelled out in some respects. If you look at Intel's roadmap from a few years ago, they expected to be at well over 10GHz clock speed by now, but they hit a major brick wall with manufacturing processes that were not able to produce their targets; Moore's law says transistor counts should roughly double every 18 months to two years, which people assumed would keep pushing clock speeds up as well.

    But if you look at the processors you can buy today and for the last few years, there has been an effective speed limit of 3GHz (for the Intel chips) for a long time. Only in the last year or so are there chips faster than 3GHz, and not by much; it seems to have levelled out at around 3.6GHz for a while now too. Solving this by creating new manufacturing processes has cost Intel a lot of both money and, more importantly, time, in the form of market share; they are getting hammered in the US by AMD.

    A lot of these kinds of problems are down to the process of manufacture, as I said, but there is definitely a culture of lazy profit-making too that is slowing down innovation. It's just too hard for companies with shareholders to take the financial gamble of really pushing technology forward when it is so easy to just knock out the same old stuff and still reap profits.

    For example, as Blowfish says, HDs made of solid-state memory would be brilliant and very useful to everybody (unlike a lot of fancy new stuff that would only really be of use to the hardcore users), but the technology would be quite expensive to start off with, so no-one has bothered to try and make it standard yet. So the manufacturers just knock out the old 3.5 inch HDs which, at this stage, are cheap as chips to make and represent a relatively small financial risk.

    The above idea can be seen in all forms of (not only computer!) manufacturing. For example, where are the 32GB SD cards that were talked about a couple of years ago? It's far less financial risk to just knock out 256MB to 2GB cards and gouge the consumer by charging too much for these small capacities than to actually try to create bigger-capacity memory cards.

    To be honest, I think tech will always be pushed as everyone wants bigger, better, shinier stuff. But here's where I think the mass market should go:

    Slower, better-produced computers!

    Yes, slower. I'm no Luddite, I just think slower, smaller and, more importantly, cheaper computers would be just as much use to most people as the ultra-fast stuff that Dell etc. would sell (even Dell's slowest computer is overkill for most people).

    Think about it: who uses these 3GHz chips to their full? Apart from a minority of users, which would likely include anyone who is reading this in the nerdy section of a site for nerds (come on, admit it, embrace your inner nerd), not too many people do.

    "Most" people do not do high-end CAD work, do not render HD movies regularly; heck, most people do not play games apart from the odd bit of Minesweeper or Solitaire.

    "Most" people use computers to surf the web, check emails, do a bit with MP3s and do a bit with digital pictures. All you need for this kind of stuff is 1GHz and you'll be fine.

    A 1GHz machine with a fast FSB, 1GB of RAM and a solid-state HD would be perfect for most people, especially if the processor ran fanless (with modern heatsink tech that should be no problem).

    Maybe I should start my own computer company, sure didn't Mikey Dell himself start in his garden shed ;)


  • Closed Accounts Posts: 453 ✭✭nuttz


    It's not going to level out soon anyway, you just have to look at headlines like this to realise that:
    Intel to offer new architecture every two years

    Applications are as hungry as the hardware will allow; Java apps running in a virtual machine and using huge amounts of RAM are an example of this. Java is a success because advances in the hardware allowed it. If we were still struggling with 64MB of RAM, Java wouldn't be where it is today.

    My point being: hardware will improve, but at the moment it looks like there are plenty of apps out there looking to use all the resources they can on your machine.
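
    As a rough illustration of that appetite, here is a minimal Java sketch (not from the thread; the class name and the -Xmx figure below are just examples) that asks the running virtual machine how much heap it has reserved and how far it is allowed to grow:

        // HeapReport.java - minimal sketch: print the JVM's current and maximum heap sizes.
        public class HeapReport {
            public static void main(String[] args) {
                Runtime rt = Runtime.getRuntime();
                long mb = 1024 * 1024;
                // totalMemory(): heap currently reserved from the operating system.
                // maxMemory():   the ceiling the JVM may grow the heap to (set with -Xmx).
                // freeMemory():  unused portion of the currently reserved heap.
                System.out.println("Reserved heap: " + rt.totalMemory() / mb + " MB");
                System.out.println("Max heap:      " + rt.maxMemory() / mb + " MB");
                System.out.println("Free heap:     " + rt.freeMemory() / mb + " MB");
            }
        }

    Run it with, say, java -Xmx512m HeapReport to see how generously a modern VM sizes itself compared with the 64MB machines mentioned above.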


  • Closed Accounts Posts: 1,444 ✭✭✭Cantab.


    I'd say OptimusMime is probably right about the slower, smaller and cheaper computers coming on stream. Aren't google at advanced stages of developing such a machine that has a solid state hard disk, robust 3d graphics and full internet connectivity? I'd say this will be quite a revolution to the PC market.

    Nano RAM looks like it will be a reality one day too, replacing all those nasty high-precision moving parts in today's hard disk drives.

    The move to multi-core processing has well and truly begun. I would say that speeds have maxed out for silicon at around 4GHz; mind you, there will always be engineers who will claim higher under laboratory conditions. The problem with tiny transistors is that they leak current, hence dissipate huge power and overheat, making them useless for large-scale integration. We either go multi-core or else we need a whole new technology (optical processing etc.). I'd say we'll have arrays of microprocessors integrated on a single silicon wafer in the future.

    Of course if there is no software to utilise all this processing power, there's little point to the advancement.

    I believe ultra-low-power wireless sensor networks are going to be huge. Energy-scavenging technology has advanced tremendously, and we can now power ultra-low-power devices from the sun, vibrations, temperature gradients, long-life batteries etc. Bluetooth technology is brilliant and is optimised for low power consumption, and already we are reaping its benefits.


  • Closed Accounts Posts: 2,279 ✭✭✭DemonOfTheFall


    OptimusMime wrote:
    ...there has been an effective speed limit of 3GHz (for the Intel chips) for a long time. Only in the last year or so are there chips faster than 3GHz, and not by much; it seems to have levelled out at around 3.6GHz for a while now too.

    That analysis isn't really fair, since you're only looking at clock speed. The actual performance of processors has still increased a decent bit over the last few years, even though clock speed has been steady.


  • Registered Users, Registered Users 2 Posts: 10,245 ✭✭✭✭Fanny Cradock


    OptimusMime wrote:
    [OptimusMime's full post, quoted above]

    Well written! Though I'm no nerd.
    OK, off to watch Star Trek now :D !


  • Moderators, Education Moderators Posts: 2,432 Mod ✭✭✭✭Peteee


    OptimusMime wrote:
    for example, where are the 32GB SD cards that were talked about a couple of years ago?

    It's not an SD card but

    http://www.engadget.com/2006/04/07/kangurus-64gb-flash-drive-max-only-2-800/


  • Closed Accounts Posts: 850 ✭✭✭DOLEMAN


    As programmers continue to get more lazy and less talented, apps will continue to grow, and so will the hardware needed to power them.

    1980s - 64KB of RAM. Assembler.
    2006 - 1,048,576KB (1GB) of RAM. Java / C#.
    What's next?


  • Closed Accounts Posts: 17,208 ✭✭✭✭aidan_walsh


    Lazy and less talented? I suppose you think that applications from the 80s are on a par with applications from today, functionality-wise as well?


  • Registered Users, Registered Users 2 Posts: 7,722 ✭✭✭maidhc


    Apart from a few games, I'm not aware of any application that puts any pressure on an entry-level Celeron with 512MB of RAM.

    I remember when a 12-month-old computer was utterly obsolete. Now a 5-year-old computer is adequate for most tasks.

    I think things have leveled out a lot.


  • Closed Accounts Posts: 2,808 ✭✭✭Dooom


    DOLEMAN wrote:
    As programmers continue to get more lazy and less talented

    I'm speechless.


  • Registered Users, Registered Users 2 Posts: 17,165 ✭✭✭✭astrofool


    The move is definitely towards languages like C# and Java; Intel have a division working on doing the compilation of these languages in hardware, which could have a good effect.

    These languages remove some of the big problems that programmers can create when writing a program, but at the expense of performance.

    Also, things have levelled out really because there hasn't been a new Windows for five years. Given that the top PCs at the time of XP's arrival ran it perfectly then, the trickle effect has meant that all PCs nowadays can do the same. Vista will push the requirements up again, because that's what new OSes do (providing they're using that extra power effectively).

    It's also led to a stagnation in the RAM market imo, as when XP was released 512MB was the sweet spot and was affordable, yet even now Dell still sell machines with 256MB, with 512MB being the standard, and only now has the market moved towards 2GB.


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    DOLEMAN wrote:
    As programmers continue to get more lazy and less talented
    Whilst C# is slower than assembly and C, it's much faster to develop in.

    In C# I could write an app that connects to the internet, downloads an XML file (with a progress bar) from a given URL, parses it and displays it in an editable grid onscreen in under 15 minutes. That'd include 5 minutes of testing and writing error handling.

    The same thing in C would take quite a bit longer to write. The same thing in assembly would take forever :P So, in the time a talented C programmer could whip up the basic app, I could write it in C# with extra functionality and fewer bugs.
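
    For a sense of scale, here is a minimal sketch along the same lines (in Java rather than C#, and not the poster's actual code; the URL is a placeholder, and the GUI grid and progress bar are left out): it downloads an XML document over HTTP and prints the name and text of each element under the root.

        // FetchXml.java - minimal sketch: download an XML document and list its top-level elements.
        import java.io.InputStream;
        import java.net.URL;
        import javax.xml.parsers.DocumentBuilder;
        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.w3c.dom.Element;
        import org.w3c.dom.Node;
        import org.w3c.dom.NodeList;

        public class FetchXml {
            public static void main(String[] args) throws Exception {
                // Placeholder URL - substitute any XML feed you actually want to read.
                String url = args.length > 0 ? args[0] : "http://example.com/feed.xml";

                try (InputStream in = new URL(url).openStream()) {
                    DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
                    Document doc = builder.parse(in);

                    // Walk the children of the root element and print their names and text.
                    NodeList children = doc.getDocumentElement().getChildNodes();
                    for (int i = 0; i < children.getLength(); i++) {
                        Node n = children.item(i);
                        if (n instanceof Element) {
                            System.out.println(n.getNodeName() + ": " + n.getTextContent().trim());
                        }
                    }
                }
            }
        }

    Error handling, the progress bar and the editable grid are exactly the parts that would eat into the quoted 15 minutes; the managed-language point is that the download and parse themselves are only a few lines.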


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    I think that the emphasis will shift from the current focus on storage space and processing power, at least for consumer electronics. Obviously business will always be eager to get the maximum efficiency possible per unit of space / power. For ordinary consumers, I think there are several major advances which might just come out in our lifetime that will make significant differences. The most obvious one is the availability of ubiquitous wireless networking anywhere, anytime. We're already seeing city-wide networks going into operation, so it's reasonable to assume that work-arounds for less-populated areas will be discovered over time.

    Secondly, there is wireless power. This has been discussed in the industry for some time, but research is at a very, very early stage. The benefits are pretty obvious, but I reckon we could all be long dead by the time it arrives.

    Finally, there is holographic display. Being able to create a high-definition, high-resolution visual representation without wiring or bulky hardware would be a major leap ahead. All of a sudden your iPod video isn't limited to a poxy little screen. What's more, if you could engineer touchscreen-like feedback into the system, it would be incredible. I've seen prototypes that work with jets of water, but nothing so far that even comes close to Jedi-standard technology (and that always had a dodgy look about it). Again, this is a long way off. With luck, you might look on it one day the same way your grandparents look at mobile phones.


  • Registered Users, Registered Users 2 Posts: 17,371 ✭✭✭✭Zillah


    Normal electronic computers will level out eventually. Microchips can only get so small.

    However, there will be next generation computers based on entirely different principles, any of which have the potential to be thousands of times faster than current technology.

    For example, quantum computers do calculations on a small handful of atoms. A single quantum computing core could easily provide for the entire planet's current computing needs. This technology is far from ready yet.

    There's also molecular computing, which uses chemical logic gates instead of electrical ones; while not as insanely powerful as quantum, it's still leagues and leagues ahead of current tech.


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    I think the actual performance of CPUs will see a lull unless, as was said, they crack quantum computers and find a way of applying them viably. But I think the actual versatility of them will increase. Their size will decrease; you'll get laptops with the performance of present computers, and they'll be cheap, less than €600 for top of the line and €100 for the basic ones. Kids will bring them to school instead of notepads, with all their reading material loaded.

    CPU speeds are fast enough right now for most computing needs (I still don't think people fathom how fast a gigahertz actually is; how many oscillations is that per second?). Instead, companies will concentrate on removing any other bottlenecks in a computer system, such as the HDD; moving parts in a computer are simply ridiculous nowadays. The internet will be redesigned so that its speed can comparatively match the speed of moving files within your own system (web 2.0). Everyone will have a computer of some sort on their bodies at all times, and will always be connected to the internet. Computers will be integrated into everything, allowing complete control, such as defrosting the freezer, putting out the cat food, etc.

    and then Skynet will go online, we all know what happens next.


  • Moderators, Education Moderators Posts: 2,432 Mod ✭✭✭✭Peteee


    L31mr0d wrote:
    CPU speeds are fast enough right now for most computing needs (I still don't think people fathom how fast a gigahertz actually is; how many oscillations is that per second?)

    A billion
    L31mr0d wrote:
    Everyone will have a computer of some sort on their bodies at all times

    Called a mobile phone ;)

    But this is all moot, 640k should be enough for anyone!
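
    To make the quoted arithmetic concrete, a tiny illustrative Java sketch (the 3GHz figure is just the example clock rate discussed earlier in the thread):

        // ClockMath.java - tiny sketch: what a gigahertz clock rate actually means.
        public class ClockMath {
            public static void main(String[] args) {
                double ghz = 3.0;                    // example clock rate from the thread
                double cyclesPerSecond = ghz * 1e9;  // 1 GHz = one billion cycles per second
                double nsPerCycle = 1e9 / cyclesPerSecond;
                System.out.printf("%.1f GHz = %.0f cycles/s, about %.3f ns per cycle%n",
                        ghz, cyclesPerSecond, nsPerCycle);
            }
        }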


  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    I said this somewhere else - There will be a day where the idea of the "PC" will be effectively dead. The major shift will push towards terminal-style computing. That is, the bulk of the work is done by a server, somewhere. All your machine has to do is render the images and send the clicks.

    Even in a household environment, a single server machine can/will be the "PC". To plug in, you log on from your bedroom terminal (which is a small screen with a hefty GFX card, a small processor and some limited storage space) and work away. At the same time, your son is downstairs playing Land of Warcraft on the study terminal. One machine, multiple users simultaneously (and even multiple OSes).

    I don't ever see it "levelling out". It's tough to compare the development of any technology to that of computing, because the need simply hasn't developed. But computing has effectively dragged a lot of the established stuff along with it. Back in the 1980s, a TV with a remote and more than 10 tuneable programs was a bit posh. Now a remote and 100 programs is the bare minimum we accept.
    Take any technology which was invented pre-1970s, and see the explosion in evolution from the late 1980s on. Cars are a great example. Functionally they've been the same for fifty or sixty years. But if I were to buy a car from 1990 tomorrow, I'd feel like I was driving around in a dinosaur. If I bought a car tomorrow that didn't have a light that went on when I turned off the engine, or electronically adjustable headlights, or ABS, or power steering, I'd feel ripped off.

    It's all about what we think we need. If the next technology can successfully push itself into the breach, then we'll all buy our upgrades to handle it. How many people knew what an mp3 was in 1997? How many people today would be shocked if their computer couldn't play them?
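
    As a toy illustration of that terminal-style split (nothing from the thread; the port number and the "work" are invented for the example), here is a minimal Java sketch where the server does the computation and the client only sends a request and displays the reply:

        // ThinClientDemo.java - toy sketch of terminal-style computing:
        // the server does the work, the client only sends input and shows the result.
        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.io.PrintWriter;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class ThinClientDemo {
            static final int PORT = 5050; // arbitrary example port

            // Run with "server" to start the back end, or "client <text>" to act as the terminal.
            public static void main(String[] args) throws Exception {
                if (args.length > 0 && args[0].equals("server")) {
                    try (ServerSocket listener = new ServerSocket(PORT)) {
                        while (true) {
                            try (Socket s = listener.accept();
                                 BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
                                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                                String request = in.readLine();
                                // The "heavy" work happens here, on the server.
                                out.println(request == null ? "" : request.toUpperCase());
                            }
                        }
                    }
                } else {
                    String text = args.length > 1 ? args[1] : "hello from the terminal";
                    try (Socket s = new Socket("localhost", PORT);
                         PrintWriter out = new PrintWriter(s.getOutputStream(), true);
                         BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
                        out.println(text);                  // send the "clicks"
                        System.out.println(in.readLine());  // render whatever comes back
                    }
                }
            }
        }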


  • Registered Users, Registered Users 2 Posts: 17,371 ✭✭✭✭Zillah


    Why would each terminal get a hefty GPU? Surely a central GPU would work if a central CPU et al is sufficient.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    I think at the moment it's reached a stage where if you buy a top-of-the-range PC today, it'll last you for years. That definitely wasn't the case when I got my first 166MHz PC that cost over a grand in pounds. My current computer is well over 14 months old at this stage and is still running fine, even for current games... 3.4GHz and 1GB of DDR2 RAM.


  • Closed Accounts Posts: 7,563 ✭✭✭leeroybrown


    Like Seamus I don't ever see it leveling out. We eased off in terms of raw CPU power in the last three years because Intel (and to a lesser degree AMD) hit a technological impasse that had not been properly predicted. Interestingly, the MHz bottleneck was offset greatly by improvements in caching, memory bandwidth and memory latency resulting in much increased performance.

    While some say that Moore's Law is no longer true, in many respects it still holds to its original intended meaning. The clock speed of microprocessors may have hit a process ceiling, but the number of transistors (components) on die is holding true to his prediction. Improvements in CPU design have led to larger cache memory (L1, L2 and L3), more efficient processing and multi-core CPUs. The AMD64 architecture has moved slightly towards a micro-controller-like architecture by choosing to place the memory controller on the CPU die. This is a major departure from the old Intel 'faster is better' model, but when you think about it, it delivers even more of an improvement.

    In terms of the density of the parts we have moved from 130nm to 90nm, and soon there will be 65nm production taking place in Ireland. There is already laboratory fab-ing taking place well below that. Companies like Intel and IBM are actively looking at how to deal with the heat bottlenecks within the CPU. A well-designed and physically laid-out ALU can make a world of difference to scalability. IBM's Power6 has a 6GHz roadmap on multiple cores, and while it will mostly be running HPC gear and not home PCs, it still proves that there is scalability left given the right circumstances.

    You could say that the only people really making use of their fast CPUs are gamers, renderers, encoders, etc., but I'd disagree. Those are dedicated-purpose tasks that require great performance for one thing, but most of the time I want my PC to do a lot of relatively lightweight things simultaneously without any loss of performance. It's not about putting pressure on the PC; it's about it performing well for as much time as possible.

    If you compare modern applications to those of five or ten years ago, you will see that you are looking at much more heavyweight products that do a lot more and provide (for the most part) a better user experience. As pointed out above, this is added to by the modern focus on fast development, automated error handling, completely automated memory management, etc., but a lot of the improvements in functionality would never have come about without these. The web browser of today is a lot more complex than that of five years ago and requires a lot more processing power to run well.

    Ultimately the drive forward for increased performance will come from the same people: gamers and those who use PCs as workstations. Game developers will continue to push for more and more realism and immersion, and are more likely to be held back by game development tools/methodologies (e.g. how to properly address the problem of multi-core processing) than by hardware. I work in High Performance Computing and I don't ever see a day when I'll say that we have enough compute power. Areas like these will drive the leading edge, which will as usual result in the market's consumer product being something that is very powerful.

    The differences of the future are already here to be seen. Both AMD and Intel are moving towards 'platform architectures' as was first seen with 'Centrino'. There are already loads of dual core systems out there and quad core will be here soon. Power consumption has suddenly become important - especially in my area of work.

    Ultimately there is also the traditional loop of hardware and software developers, where each tries to move onwards, in turn pushing the other. It's a very important economic model for them.


  • Moderators, Education Moderators Posts: 2,432 Mod ✭✭✭✭Peteee


    seamus wrote:
    I said this somewhere else - There will be a day where the idea of the "PC" will be effectively dead. The major shift will push towards terminal-style computing.

    Like they had in the days of mainframe time-sharing computers with dumb terminals?

    There are definite cycles in computing, and in each cycle there are always people who predict a return to the other way of doing things.

    I don't think dumb terminals will make a comeback, simply because a somewhat powerful computer isn't all that much more expensive to make than a dumb terminal.


  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    Zillah wrote:
    Why would each terminal get a hefty GPU? Surely a central GPU would work if a central CPU et al is sufficient.
    Well, I'm actually thinking that the GPU will take up the bulk of the machine, and will have the most work to do. While the server itself would be more than able to handle the processing, the sheer volume of data and graphics processing necessary to render modern games would make pushing that data over the network infeasible.

    I do think it's quite a ways away. My justification is in terms of the necessity of computers. It will become increasingly the case that everyone feels like they need their own computer - hell, mobile phones are quickly becoming small handheld machines. Thus, equipping an entire home with a full computer for each person just isn't feasible. I don't know how it would actually work. At best, you would have a docking station at strategic points for those who use them. Then your handheld device acts as your "dumb" terminal - you slot it into the docking station, it sets itself up on the network, and you work away. While you're away from home, the handheld still has the capacity to perform the simple stuff - surf the web, read mails, compose simple word-processing documents, etc.


  • Registered Users, Registered Users 2 Posts: 17,371 ✭✭✭✭Zillah


    seamus wrote:
    hell, mobile phones are quickly becoming small handheld machines.

    It's a good point. The line is beginning to blur; I wouldn't be surprised if it vanished altogether. Mobiles used to make calls. Walkmen had music and maybe radio. Then there were MP3 players, then iPods; now mobiles are incorporating all of it as well as web and email. Laptop, PC and mobile will eventually be near-identical devices, or likely the PC will vanish altogether.


  • Registered Users, Registered Users 2 Posts: 1,275 ✭✭✭bpmurray


    I think the whole thing will continue along the current lines, although new flavours of machines will come along. We have the monster machines still, with their super-computing calculations and visualizations, although with the new cell chips they're getting physically smaller; the pressure on the power of the server is constantly increasing with more demands on functionality; the desktop continues and its requirements will escalate with MS Vista; the personal device is increasing in power, with phone, PDA and games machine in one.

    Then there are the new "sensor dust" kinds of machines which establish ad-hoc networks to send data to wherever they're told, and these will become embedded in clothing (cool - they'll be able to tell you when it's raining!), and used to monitor environmental conditions in water, air, etc.

    All of these demand more and more sophistication from the programmers. While the old basic functionality is being created in libraries and similar, the real meat of the new programs will require smarter and better educated programmers.

    One main thing to note is that the world of the future will be awash with bandwidth: current gigabit networks will be scoffed at in 10 years. Even remote places like Lough Derg will have enormous connectivity. However, I'm also willing to bet that performance won't improve, because of the excessive bells and whistles added to software that'll make the infrastructure grind to a halt.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 93,581 Mod ✭✭✭✭Capt'n Midnight


    SKY dishes have been using electronics running at over 10GHz for ages now. But even if a speed limit is found, there is still parallel processing: 64,000 processors are faster than one.
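
    A minimal Java sketch of that parallel-processing point (the workload and worker count are invented for illustration): one big sum split across a pool of worker threads instead of being done on a single core.

        // ParallelSum.java - toy sketch: split one big job across several workers.
        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.Callable;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        public class ParallelSum {
            public static void main(String[] args) throws Exception {
                final long n = 1_000_000_000L;  // example workload: sum the numbers 1..n
                final int workers = Runtime.getRuntime().availableProcessors();

                ExecutorService pool = Executors.newFixedThreadPool(workers);
                List<Future<Long>> parts = new ArrayList<>();
                long chunk = n / workers;

                for (int i = 0; i < workers; i++) {
                    final long lo = i * chunk + 1;
                    final long hi = (i == workers - 1) ? n : (i + 1) * chunk;
                    // Each worker sums its own slice of the range independently.
                    parts.add(pool.submit((Callable<Long>) () -> {
                        long sum = 0;
                        for (long k = lo; k <= hi; k++) sum += k;
                        return sum;
                    }));
                }

                long total = 0;
                for (Future<Long> f : parts) total += f.get();  // combine the partial results
                pool.shutdown();

                System.out.println(workers + " workers computed sum = " + total);
            }
        }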

