
AMD vs. Pentium

13 Comments

  • Registered Users, Registered Users 2 Posts: 17,471 ✭✭✭✭astrofool


    Quote:
    No it's not, check out the articles for it. Solid 30W at full load.

    EDIT: http://www.lostcircuits.com/cpu/amd_venice/

    8.4 watts idle for the 4000+, 29.5 watts while doing 3D rendering. I think it rises to around 45 watts when overclocked to 2.8GHz?

    Now that's low power consumption / heat output.

    The numbers used in that article are completely out of whack compared with every other site out there. For instance, here:

    http://www.gamepc.com/labs/view_content.asp?id=lpcpuso&page=3&cookie%5Ftest=1

    it has the Winchester consuming 65 watts more than the Pentium M, which, given the results you posted, would give a wattage of -35W for the Pentium M. That would be kinda cool, as we could get rid of oil and power our cars with multiple laptops :)
    dearg_doom wrote:
    Intel's can encode faster because they have a higher FSB (800MHz or 1066MHz in the latest ones) and HT support, AMD have only 400MHz and no HT support.

    The Athlon 64 doesn't have a front side bus as such, as it has an on-die memory controller. The nearest equivalent is the HT (HyperTransport, not Hyper-Threading) bus, which runs at 200x5 on the latest ones, giving 8GB/s of bandwidth to components other than the memory. The memory then has its own dedicated 6.4GB/s bus (dual-channel PC3200 RAM), but it's really limited by the RAM it uses, and it seems to be more affected by latency than bandwidth anyway. I guess if you wanted to, the internal memory-controller-to-chip bandwidth could also be used (it's the number Intel normally quotes), in which case it'd probably be 20GB/s+ at a guess.
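
    To put rough numbers on that, here's a quick back-of-the-envelope calc in Python. The link widths and clocks are the commonly quoted figures for socket-939 parts (1GHz HT link, 16 bits each way, dual-channel DDR400), so treat it as a sketch rather than gospel:

    ```python
    # Rough bandwidth figures for a socket-939 Athlon 64.
    # Commonly quoted widths/clocks; approximations, not measurements.

    # HyperTransport link: 200MHz reference clock x5 = 1GHz,
    # double data rate, 16 bits wide in each direction.
    ht_clock_hz = 200e6 * 5
    ht_per_direction = ht_clock_hz * 2 * 16 / 8   # DDR, 16-bit link
    ht_aggregate = 2 * ht_per_direction           # both directions

    # Dual-channel PC3200 (DDR400): two 64-bit channels at 200MHz DDR.
    mem_bandwidth = 2 * (200e6 * 2 * 64 / 8)

    print(f"HyperTransport: {ht_aggregate / 1e9:.1f} GB/s")        # ~8.0
    print(f"Dual-channel PC3200: {mem_bandwidth / 1e9:.1f} GB/s")  # ~6.4
    ```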

    As the PS2/PS3 and Xbox 360 should teach people though, big numbers mean nothing unless put to practical use, and the reason Intel is/was so good at encoding is probably a good branch prediction unit, double-pumped ALUs, and a good Intel-optimised compiler (AMD are embarrassingly behind in compiler tech).

    One nice thing about the Athlon 64 X2s, btw, is that they set the HT (threading, in this case) flag to true, so they can take advantage of any program written to support Hyper-Threading.
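
    Quick sketch of how a program might check for that flag: on Linux it shows up as "ht" in /proc/cpuinfo. The path and the parsing here are assumptions about a typical Linux setup, nothing AMD-specific:

    ```python
    # Minimal sketch: does the CPU advertise the HTT flag?
    # The X2 sets it so HT-aware software will spawn extra threads.
    # Assumes a Linux-style /proc/cpuinfo; not portable beyond that.

    def cpu_advertises_htt(path="/proc/cpuinfo"):
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "ht" in line.split(":", 1)[1].split()
        return False

    print("HTT flag set:", cpu_advertises_htt())
    ```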

    I guess at the end of the day the Athlon 64 is far better at games, very fast at everything else, and suffers a little when encoding, while Intel suffers at gaming, is decent at everything else, can be smoother due to HT, and is very good at encoding. Either chip, at the top end, will still be very fast and usable; even Intel at gaming gets very, very playable scores, and the graphics card makes all the difference there.

    Looking to the future, the A64 X2s fix what the Athlon 64 isn't best at, but cost a lot, while the dual-core P4s are hot and slow (for now; 3.2GHz was fast nearly 2 years ago now...).


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    astrofool wrote:
    it has the winchester consuming 65watts more than the pentium-m, which given the results you posted would give a wattage of -35w for the pentium M, which would be kinda cool, as we could get rid of oil, and power our cars by multiple laptops :)
    Aye, but we're not talking about Winchesters :p We're talking about our beloved Venices. Totally different kettle of fish.


  • Closed Accounts Posts: 2,279 ✭✭✭DemonOfTheFall


    Astrofool, are you trying to troll? I've said it about 4 times now. VENICE CORE VENICE CORE VENICE CORE!

    VENICE CORE!!!!

    Now that that's been said..... please read my posts where I say that I'm talking about the Venice core. There was a massive efficiency improvement over the Winchester (which was pretty efficient anyway).

    Your link explicitly says Winchester; I've explicitly said I'm talking about Venice.


  • Registered Users, Registered Users 2 Posts: 17,471 ✭✭✭✭astrofool


    Sorry, my mistake; there are just as many articles claiming different results though:

    http://www.xbitlabs.com/articles/cpu/display/athlon64-venice_6.html

    A good summary here:

    http://www.pcmoddingmy.com/forum/index.php?showtopic=1515

    Most sites I've looked at have the Venice consuming very slightly more than the Winchester. That LostCircuits one seems to be the anomaly of them all, giving vastly different results. I wonder if they had Cool'n'Quiet enabled?


  • Closed Accounts Posts: 2,279 ✭✭✭DemonOfTheFall


    Hmmm, maybe they did. Cool'n'Quiet only kicks in at idle though; it doesn't make any difference at full load. Interesting articles, cheers for the links. I'd be fairly trusting of LostCircuits, since they're a decent site which isn't in the pockets of the manufacturers, unlike many others (cough Tom's Hardware cough 3DGameMan cough). Very interesting findings by X-bit labs...

    From personal experience though, I'd have to say the Venice definitely makes less heat than the Winchester. My Winchester 3000+ with an XP-90 and a fairly fast 92mm fan runs quite a bit hotter than my friend's Venice 3000+ with a 7700-AlCu with the fan on its lowest speed.

    Even with overclocks, mine tops out at 2.4GHz while his hits 2.8GHz and still stays cooler.


  • Registered Users, Registered Users 2 Posts: 17,471 ✭✭✭✭astrofool


    That's dependent on so many things though; a sample of one doesn't really work with overclocking, you get lucky or you don't. I have a Winchester myself atm, and it runs at 40C with a Zalman silent cooler on it, using Arctic Silver 5; I'm sure I could find another chip that would run at 60+.


  • Closed Accounts Posts: 2,279 ✭✭✭DemonOfTheFall


    I'd be amazed if you could find a Winchester running at 60+. Absolutely amazed. Like the example I gave earlier, it took this guy's chip *5 minutes* to hit 60 degrees, even with the heatsink hanging off it.

    Check out a forum like www.hardforum.com and you'll see what I mean about the heat output differences between a Venice and a Winchester. There's an undeniable difference between the Newcastle and the Winchester as well; the Newcastle is scorching.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Quote:
    My mate's Venice 3000+ ran with almost no cooling for a good 5 minutes before it hit 55 degrees. There was a Zalman 7700 supposedly on it, but when I checked it I found that he'd left it barely screwed on and hanging half off. I'd be absolutely amazed if any other high-power desktop chip could manage this feat.

    No offence, but stop talking ****. No processor, even a mobile one, would run very well without decent contact with a heatsink. I severely doubt it was hanging off as you stated.


  • Registered Users, Registered Users 2 Posts: 25 MrFitt


    Well, I have seen and tested dual-core Intels (3.2GHz EE); Dell are shipping these in the U.S. and are due to ship in Ireland soon.
    Apple are migrating to the Intel chipset very soon.
    Quad-core Intels are due out in Q1 next year, and they will be slashing prices of dual cores with a knock-on effect on single cores.

    I might be biased, as I have only 'played games' on an AMD/Nvidia rig (and never benchmarked them), but the new Intel Dual Core Extreme Edition/CrossFire ATI X850s are the proverbial dog's bollo*ks for games and video editing. FACT.

    Can't wait for the quad-core boards and CPUs; with HT enabled, 8 virtual CPUs!!!!

    That's my penny's worth..... :)


  • Registered Users, Registered Users 2 Posts: 17,471 ✭✭✭✭astrofool


    MrFitt wrote:
    I might be biased, as I have only 'played games' on an AMD/Nvidia rig (and never benchmarked them), but the new Intel Dual Core Extreme Edition/CrossFire ATI X850s are the proverbial dog's bollo*ks for games and video editing. FACT.

    Yes, all the released CrossFire hardware has indeed proved this as FACT :rolleyes:


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    Quote:
    Quad-core Intels are due out in Q1 next year, and they will be slashing prices of dual cores with a knock-on effect on single cores.

    Are you serious? Before apps are ready to use dual cores, Intel are skipping on to quad cores? I find that hard to believe. They'd be shooting themselves in the foot skipping past dual-core chips so fast. Hell, 95% of people don't have dual cores, why quad cores!

    If they are releasing 'em, it won't be for the average Joe.

    People who end sentences in FACT are usually making stuff up... FACT! :p


  • Registered Users, Registered Users 2 Posts: 25 MrFitt


    Quote:
    Quad-core Intels are due out in Q1 next year, and they will be slashing prices of dual cores with a knock-on effect on single cores.

    Hey Mutant,

    I can only report what I have seen. I cannot reveal my sources, but Intel are going to be aggressively pushing dual cores in the next few months, and the big boys like Dell, IBM and HP (eventually Apple too) will be shipping these in their millions. Market forces will dictate the need for faster and cheaper CPUs. Intel and AMD are already restructuring their price lists. Dual-core CPUs currently retailing at 950 US dollars will drop to 300-350 US dollars in a few months.

    Nvidia and Intel already have motherboards ready for dual core/quad core.

    Fact: Technology moves faster than the consumer.
    Fact: Intel will wipe the floor with AMD when it comes to units sold.
    Fact: Indeed :D


  • Registered Users, Registered Users 2 Posts: 3,560 ✭✭✭SickBoy


    MrFitt wrote:
    Well, I have seen and tested dual-core Intels (3.2GHz EE); Dell are shipping these in the U.S. and are due to ship in Ireland soon.
    Apple are migrating to the Intel chipset very soon.
    Quad-core Intels are due out in Q1 next year, and they will be slashing prices of dual cores with a knock-on effect on single cores.

    I might be biased, as I have only 'played games' on an AMD/Nvidia rig (and never benchmarked them), but the new Intel Dual Core Extreme Edition/CrossFire ATI X850s are the proverbial dog's bollo*ks for games and video editing. FACT.

    Can't wait for the quad-core boards and CPUs; with HT enabled, 8 virtual CPUs!!!!

    That's my penny's worth..... :)
    I remember reading about the quad-core chips, and if I remember correctly they will initially be Xeon-based for servers/workstations, which would mean that desktop systems would remain dual-core for a while yet.


  • Registered Users, Registered Users 2 Posts: 25 MrFitt


    :mad: Hey SickBoy!! Get back to work!!!


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    MrFitt wrote:
    Fact: Intel will wipe the floor with AMD when it comes to units sold.
    They do already... of course, that seems to be due to illegal market practices :p AMD only have about a 15% market share afaik. I'm just glad you said "when it comes to units sold" rather than "when it comes to performance". The current crop of AMD dual cores wipes the floor with the current crop of Intel dual cores. Of course, the pricing reflects this :p


  • Registered Users, Registered Users 2 Posts: 17,471 ✭✭✭✭astrofool


    When Intel does move to quad core, there won't be any Hyper-Threading, as they'll have moved to the Pentium M design by then.

    Dual-core CPUs are already at the sub-300-dollar level (Pentium D 820).

    Also, my 'sources' (Dell, Intel, IBM) are not mentioning anything to do with quad core yet, product-release-wise, seeing as they aren't releasing dual-core Xeons till Q1 '06. You may want to re-check the validity of your 'sources', even if they are just the random voices in your head.


  • Closed Accounts Posts: 3,354 ✭✭✭secret_squirrel


    MrFitt wrote:
    Quad core Intels are due out Q1 next year and they will be slashing prices of Dual core with a knock on effect to single core.

    That is widely believed to be a kneejerk reaction to AMD saying they were aiming for quad-core Opterons in Q1 2006 (announced at Computex). Since Intel had said a week before that their quad cores were due in Q1 2007, I would assume it's just spoiling and vapourware.

    If any Athlon or P4/P-M desktop quad-core chip is available at retail before mid-2006, I will eat my hat or another random piece of my clothing.

    At best there will be a few sampling versions.


  • Registered Users, Registered Users 2 Posts: 25 MrFitt


    astrofool wrote:
    When Intel does move to quad core, there won't be any Hyper-Threading, as they'll have moved to the Pentium M design by then.

    Dual-core CPUs are already at the sub-300-dollar level (Pentium D 820).

    Also, my 'sources' (Dell, Intel, IBM) are not mentioning anything to do with quad core yet, product-release-wise, seeing as they aren't releasing dual-core Xeons till Q1 '06. You may want to re-check the validity of your 'sources', even if they are just the random voices in your head.



    Of course 'they' are not mentioning anything. Don't want to give the game away!! Who'd believe Apple would be using Intel chipsets and CPUs, and that Dell are going to be building them in Poland!!! ;) Rumour of a million sq ft factory going up in a year or so there ;)

    Can't wait to see that guy eat his hat :D

    Watch the future: it's only getting interesting.


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    I'll drink a pint of water through a used sock if quad cores are commercially available to the average consumer by Q1 '06. I find it really unlikely that they'll jump from single cores to quad cores in under a year, simply because software is just not ready for that. Very little is optimised properly for dual cores, and I assume further tweaking will be needed to make proper use of quad cores...


  • Registered Users, Registered Users 2 Posts: 6,946 ✭✭✭SouperComputer


    Who'd believe Apple would be using Intel chipsets and CPUs, and that Dell are going to be building them in Poland!!!


    Since the G5, a switch has been inevitable.

    I'd love to know your source for Dell building Apple; that would be tragic!


  • Registered Users, Registered Users 2 Posts: 4,658 ✭✭✭Patricide


    Look what I've started, a big Intel vs. AMD thread. The thing I've noticed is that the majority (not all, though) of Intel "fanboys" only know half the facts and jump to a conclusion, while the AMD guys, well, the majority of 'em anyway, seem to know their subject a lot more. This kinda gives me the impression that the people who know best seem to pick AMD, which gives me the idea that it's a better processor, at least for gaming, from what I've seen everywhere.


  • Registered Users, Registered Users 2 Posts: 17,471 ✭✭✭✭astrofool


    It really is best to read a few articles, find out the real facts, and draw your own conclusions. If someone is a fanboy, their opinion automatically doesn't count.


  • Closed Accounts Posts: 3,354 ✭✭✭secret_squirrel


    Since the G5, a switch has been inevitable.
    Why do you say that? IBM's Power output is going through the roof.

    The new G5s announced this week seem rather kick-ass: the G5FX with less power usage than the P-M, and the new high-power G5MPs. Seems like Apple left the party just as it was getting interesting...


  • Closed Accounts Posts: 1,685 ✭✭✭zuma


    Since the G5, a switch has been inevitable.

    I'd love to know your source for Dell building Apple; that would be tragic!

    If Dell start to build Apples... then a LOT of people in Cork will be pissed off when their jobs move to Poland!!!


  • Closed Accounts Posts: 2,279 ✭✭✭DemonOfTheFall


    There's news all over the web about Dell's interest in building PCs that run OS X. They'd just be normal PCs, except with tighter requirements placed on the hardware and a DRM chip on the mobo. Try a Google for "Apple Dell" or something; it's bound to turn up a lot.

    As for the thing about "Fact: Dual Core Extreme Editions own at gaming", try checking out any benchmark at all on the net. I challenge you to find a single benchmark where an EE chip beats an FX chip at a game. Just ain't gonna happen if the test is in any way indicative of real-world performance.


  • Registered Users, Registered Users 2 Posts: 6,946 ✭✭✭SouperComputer


    Why do you say that? IBM's Power output is going through the roof.

    Indeed the chip performs excellently, but it produces too much heat for Apple's liking.

    IBM, it seems, can't be arsed to cool it in a timely fashion, and chip shortages have been a problem for Apple even from the G3 days. I think pricing could be an issue too.


  • Closed Accounts Posts: 3,354 ✭✭✭secret_squirrel


    True, but if the specs on the G5FX (which will probably power the last of the PowerPC PowerBooks) are even close to what's reported, then they have cracked the temperature problems and then some.

    The specs are pegging thermal dissipation at around half a P-M: 10-15 watts if I remember right.

    Maybe they are just doing it to taunt Apple ;)


  • Closed Accounts Posts: 1,643 ✭✭✭Gandalf23


    http://www.theinquirer.net/?article=24486

    Not sure if this contributes to the thread or not...


  • Registered Users, Registered Users 2 Posts: 1,017 ✭✭✭The_Thing


    Some interesting reading here.


  • Closed Accounts Posts: 1,502 ✭✭✭MrPinK


    I find it really unlikely that they'll jump from single cores to quad cores in under a year, simply because software is just not ready for that. Very little is optimised properly for dual cores, and I assume further tweaking will be needed to make proper use of quad cores...
    I'd be surprised if people were writing code specifically for dual core. When you're writing a multithreaded program, you usually don't have a certain number of threads in mind; you use as many as the program logically needs. This could be 2 or 20 threads, and will vary during the execution of the program.
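
    That point translates directly into code: a typical worker-pool program sizes itself from whatever the OS reports, so the same code uses two cores or four without being rewritten. A minimal sketch in Python (the workload here is a made-up stand-in):

    ```python
    # Minimal sketch: a worker pool sized from whatever the machine
    # reports, so the same code runs on 1, 2 or 4 cores unchanged.
    import os
    from concurrent.futures import ThreadPoolExecutor

    def crunch(n):
        return sum(i * i for i in range(n))  # stand-in for real work

    jobs = [200_000] * 20  # 20 independent work items

    # os.cpu_count() reports logical CPUs: 2 on a dual core, 4 on a
    # quad core, 8 with four HT-enabled cores -- no per-chip tuning.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(crunch, jobs))

    print(f"{len(results)} items done on {os.cpu_count()} logical CPUs")
    ```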

