
Heat -- Radiation...

  • 30-07-2009 2:58pm
    #1
    Closed Accounts Posts: 184 ✭✭


    Here is something I was wondering...

    Heat from the sun gets here via radiation. Is that part of the light?

    As the light from the sun is so "dense"... or is the heat a separate part?

    It must be part of the light, as this is why it is hotter the further south (or north, for our friends in the SH ;) ) one goes: the same amount of light is focused on a smaller area...


Comments

  • Registered Users Posts: 1,230 ✭✭✭spideog7


    Since I've been browsing here frequently trying to keep my brain alive I shall give my tuppence worth.

    Yes and no. Radiated heat is infra-red, which is on the same electromagnetic spectrum as visible light but isn't "light" per se. It also comes from the sun and produces a heating effect; however, Wikipedia says:
    ...many people attribute all radiant heating to infrared light and/or to all infrared radiation to being a result of heating. This is a widespread misconception, since light and electromagnetic waves of any frequency will heat surfaces that absorb them. Infrared light from the Sun only accounts for 49%[9] of the heating of the Earth, with the rest being caused by visible light that is absorbed then re-radiated at longer wavelengths.

    So there you have it: half of the heat from the sun comes from visible light (I didn't know that!), but I guess that goes for excitation by any electromagnetic wave, which explains the heating effect of mobile phones etc.

    As for the differences due to your location on the earth, this is to do with the axis of the earth and whether the northern hemisphere points away from (Winter) or toward (Summer) the sun (and similarly the southern hemisphere). The distance that radiation from the sun has to travel through the atmosphere to get to you is affected by this tilt. But I guess if 50% of the heat is due to radiated visible light then the density of this light on the ground will also affect that.


  • Registered Users Posts: 1,155 ✭✭✭SOL


    With regard to why it is warmer at the equator, or at least why the sun warms it more: you can visualise this easily by reducing the concept to two dimensions. Draw a circle and then imagine a parallel light source; you will notice that as the curvature increases (i.e. towards the poles) the same amount of light is spread over a larger surface area...
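
This 2D picture can be put into numbers with the cosine law: the same parallel beam is spread over 1/cos(latitude) times more ground, so intensity falls as cos(latitude). A quick sketch (the equinox, sun-over-the-equator geometry is a simplifying assumption):

```python
import math

def relative_intensity(latitude_deg):
    """Relative solar intensity on the ground compared with the
    equator, assuming parallel rays and the sun directly over the
    equator (equinox). The beam covers 1/cos(latitude) times more
    area, so intensity per square metre falls as cos(latitude)."""
    return math.cos(math.radians(latitude_deg))

for lat in (0, 30, 53, 60):
    print(f"latitude {lat:2d}: {relative_intensity(lat):.2f}x equator intensity")
# Ireland (~53 N) gets about 0.60x the equatorial intensity at equinox.
```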


  • Closed Accounts Posts: 184 ✭✭chezzer


    Thanks guys ...

    but with this ...

    As for the differences due to your location on the earth, this is to do with the axis of the earth and whether the northern hemisphere points away from (Winter) or toward (Summer) the sun (and similarly the southern hemisphere). The distance that radiation from the sun has to travel through the atmosphere to get to you is affected by this tilt. But I guess if 50% of the heat is due to radiated visible light then the density of this light on the ground will also affect that.


    I think this is wrong; the distance wouldn't affect it... it's the angle, as SOL says:
    more light in the same area...


  • Registered Users Posts: 861 ✭✭✭Professor_Fink


    spideog7 wrote: »
    So there you have it: half of the heat from the sun comes from visible light (I didn't know that!), but I guess that goes for excitation by any electromagnetic wave, which explains the heating effect of mobile phones etc.

    Actually almost all of that effect is simply caused by pressing something to your head. Phones tend not to cause noticeable heating. This is mostly due to the fact that the radiation is in the form of microwaves (which are pretty low energy) at a frequency which doesn't particularly favour interaction with most common matter.


  • Registered Users Posts: 1,230 ✭✭✭spideog7


    chezzer wrote: »
    Thanks guys ...

    but with this ...

    As for the differences due to your location on the earth, this is to do with the axis of the earth and whether the northern hemisphere points away from (Winter) or toward (Summer) the sun (and similarly the southern hemisphere). The distance that radiation from the sun has to travel through the atmosphere to get to you is affected by this tilt. But I guess if 50% of the heat is due to radiated visible light then the density of this light on the ground will also affect that.


    I think this is wrong; the distance wouldn't affect it... it's the angle, as SOL says:
    more light in the same area...

    I didn't say it wasn't to do with the angle, but the tilt exacerbates this effect by elongating the light footprint even further in the Winter and reducing it in the Summer. Also, bearing in mind that the atmosphere contains a lot of particulate matter, the intensity of the light is affected by the distance it travels through the atmosphere. This affects the intensity of the light on the ground, along with the intensity of the infra-red radiation on the ground. Try shining a light through smoke or fog and see how the intensity is affected.

    So yes, the angle does have an effect, but combined with the already reduced intensity you're reducing the intensity and spreading it over a larger area.
    Actually almost all of that effect is simply caused by pressing something to your head....This is mostly due to the fact that the radiation is in the form of microwaves (which are pretty low energy) at a frequency which doesn't particularly favour interaction with most common matter.

    True but it was just a side comment. Although some microwave ovens do use 915MHz while the most common GSM band here in Ireland is 900MHz. Of course the power output is much, much, much smaller, which would be a greater determining factor than the frequency.

    Yes, visible light has more energy than, say, microwave and infra-red, which is something I never really thought about, so I guess my mobile phone example wasn't the greatest.

    EDIT: Got my electromagnetic spectrum a bit mixed up there, fixed now!!
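
The "visible light has more energy" point is per photon: E = h·f, so photon energy scales directly with frequency. A rough comparison (the band frequencies below are ballpark figures chosen for illustration):

```python
PLANCK_H = 6.626e-34  # Planck's constant, J*s

# Photon energy E = h*f for a few bands (frequencies are ballpark).
bands = {
    "GSM (900 MHz)": 0.9e9,
    "microwave oven (2.45 GHz)": 2.45e9,
    "infra-red (30 THz)": 3.0e13,
    "green light (560 THz)": 5.6e14,
}

for name, freq_hz in bands.items():
    print(f"{name}: {PLANCK_H * freq_hz:.2e} J per photon")

# A green-light photon carries roughly 600,000x the energy
# of a 900 MHz microwave photon.
```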


  • Registered Users Posts: 32,417 ✭✭✭✭watty


    All microwave ovens are 2.45GHz = 2450MHz ISM band. About 500W to 1500W average power.

    915MHz doesn't work, no microwave oven uses it.

    Mobile phones in Europe use 900, 1800 and 2100 MHz.

    Average & peak power is 0.3W on 3G (CDMA 2100MHz) at longer range and 0.01W close to mast.

    On GSM (900 & 1800MHz) the peak power could be nearly 2W at long range, but it's TDMA, so average power is typically 125mW = 0.125W. Again highest further from Mast.

    Most of the heat is losses in the electronics. A 3G/HSPA phone may create 2W average of heat, as there is a lot of electronics apart from the transmitter, and the transmitter itself will generate about 1W of heat. Hold that against your ear; the ear can't cool and is heated by the waste heat of the phone. A double heating effect.

    Ironically, if there is a risk from mobiles, the handset gives you about 10,000 times more RF than standing under a mast (which delivers less power directly below it than at 200m away, as its transmission is a cone or petal/doughnut shape, not omnidirectional). More masts means less phone power. No ill effects from RF at 100kHz to 10GHz have ever been shown other than heating at high power (the power needed depends on frequency) or cataracts from looking too closely at open microwave feeds (2GHz to 100GHz), which is the radio equivalent of peering into a laser, at about 1000 times the power density of an aerial.
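
The GSM average-power figure above follows from TDMA duty cycling: a handset transmits in only 1 of 8 timeslots, so the average is roughly peak/8, and power control usually backs that off further. A sketch using the numbers in the post (the 3 dB back-off step is an illustrative assumption):

```python
# GSM TDMA: the handset transmits in 1 of 8 timeslots per frame.
peak_w = 2.0    # ~2 W peak at long range, per the post above
timeslots = 8

avg_w = peak_w / timeslots
print(f"full-power average: {avg_w * 1000:.0f} mW")  # 250 mW

# With ~3 dB of power-control back-off (half power, assumed here)
# the average drops to the ~125 mW "typical" figure quoted above.
backed_off_w = avg_w / 2
print(f"typical average:    {backed_off_w * 1000:.0f} mW")  # 125 mW
```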


  • Registered Users Posts: 1,230 ✭✭✭spideog7


    watty wrote: »
    All microwave ovens are 2.45GHz = 2450MHz ISM band. About 500W to 1500W average power.

    915MHz doesn't work, no microwave oven uses it.

    Mobile phones in Europe use...

    Most of the heat is losses in the electronics...

    First of all apologies for going completely OT but:


    All common household microwave ovens use the assigned 2.4GHz ISM band, yes. But that's not to say lower frequencies don't work:
    http://books.google.com/books?id=SJhrqEuJoRIC&lpg=PA114&ots=GK8ax-0K25&dq=microwave%20frequencies%20915Mhz&pg=PA114#v=onepage&q=&f=false

    http://www.industrialmicrowave.com/faqs.htm#seven

    915MHz is not available unlicensed in countries (mostly Europe) where we use the 900MHz GSM band.

    Like I said, my reference to the "heating effect" of mobile phones was an off-the-cuff remark. I wasn't suggesting that's why a phone gets hot or why your ear gets hot, nor was I equating the effect of a mobile phone to the effect of a microwave oven, although the principle I was referring to is the same. I'm not dumb; I understand both Ohm's Law and Joule's Law. In fact common sense would lead one to understand that if you hold your hand over your ear it's gonna get hot; that doesn't mean your hand is emitting a high-frequency signal.

    I would be inclined to disagree with the numbers you threw down there as regards the heat from a mobile phone, but that's not for here.

    Anyway, I'm glad I attempted to discuss the original topic because I learned something new from it (not about mobile phones :p ); that's what this forum is all about, right?


  • Closed Accounts Posts: 184 ✭✭chezzer


    Thanks all !!!

    Damn this is a good site !!!!


  • Registered Users Posts: 32,417 ✭✭✭✭watty


    915MHz is available in the USA, but only useful for very specialist applications at very high powers, not for hotel/cafe/home food preparation. Most materials are simply not lossy enough at that frequency. Even lower frequencies below 30MHz are used for RF induction heating or RF welding, but only in very concentrated form, at very high peak powers, or with specialist materials. Microwave ovens for general-purpose use, outside of industrial processes, run at frequencies higher than 3G and over 1000 times the power of a mobile.

    On the mobile phones, the heat I mentioned is the worst-case scenario. I've designed gear for the 900MHz band using cable TV trunk amp chips and mobile phone chips. A 77-channel-capable amplifier running 1W output to coax takes about 7W from the 12V supply and generates 6W of heat. The mobile phone chips are thankfully more efficient. EDGE and HSPA need higher linearity than basic GSM or 3G as a higher QAM is used (up to 16QAM if SNR allows), thus "smart phone" transmitters are typically a bit less efficient (more heat at full power) than "dumb" phones that only do GMSK on GSM and QPSK W-CDMA on 3G.


  • Registered Users Posts: 1,230 ✭✭✭spideog7


    I agree with you, like I said:
    Although some microwave ovens do use 915MHz while the most common GSM band here in Ireland is 900MHz. Of course the power output is much, much, much smaller, which would be a greater determining factor than the frequency.
    I was referring to the power output of the mobile phone being much smaller, in case you thought it was the other way around.

    Resonance frequencies have a large role to play in transferring energy from an electromagnetic wave, so equipment has to be designed for the specific application in mind.

    The only reason I disagreed with your power output figures was that I don't realistically see that kind of power being supplied from a small battery. A typical Li-ion battery runs at around 3.7 volts so in order to supply 3W of power it would mean the phone drawing a current of ~800mA and the battery wouldn't have a very long lifetime. Nonetheless I think we can agree that one of the greatest problems hindering modern electronics is excessive heat and more importantly heat dissipation.
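
The current estimate is just P = V·I rearranged; adding a hypothetical 1000 mAh capacity (an assumption for illustration, not a figure from the thread) shows why a sustained 3 W draw would flatten the battery quickly:

```python
V_NOMINAL = 3.7        # volts, nominal Li-ion cell voltage
CAPACITY_MAH = 1000.0  # milliamp-hours (assumed for illustration)

power_w = 3.0                            # the disputed 3 W figure
current_ma = power_w / V_NOMINAL * 1000  # I = P / V
runtime_h = CAPACITY_MAH / current_ma

print(f"current draw: {current_ma:.0f} mA")  # ~811 mA
print(f"runtime:      {runtime_h:.1f} h")    # ~1.2 h flat-out
```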

    I did see a viral video going around one time of someone popping popcorn with several mobile phones. The myth was debunked by someone doing basic maths: essentially the phone doesn't have enough power in its battery (even if by some miracle 100% of it were converted into heat in the corn kernel) to pop it.

    EDIT: Somehow I missed this:
    Cataracts from looking too close at open Microwave feeds (2Ghz to 100GHz, which is the Radio equivalent of peering into a laser...
    :eek:
    I had never heard that. Surely a laser is much higher frequency (>THz)?


  • Registered Users Posts: 259 ✭✭HIB


    http://www.astronomy.org/programs/seasons/index.html#reasons

    This is a great site that explains the seasons really well.

    Briefly (for anyone too lazy or too busy to read it!)

    The curved surface of the earth causes the equatorial regions to be warmer than the regions near the poles. This is because of the effect of the sun's energy being spread over a larger surface area. Exactly the same effect that has been described in a number of posts here.

    The seasons are a result of the tilt of the earth's axis. This causes the Northern hemisphere (for example) to tilt away from the sun in the winter and towards the sun in the summer, which changes the length of the days and the height of the sun in the sky. The result is cooler temperatures in winter than in summer.

    It is neatly explained in that site. Plenty of good diagrams as well.


  • Registered Users Posts: 32,417 ✭✭✭✭watty


    spideog7 wrote: »
    The only reason I disagreed with your power output figures was that I don't realistically see that kind of power being supplied from a small battery. A typical Li-ion battery runs at around 3.7 volts so in order to supply 3W of power it would mean the phone drawing a current of ~800mA and the battery wouldn't have a very long lifetime.

    Look at talktime vs standby. The "talktime" is based on maybe 50mW TX power rather than full power.

    Then consider that "talktime" is based on the typical case of a good signal, so the phone transmits at 1/4 to 1/8th power. Double the distance and you need 4x the transmit power. Ohmic losses mean that at full power the % heat waste is higher. You'll feel how much hotter the phone gets if you can talk with none to one signal bar. Esp. on 3G.

    A low-power laser can burn a spot on your retina or a spot on paper even though it has less power than a torch, because it's very concentrated into a small area and coherent. It's only in this sense that being up close to the open end of a waveguide or dish feed is dangerous.

    Assume a 1.2m VSAT dish (4W satellite uplink). The waveguide "tube" is about 22mm. So the ratio of areas is about 3000:1.

    Rather worse than the difference between a torch and a laser.
    Of course the laser is maybe 50mW and the torch 5W. It's about a 500:1 ratio of areas, but the torch is non-coherent and very unfocused, so call it 100mW on the eye. That makes a 50mW laser still 250x more energy on the retina compared with staring directly into a torch close up.
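
The aperture comparisons above are just circle areas: power density scales inversely with area, so the same watts through a smaller opening means proportionally higher flux. Checking the dish-to-waveguide figure:

```python
import math

def circle_area(diameter_m):
    """Area of a circular aperture from its diameter."""
    return math.pi * (diameter_m / 2) ** 2

dish_d = 1.2     # 1.2 m VSAT dish, per the post above
guide_d = 0.022  # ~22 mm waveguide "tube"

ratio = circle_area(dish_d) / circle_area(guide_d)
print(f"area ratio ~{ratio:.0f}:1")  # ~2975:1, i.e. roughly 3000:1
```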

    You can see that waveguides on transmitters at 10GHz and upwards are very, very dangerous compared to aerials or dishes on masts.

    Radar is very similar to microwave oven technology: it's about 100W to 500W average power, but delivered in 10µs or shorter pulses, which gives 300x range and allows the receiver time to hear an echo. Thus the waveguide outlet for a radar dish at the same average power as a microwave oven is almost 200x more concentrated in power. It's at least 200,000x more damaging to stick your ear to it than beside a mobile phone :(


  • Registered Users Posts: 861 ✭✭✭Professor_Fink


    watty wrote: »
    A low power laser can burn a spot on your retina or a spot on paper even though less power than a torch, because it's very concentrated to a small area and coherent. It's only this sense in which up close to the open end of a waveguide or Dish feed is dangerous.

    Actually coherence has nothing to do with it. Collimated incoherent light would do the same thing.
    watty wrote: »
    Assume a 1.2m VSAT dish (4W satellite uplink). The waveguide "tube" is about 22mm. So the ratio of areas is about 3000:1.

    Rather worse than the difference between a torch and a laser.
    Of course the laser is maybe 50mW and the torch 5W. It's about a 500:1 ratio of areas, but the torch is non-coherent and very unfocused, so call it 100mW on the eye. That makes a 50mW laser still 250x more energy on the retina compared with staring directly into a torch close up.

    You can see that waveguides on 10GHz and upwards transmitters are very very dangerous compared to Aerials or Dishes on masts.
    ...
    It's at least 200,000x more damaging to stick your ear to it than beside a mobile phone :(

    Things don't quite work that way. Comparing power doesn't really tell you anything about how damaging the particular radiation is. It depends very strongly on the interaction cross-section, which in turn depends on resonances in the energy structure of the matter being exposed.

