
Questions about the double slit experiment

  • 09-08-2008 12:57pm #1 (Cianos)


    This one blows me away. Just to make sure I have it right in my head...

    The experiment shows that, when shot through the two slits, the electrons make impressions on the back wall based on the range of possible locations they could be in (i.e. the interference pattern).

    And it is only when we observe that that range of possibilities is narrowed down to a determined pattern.

    So carrying on, does this imply that the true structure of reality as we know it is only determined through experience? And before this, it can only ever be a range of possibilities. So without our observation, the world is essentially organised chaos?

    I came across a video showing another experiment that I think is related, on the uncertainty principle: a laser is shone through a slit whose width is gradually reduced, making the spot of light first shrink (as expected) but eventually expand once the slit becomes small enough.

    Here's the vid btw;


    My interpretation of this is that the light expands because the exact direction of the light's source can't be determined, so it has to be represented by a range of possibilities, and what we are looking at on the screen is not the light itself but where the light could be?

    So relating this to the double slit experiment, do they both show that the 'reality' is just a range of possibilities, and it is only our interpretation/observation that organises it into something absolute and determined?
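
    As a rough sanity check on the shrink-then-grow behaviour in the video, I tried a quick back-of-the-envelope sketch in Python (just assuming simple far-field single-slit diffraction, with the wavelength and distance made up), and it does show a slit width below which the spot starts growing again:

        import numpy as np

        # Rough sketch, assuming far-field (Fraunhofer) single-slit diffraction:
        # spot width ~ geometric slit width a plus diffraction width 2*wavelength*L/a,
        # which blows up as the slit gets narrower.
        wavelength = 633e-9                    # red laser, metres (assumed)
        L = 2.0                                # slit-to-screen distance, metres (assumed)
        a = np.linspace(5e-6, 2e-3, 2000)      # trial slit widths, metres

        spot_width = a + 2 * wavelength * L / a
        best = a[np.argmin(spot_width)]
        print("slit width giving the smallest spot:", best)
        print("analytic estimate sqrt(2*wavelength*L):", np.sqrt(2 * wavelength * L))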


Comments

  • Morbert


    Cianos wrote: »

    My interpretation of this is that the light expands because the exact direction of the light's source can't be determined, so it has to be represented by a range of possibilities, and what we are looking at on the screen is not the light itself but where the light could be?

    This interpretation isn't really correct. It's true that there is a range of possibilities associated with each photon in the laser beam, but each photon only strikes the detector screen at one location and one location only. If we were seeing where light could be, as opposed to where light is, then any given photon would produce a smeared light pattern on the screen, and you'd run into a bunch of physical violations. Instead, the smeared pattern is due to the countless number of photons striking the detector screen.
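
    To make that concrete, here is a rough numerical sketch (a single slit with an assumed Fraunhofer sinc^2 intensity profile, in made-up units): each simulated photon lands at exactly one point, and the smeared pattern only appears once you pile up many of them.

        import numpy as np

        rng = np.random.default_rng(0)

        def intensity(x):
            # Assumed Fraunhofer single-slit profile; np.sinc(x) = sin(pi*x)/(pi*x).
            return np.sinc(x) ** 2

        def photon_hits(n):
            # Rejection sampling: each accepted value is one photon's single
            # landing point on the detector screen.
            hits = []
            while len(hits) < n:
                x = rng.uniform(-4.0, 4.0)
                if rng.uniform(0.0, 1.0) < intensity(x):
                    hits.append(x)
            return np.array(hits)

        print("one photon lands at a single point:", photon_hits(1))
        counts, edges = np.histogram(photon_hits(20000), bins=80, range=(-4, 4))
        print("20000 photons build up the familiar pattern; peak near x =",
              edges[np.argmax(counts)])
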
    So relating this to the double slit experiment, do they both show that the 'reality' is just a range of possibilities, and it is only our interpretation/observation that organises it into something absolute and determined?


    It's hard to derive reality from quantum mechanics because QM isn't really a physical theory. It is two mathematical ideas (unitary evolution and stochastic state reduction) stuck together with masking tape. It works well insofar as it is a mathematical box where we put numbers and functions in and we get the correct numbers and functions out. But, epistemologically speaking, it makes very little sense, and most interpretations are little more than attempts to crowbar inappropriate classical concepts into a theory that clearly doesn't want them. So I would not subscribe to any idea of quantum reality because it will mostly be conjecture. QM is just weird, and very little insight can be gained from its current formalism.


  • Professor_Fink


    Morbert wrote: »
    It's hard to derive reality from quantum mechanics because QM isn't really a physical theory. It is two mathematical ideas (unitary evolution and stochastic state reduction) stuck together with masking tape. It works well insofar as it is a mathematical box where we put numbers and functions in and we get the correct numbers and functions out. But, epistemologically speaking, it makes very little sense, and most interpretations are little more than attempts to crowbar inappropriate classical concepts into a theory that clearly doesn't want them.

    What? Quantum mechanics is a physical theory. It describes physical systems with an unprecedented level of accuracy. The way measurement is taught admittedly muddies the waters, but that's largely due to the historical belief that measurements needed to be tacked on to quantum mechanics, which it turns out is false. The fact is that it doesn't matter whether you take the Copenhagen interpretation or the Everett interpretation, the mathematics is the same.

    Physical theories predict observables. That's what they do, and what they are. Quantum mechanics does exactly this. It's only when you start asking questions about things which are not observables that you get into interpretations, which is philosophy rather than physics. There is no question that quantum mechanics is a well defined physical theory once you phrase it in terms of observables.


  • Morbert


    Professor_Fink wrote: »
    What? Quantum mechanics is a physical theory. It describes physical systems with an unprecedented level of accuracy. The way measurement is taught admittedly muddies the waters, but that's largely due to the historical belief that measurements needed to be tacked on to quantum mechanics, which it turns out is false. The fact is that it doesn't matter whether you take the Copenhagen interpretation or the Everett interpretation, the mathematics is the same.

    Physical theories predict observables. That's what they do, and what they are. Quantum mechanics does exactly this. It's only when you start asking questions about things which are not observables that you get into interpretations, which is philosophy rather than physics. There is no question that quantum mechanics is a well defined physical theory once you phrase it in terms of observables.

    You are right about the successful predictions of observations. And QM has taught us a great deal about how the universe behaves. In this way, QM is a physical theory. But the formalism of QM doesn't describe the reality of the quantum world. It is essentially a mathematical toolbox that allows us to calculate probabilities. This is why I believe it's futile to try and interpret the current formalism. There's nothing wrong with this; scientists, when they take a positivist approach, can use it to learn a lot about quantum systems. But it should be pointed out that if you try and use QM to understand 'reality' then you're going to run into trouble.


  • Professor_Fink


    Morbert wrote: »
    But it should be pointed out that if you try and use QM to understand 'reality' then you're going to run into trouble.

    You seem to be implicitly assuming that there is more to reality than is described by observables.

    As a side note, as quantum mechanics gives a deterministic evolution of the wave function it is possible to verify the wave function at any point by measuring a stabilizer of the system.


  • Cianos


    Morbert wrote: »
    This interpretation isn't really correct. It's true that there is a range of possibilities associated with each photon in the laser beam, but each photon only strikes the detector screen at one location and one location only. If we were seeing where light could be, as opposed to where light is, then any given photon would produce a smeared light pattern on the screen, and you'd run into a bunch of physical violations. Instead, the smeared pattern is due to the countless number of photons striking the detector screen.

    Where do the 'extra' photons come from?


  • Professor_Fink


    Cianos wrote: »
    Where do the 'extra' photons come from?

    What extra photons?


  • Morbert


    Professor_Fink wrote: »
    You seem to be implicitly assuming that there is more to reality than is described by observables.

    I do think there is more to reality. The eigenvalues that are extracted by fiddling around with dynamical variables are enough for the purposes of most physicists because we can define them as classical quantities. These classical observations effectively describe our reality. So the 'reality' represented by state vectors, for example, isn't terribly important to most physicists. The question of why superposition states are never perceived doesn't keep many awake at night. But we still have the problem of visualising the nature and relationship between unitary evolution and state reduction regardless. We have two mathematical procedures with very little in common. These visualisation problems are absent from most other physical theories. Perhaps if some brave physicists devoted their careers to unearthing some new formalism of QM that consistently dealt with both unitary evolution and state reduction then we would have a better understanding of the reality of quantum systems. (Heck, an ideal formalism like that might even sort out unification problems.) But until then, I think I'm justified in discouraging the OP from trying to interpret the physical reality behind the strangeness of QM.

    You said earlier that it is a philosophical, rather than a scientific, issue. I pretty much agree. But since physicists are really the only ones with an in depth understanding of QM, they're the only ones who'll be able to sort out those philosophical issues.

    Cianos wrote: »
    Where do the 'extra' photons come from?

    A laser beam is made up of many photons. They are there from the beginning, and aren't generated when the laser passes through the slit. If, instead of a laser beam, scientists fired one photon at a time through the slit, you would only see one dot on the detector screen per photon.


  • Professor_Fink


    Morbert wrote: »
    These classical observations effectively describe our reality. So the 'reality' represented by state vectors, for example, isn't terribly important to most physicists. The question of why superposition states are never perceived doesn't keep many awake at night. But we still have the problem of visualising the nature and relationship between unitary evolution and state reduction regardless. We have two mathematical procedures with very little in common.

    There isn't a problem about why we don't perceive superpositions. It's quite simple: because quantum mechanics is linear, there is no operation which can determine whether a state is in a superposition of basis states or not. The linearity of QM guarantees that we cannot experience superpositions.

    I agree that the Copenhagen interpretation is ridiculous. I don't think anyone really takes it seriously anymore, because of the problem of defining what constitutes a measurement. Nearly all quantum physicists subscribe to some version of the Everett interpretation or to the 'shut up and calculate' interpretation (which isn't really an interpretation). That said, QM is still taught in terms of the Copenhagen interpretation. Now, many philosophers find the Everett interpretation abhorrent, but I don't really put much stock in philosophical arguments against it.

    In the Everett interpretation and in dynamical collapse theories, there is only one mathematical structure. The effect of measurement is found simply by taking the partial trace of the density matrix over the measurement system. But this is exactly the same as what happens when any two quantum systems interact.

    Morbert wrote: »
    Perhaps if some brave physicists devoted their careers to unearthing some new formalism of QM that consistently dealt with both unitary evolution and state reduction then we would have a better understanding of the reality of quantum systems. (Heck, an ideal formalism like that might even sort out unification problems.) But until then, I think I'm justified in discouraging the OP from trying to interpret the physical reality behind the strangeness of QM.

    That is exactly what Hugh Everett did in his PhD thesis. We do have such a formalism. We can treat everything as unitary evolution, and get exactly the same results as in the Copenhagen interpretation. There is absolutely no need for collapse of the wave function.


  • Professor_Fink


    Morbert wrote: »
    A laser beam is made up of many photons. They are there from the beginning, and aren't generated when the laser passes through the slit. If, instead of a laser beam, scientists fired one photon at a time through the slit, you would only see one dot on the detector screen per photon.

    Actually what comes out of the business end of a laser is called a 'coherent state'. This is a particular type of superposition over photon numbers, so the number of photons in the beam is undefined.


  • Morbert


    Professor_Fink wrote: »
    There isn't a problem about why we don't perceive superpositions. It's quite simple: because quantum mechanics is linear, there is no operation which can determine whether a state is in a superposition of basis states or not. The linearity of QM guarantees that we cannot experience superpositions.

    I agree that the Copenhagen interpretation is ridiculous. I don't think anyone really takes it seriously anymore, because of the problem of defining what constitutes a measurement. Nearly all quantum physicists subscribe to some version of the Everett interpretation or to the 'shut up and calculate' interpretation (which isn't really an interpretation). That said, QM is still taught in terms of the Copenhagen interpretation. Now, many philosophers find the Everett interpretation abhorrent, but I don't really put much stock in philosophical arguments against it.

    In the Everett interpretation and in dynamical collapse theories, there is only one mathematical structure. The effect of measurement is found simply by taking the partial trace of the density matrix over the measurement system. But this is exactly the same as what happens when any two quantum systems interact.
    That is exactly what Hugh Everett did in his PhD thesis. We do have such a formalism. We can treat everything as unitary evolution, and get exactly the same results as in the Copenhagen interpretation. There is absolutely no need for collapse of the wave function.


    I was not aware that Everett's formalism/linearity of QM solved the issues of superposition. I know that it is postulated that conscious perception is entangled with quantum states in forms such as

    |u> = a|0>|perception of 0> + b|1>|perception of 1>

    So we would only perceive either 0 or 1 from an experiment. But how are states such as

    |u> =
    c(|0> + |1>)(|perception of 0> + |perception of 1>) +
    d(|0> - |1>)(|perception of 0> - |perception of 1>)

    forbidden?


  • Professor_Fink


    Morbert wrote: »
    But how are states such as

    |u> =
    c(|0> + |1>)(|perception of 0> + |perception of 1>) +
    d(|0> - |1>)(|perception of 0> - |perception of 1>)

    forbidden?

    |u> =
    c(|0> + |1>)(|perception of 0> + |perception of 1>) +
    d(|0> - |1>)(|perception of 0> - |perception of 1>)

    = c(|0> + |1>)|perception of 0> + c(|0> + |1>)|perception of 1> +
    d(|0> - |1>)|perception of 0> - d(|0> - |1>)|perception of 1>

    = [(c+d)|0> + (c-d)|1>]|perception of 0> + [(c-d)|0> + (c+d)|1>]|perception of 1>

    So this is equivalent to a perception of either 1 or 0, but has been made in a different basis. (Strictly, one should take Tr_{You}(\rho_{Full System}).)

    The basic idea is that the density matrix resulting from tracing you out can be decomposed in a basis composed of your brain's perception states. This is always possible. Actually, you probably want to add a state which contains something like "is in superposition", and you will find that there is no unitary operation which can set this bit correctly.
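
    If you would rather check that regrouping numerically than by hand, here is a quick sketch (a qubit for the observed system, a two-state 'perception' register, and arbitrary made-up values of c and d):

        import numpy as np

        # |0>, |1> for the system and the two perception states for the brain.
        k0, k1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
        p0, p1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
        c, d = 0.3 + 0.1j, 0.7 - 0.2j          # arbitrary, not normalised

        u_original = (c * np.kron(k0 + k1, p0 + p1)
                      + d * np.kron(k0 - k1, p0 - p1))
        u_regrouped = (np.kron((c + d) * k0 + (c - d) * k1, p0)
                       + np.kron((c - d) * k0 + (c + d) * k1, p1))

        print(np.allclose(u_original, u_regrouped))   # True: same state, new grouping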


  • Morbert


    Professor_Fink wrote: »
    |u> =
    c(|0> + |1>)(|perception of 0> + |perception of 1>) +
    d(|0> - |1>)(|perception of 0> - |perception of 1>)

    = c(|0> + |1>)|perception of 0> + c(|0> + |1>)|perception of 1> +
    d(|0> - |1>)|perception of 0> - d(|0> - |1>)|perception of 1>

    = [(c+d)|0> + (c-d)|1>]|perception of 0> + [(c-d)|0> + (c+d)|1>]|perception of 1>

    So this is equivalent to a perception of either 1 or 0, but has been made in a different basis. (Strictly, one should take Tr_{You}(\rho_{Full System}).)

    The basic idea is that the density matrix resulting from tracing you out can be decomposed in a basis composed of your brain's perception states. This is always possible. Actually, you probably want to add a state which contains something like "is in superposition", and you will find that there is no unitary operation which can set this bit correctly.

    I would have thought our perception states, if they are associated with a pair of orthogonal states, will generally not be orthogonal. We have no mathematical reason (that I know of anyway) to insist that our perception states are orthogonal, especially since we have no state reduction process. We could have

    |u> =
    |0>[(c+d)|perception of 0> + (c-d)|perception of 1>] +
    |1>[(c-d)|perception of 0> + (c+d)|perception of 1>]

    And I was taught that, even when we look at the Schmidt decomposition, our eigenstates could be completely unrelated to what we would want for our classical expectations. The perception of either 1 or 0 isn't predicted.


  • Professor_Fink


    Morbert wrote: »
    I would have thought our perception states, if they are associated with a pair of orthogonal states, will generally not be orthogonal. We have no mathematical reason (that I know of anyway) to insist that our perception states are orthogonal, especially since we have no state reduction process. We could have

    |u> =
    |0>[(c+d)|perception of 0> + (c-d)|perception of 1>] +
    |1>[(c-d)|perception of 0> + (c+d)|perception of 1>]

    And I was taught that, even when we look at the Schmidt decomposition, our eigenstates could be completely unrelated to what we would want for our classical expectations. The perception of either 1 or 0 isn't predicted.

    Sorry for the delay, I've been meaning to answer this but kept putting it off as I suspect it will be a long post. I'm not really sure why you think the Schmidt decomposition would be relevant here.

    In order to talk about how we would perceive quantum systems if we too are a quantum system (as is done in the Everett interpretation), it is necessary to introduce a new tool for describing quantum systems: density matrices. I am not sure if you have come across this formalism before, but it is essential for dealing with quantum systems where we do not have access to the full system. It is fairly easy to see how this is related to how we see the world, since we make the decision about what we have perceived based only on the state of our brain, and not on a joint state of the entire system.

    The states we have dealt with so far (as in your example) are pure states: they can be described as a coherent superposition of basis states. It should be evident that this isn't quite enough to describe all quantum states. If we have an entangled state of two systems, such as |00> + |11>, then the state of a single qubit on its own cannot be written as a pure state. Density matrices allow a more general description by allowing classical probability distributions of quantum states. For a pure state |psi> the corresponding density matrix is rho_psi = |psi><psi|.

    A general density matrix rho can be written as a weighted sum of pure state density matrices:
    rho = \sum_i p_i |psi_i><psi_i|
    where p_i is the relative probability with which each state occurs in the distribution. To obtain the density matrix for a subsystem A of a system A \tensor B (i.e. where one part is A and the other is B), we take the partial trace of the system over B. Thus rho_A = Tr_B (rho).

    So the density matrix for our observed environment is given by rho_observed = Tr_brain (rho_universe).

    So going back to your example:
    |u> =
    |0>[(c+d)|perception of 0> + (c-d)|perception of 1>] +
    |1>[(c-d)|perception of 0> + (c+d)|perception of 1>]

    The problem here is of course that you have picked a state that doesn't let you observe the system. I don't know if this is by deliberate construction, but it is not correlated with how our senses work. To really see how it works, imagine we start out with a pure state for the system to be observed:

    |u> = sum_j A_j |j>, where A_j are complex amplitudes and |j> form some orthonormal basis imposed by our senses such that S |j>|unobserved> = |j>|perceived_j>, where S is the operation performed by our senses. Admittedly this glosses over the fact that we may fail to register a photon or the like, so that S is not a maximally entangling operation and we get weak measurements instead, but this makes little difference and is easy to include (though it makes things more opaque, which is why I have left it out here).

    So initially the state of the whole system (our brain and the system we observe) is |u>|unobserved>. We then observe the system, so the state is S|u>|unobserved> = sum_j A_j |j>|perceived_j>. The state of the universe is then:

    rho_universe = sum_j sum_k A_j A_k* |j>|perceived_j><k|<perceived_k|

    I use * to denote complex conjugates. Now, taking the partial trace over our brain, we get the observed state of the system:
    rho_observed = Tr_brain(rho_universe)
    = sum_j sum_k d_jk A_j A_k* |j><k|

    where d_jk is the Kronecker delta, equal to 0 if j is not equal to k and 1 if j = k. So

    rho_observed = Tr_brain(rho_universe)
    = sum_j A_j A_j* |j><j|

    This is exactly what we get when we do such experiments, and it agrees exactly with the predictions of the Copenhagen interpretation, but without the need for an ad hoc measurement formalism.

    It is important to note that it is the operation S which determines which basis the measurement is made in. Clearly S is always well defined, since it is simply the quantum description of the measurement process the system undergoes.
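
    Here is a minimal numerical sketch of the above (a two-level system, a two-level 'brain' register, and S taken, purely for illustration, to be a CNOT-style entangling operation so that S|j>|unobserved> = |j>|perceived_j>):

        import numpy as np

        A = np.array([0.6, 0.8j])              # amplitudes A_j of the observed system
        unobserved = np.array([1.0, 0.0])      # brain starts in |unobserved>

        S = np.array([[1, 0, 0, 0],            # CNOT-style S: copies the basis label
                      [0, 1, 0, 0],            # of the system into the brain register
                      [0, 0, 0, 1],
                      [0, 0, 1, 0]])
        psi = S @ np.kron(A, unobserved)       # sum_j A_j |j>|perceived_j>

        rho_universe = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
        rho_observed = np.trace(rho_universe, axis1=1, axis2=3)   # Tr_brain

        # Diagonal with entries |A_j|^2 = 0.36 and 0.64; the coherences are gone,
        # i.e. the Copenhagen-style probabilities appear without a collapse postulate.
        print(np.round(rho_observed, 3))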


  • eZe^


    Ok, after 'reading' (don't think you can call it that :pac:) some of the posts between Morbert and Fink on said topic, I think I'm in the wrong profession. Maybe culinary arts is more my style. :P:P:P:P:P


  • Professor_Fink


    eZe^ wrote: »
    Ok, after 'reading' (don't think you can call it that :pac:) some of the posts between Morbert and Fink on said topic, I think I'm in the wrong profession. Maybe culinary arts is more my style. :P:P:P:P:P

    I wouldn't get too disheartened; I've had an extra 7 years of practice.


  • Morbert


    Thanks for the reply.

    I have not come across density matrices in any of my standard quantum mechanics modules, though I have come across them in a quantum information module. And I brought up Schmidt decomposition because, if I remember correctly, it would allow us to consider an orthogonal pair of perception states along with a pair of orthogonal states of the system. It was the best shot that I could think of for a mathematically preferred basis.

    Anyway, I'm wondering essentially how/why the basis |j> that corresponds to classical expectations is imposed by our senses. Why don't our senses impose some basis that contains linear superpositions of 'classical' states? This was the motivation behind my attempt to introduce/consider a different basis.


  • Professor_Fink


    Morbert wrote: »
    And I brought up Schmidt decomposition because, if I remember correctly, it would allow us to consider an orthogonal pair of perception states along with a pair of orthogonal states of the system. It was the best shot that I could think of for a mathematically preferred basis.

    Yes, people who like dynamical collapse theories seem to be fond of the Schmidt decomposition, but it isn't really relevant in the Everett interpretation.
    Morbert wrote: »
    Anyway, I'm wondering essentially how/why the basis |j> that corresponds to classical expectations is imposed by our senses. Why don't our senses impose some basis that contains linear superpositions of 'classical' states? This was the motivation behind my attempt to introduce/consider a different basis.

    Good question. The most general kinds of measurements are known as positive operator valued measurements, but I am going to restrict myself to strong measurements which completely distinguish orthogonal states, since these are much more intuitive. In the case of a strong measurement the state of the measurement apparatus (for example our brain) becomes maximally entangled with the system, so that we have a state as before: sum_j alpha_j |j>|psi_j>. I've used |psi_j> here to indicate that this can be a completely separate basis. The states |psi_j> can easily be superpositions of classical states of the measurement system. However, this can always be rewritten as sum_j beta_j |phi_j>|j>, for some orthogonal states |phi_j> and amplitudes beta_j. Obviously the density matrices are identical, since it is just a rewriting of the state. Then when we take the trace over the measurement system we get the probability distribution sum_j beta_j* beta_j |phi_j><phi_j|, with each state in the sum corresponding to a classical state of our brain/the detector.


  • Morbert


    Professor_Fink wrote: »
    Yes, people who like dynamical collapse theories seem to be fond of the Schmidt decomposition, but it isn't really relevant in the Everett interpretation.



    Good question. The most general kinds of measurements are known as positive operator valued measurements, but I am going to restrict myself to strong measurements which completely distinguish orthogonal states, since these are much more intuitive. In the case of a strong measurement the state of the measurement apparatus (for example our brain) becomes maximally entangled with the system, so that we have a state as before: sum_j alpha_j |j>|psi_j>. I've used |psi_j> here to indicate that this can be a completely separate basis. The states |psi_j> can easily be superpositions of classical states of the measurement system. However, this can always be rewritten as sum_j beta_j |phi_j>|j>, for some orthogonal states |phi_j> and amplitudes beta_j. Obviously the density matrices are identical, since it is just a rewriting of the state. Then when we take the trace over the measurement system we get the probability distribution sum_j beta_j* beta_j |phi_j><phi_j|, with each state in the sum corresponding to a classical state of our brain/the detector.


    Hmm... I can see how we express the classical states of our brain, but are these classical states predicted by the formalism? i.e. If a physicist had no prior knowledge or expectations regarding classical observations, would they be able to predict classical observations with Everett's formalism?


  • Professor_Fink


    Morbert wrote: »
    Hmm... I can see how we express the classical states of our brain, but are these classical states predicted by the formalism? i.e. If a physicist had no prior knowledge or expectations regarding classical observations, would they be able to predict classical observations with Everett's formalism?

    Yes, as long as you have a decent model for what the detector does, then it is easy to calculate both the mixed state and the correct basis to interpret it in. The basis comes from the entangling operator applied.


  • Morbert


    Professor_Fink wrote: »
    Yes, as long as you have a decent model for what the detector does, then it is easy to calculate both the mixed state and the correct basis to interpret it in. The basis comes from the entangling operator applied.

    Sorry for the late reply. Do you have any book references or links to papers you'd recommend? After searching through the library, I can only find loosely related calculations invoking F.A.P.P. ('for all practical purposes') treatments of decoherence.

    I had always brushed MWI aside as little more than an alternative formalism. But this thread has certainly rekindled my interest.


  • Professor_Fink


    Well, it's largely how decoherence is actually modeled: An ancilla system is attached, an interaction Hamiltonian applied and then the partial trace is taken. That tends to be how master equations are derived. "Quantum computation and quantum information" by Nielsen and Chuang has a fairly decent, if somewhat short introduction to open quantum systems. Alternatively, if you want something a little more meaty, perhaps try "The theory of open quantum systems" by Heinz-Peter Breuer.

    EDIT: Or of course Hugh Everett's original PhD thesis.
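
    In case a toy version helps, here is the bare-bones form of that recipe (one system qubit, one ancilla qubit standing in for the environment, and an arbitrarily chosen entangling rotation; this is only an illustrative sketch, not any particular master equation derivation):

        import numpy as np

        theta = 2.0                                  # arbitrary coupling "strength"
        plus = np.array([1.0, 1.0]) / np.sqrt(2.0)   # system starts in |+>
        env = np.array([1.0, 0.0])                   # ancilla environment in |0>

        # Interaction: rotate the ancilla only when the system is in |1>.
        Ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                       [np.sin(theta / 2),  np.cos(theta / 2)]])
        U = np.block([[np.eye(2), np.zeros((2, 2))],
                      [np.zeros((2, 2)), Ry]])

        psi = U @ np.kron(plus, env)                 # entangle system and ancilla
        rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
        rho_system = np.trace(rho, axis1=1, axis2=3) # partial trace over the ancilla

        # Populations stay at 0.5, but the off-diagonal (coherence) terms are
        # suppressed by cos(theta/2) relative to the initial |+><+|.
        print(np.round(rho_system, 3))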


  • Morbert


    'Tis me, back again.

    I didn't have much luck unearthing any theoretical motivation behind using a classical basis. In fact, I came across a few books that claimed the "preferred basis" question was still an issue. It seems that, like the state reduction process, our choice of basis is motivated by observations and experimental results, and not by a theoretical understanding of how dynamical variables behave between measurements. That's not a bad thing per se, but I would say it leaves the physical theory incomplete. The operational approach just doesn't seem to tell us much about the physical reality of QM.


  • Professor_Fink


    Morbert wrote: »
    The operational approach just doesn't seem to tell us much about the physical reality of QM.

    Some philosophers argue that only the operators really exist, and that the wave function doesn't exist per se. I think it makes little difference. The reason we get the classical basis tends to be that, when local energy eigenstates exist, transitioning between them requires some fixed amount of energy transfer and is therefore a slower process than dephasing, which occurs when the gap between energy levels is shifted slightly by some perturbation to the system Hamiltonian (usually a change in the magnetic field due to the presence of a spin-active particle).


  • JoeB-


    Hmmm.. it's all very technical...

    Would someone talk about Penrose's Objective Reduction theory please... it would seem to solve the measurement problem nicely. It's basically that two states in superposition will develop a difference in stress and, as the states evolve, the stresses will increase; at a certain threshold the state will collapse by itself, an objective reduction. This sounds very credible to me, is it not?


    I am fairly vague about this, but could it not be that we don't perceive superpositions of states because for us to perceive any event requires a 'movement' of so many atoms, molecules etc. in our brains that Penrose's objective reduction threshold is reached, and the superimposed state will always have collapsed into one actual reality before consciousness can perceive it?


  • Professor_Fink


    JoeB- wrote: »
    Would someone talk about Penrose's Objective Reduction theory please... it would seem to solve the measurement problem nicely. It's basically that two states in superposition will develop a difference in stress and, as the states evolve, the stresses will increase; at a certain threshold the state will collapse by itself, an objective reduction. This sounds very credible to me, is it not?

    Sounding credible doesn't make up for a lack of evidence. There is currently no reason to believe that any modification to quantum mechanics is required. The measurement problem is largely a problem for philosophers rather than physicists.


  • JoeB-


    Ok... but is it (Objective Reduction) not a testable proposal?

    Some experiments can be devised where photons in superposition are sent on long journeys... this may work differently if Penrose is correct, in that the superposition may always be lost in such an experiment due to objective reductions taking place.


    Penrose also mentions the situation where a single quantum event or possibility may be able to enter our consciousness... in that our eye may have cells that can detect a single photon, thus opening up the possibility of having a superimposed state of 'some human seeing something' and 'not seeing something'... this is nothing really to do with the current thread, I just find it very interesting.


  • Morbert


    Professor_Fink wrote: »
    Sounding credible doesn't make up for a lack of evidence. There is currently no reason to believe that any modification to quantum mechanics is required. The measurement problem is largely a problem for philosophers rather than physicists.

    But, as I'm sure you already know, physicists are having a difficult time unifying Quantum Mechanics with General Relativity. A new formalism might be the key. That's the motivation.

    It would be naive to say the current formalism should be abandoned, as it is clearly very powerful. But I believe at least some funding should be allocated to any physicist/mathematician who wants to try and build QM from scratch again.


  • Professor_Fink


    Morbert wrote: »
    It would be naive to say the current formalism should be abandoned, as it is clearly very powerful. But I believe at least some funding should be allocated to any physicist/mathematician who wants to try and build QM from scratch again.

    The problem seems to be with general relativity rather than quantum mechanics, hence all the quantum theories of gravity. That said, there have been a lot of alternative approaches suggested, and funding has to come from somewhere. However, I doubt Penrose has trouble finding funding.

    But this shouldn't be an argument about what should be funded, but rather about what the correct model of the universe is, and so far I see no evidence to bolster his case.


  • Morbert


    Oops, missed this...
    Professor_Fink wrote: »
    Some philosophers argue that only the operators really exist, and that the wave function doesn't exist per se. I think it makes little difference. The reason we get the classical basis tends to be that, when local energy eigenstates exist, transitioning between them requires some fixed amount of energy transfer and is therefore a slower process than dephasing, which occurs when the gap between energy levels is shifted slightly by some perturbation to the system Hamiltonian (usually a change in the magnetic field due to the presence of a spin-active particle).

    I'm not sure I follow... I know that decoherence is used to explain the loss of quantum mechanical coherence, but how do you ensure orthogonality?

    "The Basis Problem in Many-Worlds Theories", Henry P. Stapp, Lawrence Berkeley National Laboratory, University of California Berkeley, California 94720

    "Specifically, the effect of environmental decoherence is to reduce a typical state to a mixture of states...

    ...This decoherence mechanism eliminates certain interference effects, but it does not solve the basis problem. There will be a continuum of these Gaussians, (3.1), and they overlap: i.e., they are not orthogonal...

    A natural candidate for the needed denumerable set of orthogonal subspaces associated with the continuum of states (3.1) is the set of subspaces defined by the different eigenvalues of the reduced density matrix associated with that mixture of states. That candidate is, in fact, the one chosen by Deutsch in his 1985 paper. However, that choice is not satisfactory. One wants states that are highly localized, like a classical state. But the eigenstates of the reduced density matrix associated with the states (3.1), i.e. the eigenstates of a density matrix of the form... are spread out over large distances, rather than being compressed into small regions, like the states (3.1) themselves are. Taking the eigenstates of the reduced density matrix goes in the wrong direction."


    I am not terribly familiar with the word "dephasing", so perhaps that's where the confusion lies.
    Professor_Fink wrote: »
    The problem seems to be with general relativity rather than quantum mechanics, hence all the quantum theories of gravity. That said, there have been a lot of alternative approaches suggested, and funding has to come from somewhere. However, I doubt Penrose has trouble finding funding.

    But this shouldn't be an argument about what should be funded, but rather about what the correct model of the universe is, and so far I see no evidence to bolster his case.

    I don't understand why the majority of physicists believe we should start with QM, and extend the theory to the realm of General Relativity, instead of the other way around.

    And funding is finite (especially for theoretical matters) but is it being allocated fairly? There are far too many string theorists in theoretical physics departments in my humble opinion.

    And it's not that current Quantum Mechanics is 'incorrect'. Instead, its current formalism may be hiding some things, in the same way that Newtonian mechanics is not incorrect, but alternative formalisms, such as Hamiltonian mechanics, can often give us a much clearer picture.


  • Decerto


    Just as a matter of interest, what are your levels of study in physics? I'm only in first year TP in Trinity, but I'm curious as to when I will be encountering this stuff.


  • Professor_Fink


    Decerto wrote: »
    Just as a matter of interest, what are your levels of study in physics? I'm only in first year TP in Trinity, but I'm curious as to when I will be encountering this stuff.

    I've a DPhil, and now have DPhil students of my own. You may never directly come across the measurement problem during your undergraduate course, but if you do, it will probably be in about third year.


  • Morbert


    You will encounter Quantum mechanics seriously in third year, though you won't really be exposed to density matrices unless you take some of the electronic structure modules in 4th year (I don't think Trinity offers Quantum computation anymore). The measurement problem never really comes up either way.

    I have an undergrad degree in physics (half theoretical, half experimental). I'm currently in Trinity completing a masters in computational physics.


  • Professor_Fink


    Sorry for the delay in replying. I suspect this is going to be a long post, so I've been putting it off.

    Morbert wrote: »
    I'm not sure I follow... I know that decoherence is used to explain the loss of quantum mechanical coherence, but how do you ensure orthogonality?

    ...

    I am not terribly familiar with the word "dephasing" so perhaps there's where the confusion lies.

    Dephasing is a specific kind of decoherence, which does have a preferred basis. In particular, it generally means the loss of phase coherence between classical states. Take an electron spin as an example. The electron sits in a magnetic field which induces an energy gap G between its two energy eigenstates |spin-up> and |spin-down>. These are the two classical states that we tend to end up with after a long time exposed to a noisy environment.

    Any state |psi> can be written as alpha*|spin-up> + beta*|spin-down>. As these are energy eigenstates, the time evolution is given by alpha*|spin-up> + exp(-i*G*t)*beta*|spin-down>. Because of the energy gap, spin flip errors are unlikely to occur, since the rate at which they occur must give rise to a Boltzmann distribution, which makes them exponentially unlikely in the size of the energy gap (as they require an energy transfer with the environment of energy G). Dephasing is different. Dephasing comes about when there is a small change in the system Hamiltonian (i.e. a small perturbation to the local magnetic field). The evolution of the system then becomes alpha*|spin-up> + exp(-i*[G+dG]*t)*beta*|spin-down>, where dG is the perturbation to the expected Hamiltonian. If we compare this with the expected state |psi(t)> we get a density matrix cos(dG*t)^2|psi(t)><psi(t)| + sin(dG*t)^2|error(t)><error(t)|, where <error(t)|psi(t)> = 0. Since there is no energy gap associated with this error mechanism, it acts on a much faster time scale, and is usually the dominant form of decoherence in physical systems.

    As I say, decoherence generally has a preferred basis due to the physics of the situation. My understanding is that basis choice is no longer considered a real problem with the many worlds interpretation, due to a better understanding of the physics behind decoherence mechanisms (as above).
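
    If a toy numerical version helps: give the gap a random static perturbation dG in each run (all numbers made up) and average over runs, and the coherence between |spin-up> and |spin-down> decays while the populations are untouched.

        import numpy as np

        rng = np.random.default_rng(1)

        n_runs = 5000
        dG = rng.normal(0.0, 0.2, n_runs)      # random static gap perturbations
        times = np.linspace(0.0, 30.0, 200)

        # Start in (|up> + |down>)/sqrt(2); in the frame rotating at the known gap G,
        # the off-diagonal element of the run-averaged density matrix is
        # 0.5 * <exp(-i*dG*t)>, which decays towards zero as the runs dephase.
        coherence = np.array([np.abs(0.5 * np.mean(np.exp(-1j * dG * t))) for t in times])
        print("|rho_01| at t=0 :", coherence[0])     # 0.5, fully coherent
        print("|rho_01| at t=30:", coherence[-1])    # ~0, an incoherent mixture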


    Morbert wrote: »
    I don't understand why the majority of physicists believe we should start with QM, and extend the theory to the realm of General Relativity, instead of the other way around.

    Well, quantum mechanics has certainly been better tested than general relativity, but that is largely beside the point. There are many many reasons, but I'll give you just one to think about. One major success in understanding how the universe operates at a fundamental level is the demonstration of violations of Bell's inequalities. These prove that at a fundamental level the universe supports stronger non-local correlations than classical mechanics (including relativity) allows for. Quantum mechanics fits the bill exactly, and the success in quantizing various field theories has been phenomenal. Unfortunately gravity doesn't quantize nicely, so we know there is a problem somewhere.
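
    As a concrete illustration of that point, the standard CHSH form of Bell's inequality bounds any local classical theory by 2, while the singlet-state correlation E(a, b) = -cos(a - b) with the usual measurement angles gives 2*sqrt(2):

        import numpy as np

        def E(a, b):
            # Quantum correlation for the singlet state measured along angles a and b.
            return -np.cos(a - b)

        a, a2 = 0.0, np.pi / 2
        b, b2 = np.pi / 4, 3 * np.pi / 4

        S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
        print(abs(S), "vs classical bound 2, quantum bound", 2 * np.sqrt(2))
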
    Morbert wrote: »
    And funding is finite (especially for theoretical matters) but is it being allocated fairly? There are far too many string theorists in theoretical physics departments in my humble opinion.

    Well, string theory is the dominant and most progressed of the theories that go beyond quantum mechanics. I think the funding reflects that. I think it is important that, until we can test these theories, we explore many avenues, but the reality of science funding is that the most developed theories will get the largest slice of the funding.
    Morbert wrote: »
    And it's not that current Quantum Mechanics is 'incorrect'. Instead, its current formalism may be hiding some things, in the same way that Newtonian mechanics is not incorrect, but alternative formalisms, such as Hamiltonian mechanics, can often give us a much clearer picture.

    Well, Hamiltonian and Lagrangian mechanics are completely equivalent to Newtonian mechanics. That's not the case with these theories you are referring to. The Schrödinger and Heisenberg pictures are an analogue for quantum mechanics, but Penrose's stuff doesn't fall into that category.


  • bogwalrus


    Would love to study QM. Though I don't know much of the technical jargon you guys are on about, I'm creating one hell of an interesting picture of it in my head :)

    Maybe next year I'll apply as a mature student.

    Great reading anyway guys. Keep it up.

