DUALIDAD ONDA PARTICULA PDF

Abstract: We present our contribution toward making wave-particle duality an intuitive phenomenon through a historical analysis that shows its main characteristics. (Artwork: Vanesa Muñoz, “Dualidad Onda Partícula II”.)

Author: Vudolkree Totaur
Country: Bermuda
Language: English (Spanish)
Genre: Literature
Published (Last): 27 December 2009
Pages: 104
PDF File Size: 4.11 Mb
ePub File Size: 4.76 Mb
ISBN: 710-2-41636-510-6
Downloads: 34134
Price: Free* [*Free Registration Required]
Uploader: Tuzil

Wave—particle duality is the concept in quantum mechanics that every particle or quantum entity may be partly described in terms not only of particles, but also of waves. It expresses the inability of the classical concepts “particle” or “wave” to fully describe the behavior of quantum-scale objects.

As Albert Einstein wrote: “It seems as though we must use sometimes the one theory and sometimes the other, while at times we may use either. We are faced with a new kind of difficulty. We have two contradictory pictures of reality; separately neither of them fully explains the phenomena of light, but together they do.”

Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr, and many others, current scientific theory holds that all particles exhibit a wave nature and vice versa. For macroscopic particles, because of their extremely short wavelengths, wave properties usually cannot be detected. Although the use of wave-particle duality has worked well in physics, the meaning or interpretation has not been satisfactorily resolved; see Interpretations of quantum mechanics.

Bohr regarded the “duality paradox” as a fundamental or metaphysical fact of nature. A given kind of quantum object will exhibit sometimes wave, sometimes particle, character, in respectively different physical settings.

He saw such duality as one aspect of the concept of complementarity. Werner Heisenberg considered the question further. He saw the duality as present for all quantic entities, but not quite in the usual quantum mechanical account considered by Bohr.

He saw it in what is called second quantization, which generates an entirely new concept of fields that exist in ordinary space-time, causality still being visualizable. Classical field values (e.g., the electric and magnetic field strengths of Maxwell) are replaced by an entirely new kind of field value, as considered in quantum field theory. Turning the reasoning around, ordinary quantum mechanics can be deduced as a specialized consequence of quantum field theory.

Democritus, the original atomist, argued that all things in the universe, including light, are composed of indivisible sub-components (light being some form of solar atom). In the 11th century, Ibn al-Haytham (Alhazen) described vision in terms of rays of light traveling from the point of emission to the eye, and asserted that these rays were composed of particles of light. Beginning in the 1670s and progressing over three decades, Isaac Newton developed and championed his corpuscular theory, arguing that the perfectly straight lines of reflection demonstrated light’s particle nature; only particles could travel in such straight lines.

He explained refraction by positing that particles of light accelerated laterally upon entering a denser medium. Around the same time, Newton’s contemporaries Robert Hooke and Christiaan Huygens, and later Augustin-Jean Fresnel, mathematically refined the wave viewpoint, showing that if light traveled at different speeds in different media (such as water and air), refraction could be easily explained as the medium-dependent propagation of light waves.
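The quantitative content of that wave picture can be stated briefly; the following is standard optics rather than a formula given in the text above. If light travels with speeds \(v_1\) and \(v_2\) in the two media, Huygens’ wavefront construction gives

\[
\frac{\sin\theta_1}{\sin\theta_2} = \frac{v_1}{v_2},
\]

which is Snell’s law of refraction, with the bending of the ray determined entirely by the ratio of the wave speeds in the two media, exactly the medium-dependent propagation described above.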

The resulting Huygens–Fresnel principle was extremely successful at reproducing light’s behavior and was subsequently supported by Thomas Young’s discovery of interference in his double-slit experiment.

James Clerk Maxwell discovered that he could apply his previously discovered Maxwell’s equationsalong with a slight modification to describe self-propagating waves of oscillating electric and magnetic fields. It quickly became apparent that visible light, ultraviolet light, and infrared light were all electromagnetic waves of differing frequency.
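For reference, and as standard electromagnetism rather than a formula from this article, Maxwell’s equations in vacuum combine into a wave equation for the electric field,

\[
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0\, \frac{\partial^2 \mathbf{E}}{\partial t^2},
\]

whose propagation speed \(1/\sqrt{\mu_0 \varepsilon_0}\) coincides with the measured speed of light; that coincidence is what identified light as an electromagnetic wave.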

The wave theory had prevailed, or at least it seemed to.

While the 19th century had seen the success of the wave theory at describing light, it had also witnessed the rise of the atomic theory at describing matter. At the close of the 19th century, the reductionism of atomic theory began to advance into the atom itself, determining, through physics, the nature of the atom and the operation of chemical reactions. Electricity, first thought to be a fluid, was now understood to consist of particles called electrons.

This was first demonstrated by J. J. Thomson in 1897 when, using a cathode ray tube, he found that an electrical charge would travel across a vacuum. Since the vacuum offered no medium for an electric fluid to travel through, this discovery could only be explained via a particle carrying a negative charge and moving through the vacuum. This electron flew in the face of classical electrodynamics, which had successfully treated electricity as a fluid for many years, leading to the invention of batteries, electric motors, dynamos, and arc lamps.

More importantly, the intimate relation between electric charge and electromagnetism had been well documented following the discoveries of Michael Faraday and James Clerk Maxwell. Furthermore, classical electrodynamics was not the only classical theory rendered incomplete.

In 1900, Max Planck published an analysis that succeeded in reproducing the observed spectrum of light emitted by a glowing object. To accomplish this, Planck had to make an ad hoc mathematical assumption of quantized energy of the oscillators (atoms of the black body) that emit radiation.

Einstein later proposed that electromagnetic radiation itself is quantized, not the energy of radiating atoms. Black-body radiation, the emission of electromagnetic energy due to an object’s heat, could not be explained from classical arguments alone. The equipartition theorem of classical mechanics, the basis of all classical thermodynamic theories, stated that an object’s energy is partitioned equally among the object’s vibrational modes.

But applying the same reasoning to the electromagnetic emission of such a thermal object was not so successful. That thermal objects emit light had long been known. Since light was known to be waves of electromagnetism, physicists hoped to describe this emission via classical laws. This became known as the black-body problem. Since the equipartition theorem worked so well in describing the vibrational modes of the thermal object itself, it was natural to assume that it would perform equally well in describing the radiative emission of such objects.

But a problem quickly arose: if each mode received an equal partition of energy, the short-wavelength modes would consume all the energy. This became clear when plotting the Rayleigh–Jeans law, which, while correctly predicting the intensity of long-wavelength emissions, predicted infinite total energy, since the intensity diverges to infinity for short wavelengths. This became known as the ultraviolet catastrophe.
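For concreteness, the classical result behind this divergence can be written down; the formula below is standard physics, not something quoted from the text above. Assigning the equipartition energy \(k_B T\) to every electromagnetic mode leads to the Rayleigh–Jeans spectral radiance

\[
B_\nu(T) = \frac{2\nu^2}{c^2}\, k_B T,
\]

which grows without bound as the frequency \(\nu\) increases, so the integral over all frequencies diverges. That divergence is the ultraviolet catastrophe.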

Planck’s solution was to propose that the energy of each oscillator comes only in discrete multiples of hν, where ν is the oscillator’s frequency and h is a new constant of nature, Planck’s constant. This was not an unsound proposal, considering that macroscopic oscillators operate similarly: when studying five simple harmonic oscillators of equal amplitude but different frequency, the oscillator with the highest frequency possesses the highest energy (though this relationship is not linear like Planck’s).

By demanding that high-frequency light must be emitted by an oscillator of equal frequency, and further requiring that this oscillator occupy higher energy than one of a lesser frequency, Planck avoided any catastrophe; giving an equal partition to high-frequency oscillators produced successively fewer oscillators and less emitted light. And, as in the Maxwell–Boltzmann distribution, the low-frequency, low-energy oscillators were suppressed by the onslaught of thermal jiggling from higher-energy oscillators, which necessarily increased their energy and frequency.
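A minimal numerical sketch, not taken from the original article, makes the comparison concrete: at low frequencies Planck’s formula agrees with the classical Rayleigh–Jeans result, while at high frequencies the exponential factor in the denominator suppresses the classical divergence. The helper names, the temperature, and the sample frequencies below are illustrative choices.

    import math

    # Physical constants in SI units
    h = 6.626e-34      # Planck's constant, J*s
    c = 2.998e8        # speed of light, m/s
    k_B = 1.381e-23    # Boltzmann constant, J/K

    def rayleigh_jeans(nu, T):
        """Classical spectral radiance: every mode carries energy k_B*T."""
        return 2.0 * nu**2 * k_B * T / c**2

    def planck(nu, T):
        """Planck's spectral radiance: quantized oscillators suppress high frequencies."""
        return (2.0 * h * nu**3 / c**2) / math.expm1(h * nu / (k_B * T))

    T = 5000.0  # temperature in kelvin, chosen only for illustration
    for nu in (1e12, 1e13, 1e14, 1e15, 1e16):  # Hz, from far infrared to ultraviolet
        print(f"nu = {nu:.0e} Hz  Rayleigh-Jeans = {rayleigh_jeans(nu, T):.3e}  Planck = {planck(nu, T):.3e}")

Running the loop shows the two expressions agreeing at the low-frequency end and then parting company: the classical curve keeps rising into the ultraviolet while Planck’s expression falls off, which is the suppression of high-frequency oscillators described in the paragraph above.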

The most revolutionary aspect of Planck’s treatment of the black body is that it inherently relies on an integer number of oscillators in thermal equilibrium with the electromagnetic field. These oscillators give their entire energy to the electromagnetic field, creating a quantum of light, as often as they are excited by the electromagnetic field, absorbing a quantum of light and beginning to oscillate at the corresponding frequency.

However, once he realized that he had quantized the electromagnetic field, Planck denounced particles of light as a limitation of his approximation, not a property of reality. While Planck had solved the ultraviolet catastrophe by using atoms and a quantized electromagnetic field, most contemporary physicists agreed that Planck’s “light quanta” represented only flaws in his model. A more complete derivation of black-body radiation would yield a fully continuous and “wave-like” electromagnetic field with no quantization.

However, in 1905, Albert Einstein took Planck’s black-body model to produce his solution to another outstanding problem of the day: the photoelectric effect. Since the existence of electrons had been theorized eight years previously, phenomena had been studied with the electron model in mind in physics laboratories worldwide. In 1902, Philipp Lenard discovered that the energy of these ejected electrons did not depend on the intensity of the incoming light, but instead on its frequency. So if one shines a little low-frequency light upon a metal, a few low-energy electrons are ejected.

If one now shines a very intense beam of low-frequency light upon the same metal, a whole slew of electrons are ejected; however, they possess the same low energy, there are merely more of them.

The more light there is, the more electrons are ejected. In order to get high-energy electrons, however, one must illuminate the metal with high-frequency light.

Like blackbody radiation, this was at odds with a theory invoking continuous transfer of energy between radiation and matter. However, it can still be explained using a fully classical description of light, as long as matter is quantum mechanical in nature.

Low-frequency light only ejects low-energy electrons because each electron is excited by the absorption of a single photon. Increasing the intensity of the low-frequency light (increasing the number of photons) only increases the number of excited electrons, not their energy, because the energy of each photon remains low. Only by increasing the frequency of the light, and thus increasing the energy of the photons, can one eject electrons with higher energy. Thus, using Planck’s constant h to determine the energy of the photons based upon their frequency, the energy of ejected electrons should also increase linearly with frequency, the gradient of the line being Planck’s constant.
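The linear relation just described is usually written as Einstein’s photoelectric equation; the work-function symbol \(\varphi\) below is standard notation, not something defined in this article:

\[
K_{\max} = h f - \varphi,
\]

so a plot of the maximum kinetic energy \(K_{\max}\) of the ejected electrons against the light frequency \(f\) is a straight line whose gradient is Planck’s constant \(h\) and whose intercept gives the metal’s work function \(\varphi\).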

These results were not confirmed until 1916, when Robert Andrews Millikan, who had previously determined the charge of the electron, produced experimental results in perfect accord with Einstein’s predictions. In fact, Millikan had experimented for a full ten years with the aim of disproving Einstein’s photoelectric equation.

However, the value of Planck’s constant determined from his experiments came out to be very close to the original value.

Thus, he established the validity of Einstein’s photoelectric equation instead of disproving it. While the energy of the ejected electrons reflected Planck’s constant, the existence of photons was not explicitly proven until the discovery of the photon antibunching effect, a modern demonstration of which can be performed in undergraduate-level labs.

When Einstein received his Nobel Prize in 1921, it was not for his more difficult and mathematically laborious special and general relativity, but for the simple, yet totally revolutionary, suggestion of quantized light. Einstein’s “light quanta” would not be called photons until the 1920s, but even in 1905 they represented the quintessential example of wave-particle duality. Electromagnetic radiation propagates following linear wave equations, but can only be emitted or absorbed as discrete elements, thus acting as a wave and a particle simultaneously.

In 1905, Albert Einstein provided an explanation of the photoelectric effect, a hitherto troubling experiment that the wave theory of light seemed incapable of explaining. He did so by postulating the existence of photons, quanta of light energy with particulate qualities.

In the photoelectric effect, it was observed that shining a light on certain metals would lead to an electric current in a circuit.

Presumably, the light was knocking electrons out of the metal, causing current to flow. However, using the case of potassium as an example, it was also observed that while a dim blue light was enough to cause a current, even the strongest, brightest red light available with the technology of the time caused no current at all.

According to the classical theory of light and matter, the strength or amplitude of a light wave was in proportion to its brightness: a bright light should have been easily strong enough to create a large current. Yet, oddly, this was not so.

Einstein explained this enigma by postulating that the electrons can receive energy from an electromagnetic field only in discrete portions (quanta) that were called photons, each carrying an amount of energy E related to the frequency f of the light by E = hf, where h is Planck’s constant. Only photons of a high enough frequency (above a certain threshold value) could knock an electron free.

For example, photons of blue light had sufficient energy to free an electron from the metal, but photons of red light did not. One photon of light above the threshold frequency could release only one electron; the higher the frequency of a photon, the higher the kinetic energy of the emitted electron, but no amount of light below the threshold frequency (using technology available at the time) could release an electron.
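As a rough numerical illustration of the threshold argument, not part of the original text, one can compare photon energies for blue and red light with an approximate work function for potassium; the wavelengths and the 2.3 eV figure below are assumed, illustrative values.

    # Photon energy E = h*c/wavelength compared against an assumed work function.
    h = 6.626e-34           # Planck's constant, J*s
    c = 2.998e8             # speed of light, m/s
    eV = 1.602e-19          # joules per electronvolt
    work_function_eV = 2.3  # approximate work function of potassium (assumed value)

    for color, wavelength_nm in (("blue", 450.0), ("red", 700.0)):
        photon_energy_eV = h * c / (wavelength_nm * 1e-9) / eV
        ejects = photon_energy_eV > work_function_eV
        print(f"{color}: ~{photon_energy_eV:.2f} eV per photon -> ejects an electron: {ejects}")

A blue photon comes out above the assumed threshold while a red one falls below it, no matter how many red photons arrive, which matches the potassium observation quoted earlier and the one-photon-per-electron rule described above.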

To “violate” this law would require extremely high-intensity lasers, which had not yet been invented. Intensity-dependent phenomena have now been studied in detail with such lasers. Einstein was awarded the Nobel Prize in Physics in 1921 for his discovery of the law of the photoelectric effect. In 1924, Louis de Broglie formulated the de Broglie hypothesis, claiming that all matter has a wave-like nature, and related the wavelength λ of a particle to its momentum p by λ = h/p. De Broglie’s formula was confirmed three years later for electrons (which differ from photons in having a rest mass) with the observation of electron diffraction in two independent experiments.

At Bell Labs, Clinton Davisson and Lester Germer guided a beam of electrons through a crystalline grid and observed the predicted diffraction; at the University of Aberdeen, George Paget Thomson passed a beam of electrons through a thin metal film and observed the predicted interference patterns. De Broglie was awarded the Nobel Prize for Physics in 1929 for his hypothesis. Thomson and Davisson shared the Nobel Prize for Physics in 1937 for their experimental work.
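A short sketch, with assumed and purely illustrative masses and speeds, shows why these confirmations used electrons and crystalline or thin-film targets: the de Broglie wavelength λ = h/p of a slow electron is comparable to atomic spacings, while that of any macroscopic object is immeasurably small.

    # de Broglie wavelength: lambda = h / (m * v), non-relativistic
    h = 6.626e-34  # Planck's constant, J*s

    # An electron moving at about one percent of the speed of light (assumed)
    m_electron = 9.109e-31  # kg
    v_electron = 3.0e6      # m/s
    print("electron:", h / (m_electron * v_electron), "m")  # ~2.4e-10 m, atomic scale

    # A thrown baseball (assumed mass and speed)
    m_ball = 0.145          # kg
    v_ball = 40.0           # m/s
    print("baseball:", h / (m_ball * v_ball), "m")          # ~1e-34 m, unmeasurably small

Wavelengths of order 10^-10 m are the scale of crystal lattice spacings, which is why diffraction from a crystal or a thin metal film could reveal the predicted interference, and why the wave character of everyday objects is never observed, as noted near the start of this article.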

In his work on formulating quantum mechanics, Werner Heisenberg postulated his uncertainty principle, which states that position and momentum cannot both be known to arbitrary precision: \(\Delta x \,\Delta p \ge \hbar/2\), where \(\Delta\) denotes the standard deviation of the measured quantity and \(\hbar\) is the reduced Planck constant. Heisenberg originally explained this as a consequence of the process of measuring: measuring position accurately would disturb momentum and vice versa, offering an example (the “gamma-ray microscope”) that depended crucially on the de Broglie hypothesis.

It is now thought, however, that this only partly explains the phenomenon: the uncertainty also exists in the particle itself, even before the measurement is made.
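A back-of-the-envelope example, with an assumed confinement length, shows the scale at which this matters: squeezing an electron into roughly atomic dimensions already forces a large spread in its momentum.

    # Minimum momentum uncertainty from delta_x * delta_p >= hbar / 2
    hbar = 1.055e-34        # reduced Planck constant, J*s
    m_electron = 9.109e-31  # electron mass, kg

    delta_x = 1.0e-10                 # confinement to ~0.1 nm, about an atomic radius (assumed)
    delta_p = hbar / (2.0 * delta_x)  # minimum momentum uncertainty, kg*m/s
    delta_v = delta_p / m_electron    # corresponding spread in velocity, m/s
    print(f"delta_p >= {delta_p:.2e} kg*m/s, i.e. a velocity spread of about {delta_v:.2e} m/s")

A velocity spread of several hundred kilometres per second for an atom-sized confinement is enormous, which is why the effect is unavoidable for quantum-scale objects yet invisible for macroscopic ones.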