
The Origins of Quantum Mechanics: Escaping Classical Determinacy

Updated: Feb 6

Author: Parsa Lajmiri ‘26

Editor: Jasper Lincoln ‘25

In the early 20th century, the departure from classical mechanics to quantum mechanics destabilized nearly all of physicists' long-held beliefs about the nature of the physical world. Classical mechanics describes the interactions and motions of objects in a way we can intuit and observe in the macroscopic world; quantum mechanics, by contrast, describes the microscopic world in ways that seem counterintuitive. It demands acceptance of mathematical postulates that we cannot yet fully explain, which may sound frustrating, but together the fundamental laws of quantum mechanics translate into a beautiful, harmonious understanding of the universe.

The transition to quantum mechanics began rather humbly, with the investigation of the blackbody, an idealized object that absorbs and re-radiates energy at all frequencies. Blackbody radiation is a phenomenon we can easily observe: it endows objects with the ability to glow. If you heat a piece of metal, the energy it radiates increases with temperature; eventually the glow moves into visible light, appearing red, then orange, yellow, white, and finally bluish white at the highest temperatures. Physicists measured the blackbody spectrum, the intensity of radiation at each frequency, as a function of temperature, but an explanation for its shape was still to be determined.
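The color shift described above can be quantified with Wien's displacement law, a standard blackbody result not derived in this article: the wavelength at which a blackbody emits most strongly is inversely proportional to its temperature. A minimal sketch:

```python
# Wien's displacement law: lambda_peak = b / T, where b is Wien's
# displacement constant. This is a standard result, used here only to
# illustrate the red-to-blue color shift of a heated object.
WIEN_B = 2.897771955e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temperature_k):
    """Wavelength (in nm) at which a blackbody at temperature_k radiates most strongly."""
    return WIEN_B / temperature_k * 1e9

# A red-hot piece of metal (~1500 K) versus progressively hotter bodies:
for T in (1500, 3000, 6000, 10000):
    print(f"{T:>6} K -> peak at {peak_wavelength_nm(T):7.1f} nm")
```

At 1500 K the peak sits deep in the infrared (the visible glow is only the short-wavelength tail), while around 6000 K, roughly the Sun's surface temperature, the peak falls near the middle of the visible band.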

With this goal in mind, Lord Rayleigh and James Jeans applied the equipartition theorem to the blackbody spectrum. The equipartition theorem states that a system's thermal energy is shared evenly among its variables of motion (degrees of freedom), each contributing E = 1/2 nkT, where T is the temperature, k is Boltzmann's constant, and n is the number of particles in the system. Consider two particles in a vacuum that can each move up or down, left or right, and back or forth, corresponding to three variables of motion. The total energy for these two particles would be E = 1/2(2)kT + 1/2(2)kT + 1/2(2)kT = 3kT. The more variables of motion a system has, the more energy it can absorb from its surroundings. When Rayleigh and Jeans counted the variables of motion inside a heated object, their theory predicted that the intensity of the blackbody radiation increased in proportion to the blackbody's temperature and to the square of the radiation's frequency.
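The two-particle example above can be checked numerically. This sketch just evaluates the equipartition formula, with Boltzmann's constant taken at its standard value:

```python
K_B = 1.380649e-23  # Boltzmann's constant, J/K

def equipartition_energy(n_particles, dof, temperature_k):
    """Total thermal energy: (1/2) k T per particle per degree of freedom."""
    return 0.5 * n_particles * dof * K_B * temperature_k

# Two free particles, each with three translational degrees of freedom,
# carry 3 kT in total -- the worked example from the text.
T = 300.0
E = equipartition_energy(2, 3, T)
print(round(E / (K_B * T), 6))  # -> 3.0
```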

However, this theory had a critical inaccuracy: it presupposed that energy was continuous. Since there was also no limit on how high the frequency of radiation could get, it predicted that the blackbody emitted an infinite amount of total energy. This prediction did not accord with how objects actually behave. Consider an oven heating up; it is very close to an ideal blackbody because its walls are designed to absorb almost all incoming light, converting it into heat. Through the classical lens, the energy emitted by this oven would approach infinity, meaning that opening your oven after cooking a pizza would lead to your demise. Such a scenario, preposterous and evidently in disagreement with reality, became known as the "ultraviolet catastrophe."

In 1900, Max Planck proposed a quantization hypothesis: energy was not continuous but came in discrete bundles called quanta, so that a vibrating particle could only exchange energy in multiples of a smallest possible unit. While not yet understanding the physical meaning of this assumption, he discovered, while deriving the blackbody spectrum, a direct proportionality between energy and frequency, represented by the formula E = hf, where h is Planck's constant and f is the frequency of the radiation. While Planck had mathematically resolved the ultraviolet catastrophe, Einstein would uncover the physical significance of Planck's contribution later in the 20th century (2).
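Planck's fix can be made concrete with a quick numerical comparison. The snippet below evaluates the standard Rayleigh-Jeans and Planck radiance laws (neither spectral formula is written out in this article; both are textbook results consistent with the proportionalities described above) to show the classical prediction blowing up at high frequency while Planck's stays finite:

```python
import math

H = 6.62607015e-34   # Planck's constant, J*s
K_B = 1.380649e-23   # Boltzmann's constant, J/K
C = 2.99792458e8     # speed of light, m/s

def rayleigh_jeans(f, T):
    """Classical spectral radiance: proportional to T and f^2, so it grows without bound."""
    return 2.0 * f**2 * K_B * T / C**2

def planck(f, T):
    """Planck's quantized spectral radiance: finite at every frequency."""
    return (2.0 * H * f**3 / C**2) / math.expm1(H * f / (K_B * T))

T = 5000.0  # kelvin
for f in (1e12, 1e14, 1e15, 1e16):  # from the infrared into the ultraviolet
    print(f"{f:.0e} Hz  RJ={rayleigh_jeans(f, T):.3e}  Planck={planck(f, T):.3e}")
```

At low frequencies the two laws agree almost exactly; past the visible range the classical curve keeps climbing toward the ultraviolet catastrophe while Planck's curve falls back to zero.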

According to the classical view, when light shines on a block of metal, the light's intensity determines the amount of energy the metal's electrons absorb. But Einstein refuted this traditional view, proposing to treat light not as a continuous wave but as a collection of particles. His ingenuity provided the sought-after explanation for the photoelectric effect, an experimental result incompatible with the assumption that light exists as a continuous electromagnetic wave. The photoelectric effect demonstrated that the electrons in the block of metal were bound by electric forces and required a threshold amount of energy, independent of the light's intensity, in order to escape the metal. Einstein postulated that light was actually composed of discrete, massless packets of energy called photons, each endowed with the same amount of energy for a given frequency. Therefore, increasing the rate at which photons arrive, also known as the intensity of the light, could not affect whether any single electron absorbed enough energy to escape; it only increased the number of electrons released from the metal once the threshold was met. Instead, it is the photon's frequency that determines its endowed energy. Einstein extended Planck's formula E = hf to describe these fundamental entities of light: a photon's energy, directly proportional to the light's frequency, determines whether a single electron can escape the block of metal. This phenomenon became known as light's wave-particle duality, since light, an electromagnetic wave, was described as a stream of particles (3).
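Einstein's argument reduces to a single comparison: an electron escapes only if the photon energy hf exceeds the energy binding it to the metal (the work function), no matter how intense the light is. A minimal sketch, with a work function of roughly 2.3 eV assumed for illustration (close to sodium's; the exact value is not taken from this article):

```python
H = 6.62607015e-34    # Planck's constant, J*s
EV = 1.602176634e-19  # joules per electronvolt

def ejected_kinetic_energy_ev(frequency_hz, work_function_ev):
    """Einstein's photoelectric relation: kinetic energy K = h f - phi.
    Returns the ejected electron's kinetic energy in eV, or None if the
    photon energy falls below the work function (no emission, regardless
    of the light's intensity)."""
    photon_ev = H * frequency_hz / EV
    if photon_ev < work_function_ev:
        return None
    return photon_ev - work_function_ev

phi = 2.3  # assumed work function, eV
print(ejected_kinetic_energy_ev(4.0e14, phi))  # red light (~1.65 eV/photon): None
print(ejected_kinetic_energy_ev(7.5e14, phi))  # violet light (~3.10 eV/photon): ~0.80 eV
```

Doubling the red light's intensity doubles the photon arrival rate but leaves each photon at ~1.65 eV, still below threshold, which is exactly the behavior the classical wave picture could not explain.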

Einstein's revolutionary explanation of the photoelectric effect compounded the discovery of light's wave-particle duality with a kick to classical expectations. Soon after, in 1923, the French physicist Louis de Broglie hypothesized that particles such as electrons shared this dual property with light. To formalize the idea, he revisited Einstein's mass-energy equivalence equation for particles, E = mc^2, where E denotes energy, m mass, and c the speed of light, along with Planck's energy equation for waves, E = hf, which can also be written E = hv/λ, since a wave's frequency equals its propagation speed divided by its wavelength. Because his hypothesis put particles and waves on equal footing, de Broglie equated Einstein's mass-energy equivalence equation with Planck's energy equation; in the mass-energy equation, however, he replaced the speed of light c with some velocity v, since particles in the real world approach but do not travel at the speed of light. With some mathematical manipulation, he solved the equation mv^2 = hv/λ for λ and arrived at λ = h/(mv), where h denotes Planck's constant, m the particle's mass, and v its velocity. This became known as the de Broglie wavelength, marking wave-particle duality not only for light but for all matter. Just like Planck's, de Broglie's conjecture awaited confirmation by empirical evidence. In 1927, the scientists Clinton J. Davisson and Lester H. Germer fired an electron gun at nickel crystals and noticed diffraction akin to that of waves. More empirical evidence arrived when George P. Thomson observed the same result after replicating the experiment with a thin metal foil (4,5).
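The formula λ = h/(mv) explains why Davisson and Germer saw diffraction at all: an electron accelerated through a few tens of volts has a wavelength comparable to the spacing between atoms in a nickel crystal. A sketch, assuming the standard electron mass and an accelerating voltage of about 54 V (the voltage is a typical value from accounts of the experiment, not stated in this article):

```python
import math

H = 6.62607015e-34      # Planck's constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def de_broglie_wavelength(mass_kg, velocity_ms):
    """lambda = h / (m v), as derived in the text."""
    return H / (mass_kg * velocity_ms)

# Electron accelerated through ~54 V: kinetic energy E = (1/2) m v^2,
# so its (non-relativistic) speed is v = sqrt(2 E / m).
E = 54.0 * EV
v = math.sqrt(2.0 * E / M_E)
lam = de_broglie_wavelength(M_E, v)
print(f"{lam * 1e9:.3f} nm")  # -> 0.167 nm, on the order of nickel's atomic spacing
```

A macroscopic object, by contrast, has such a large mv that its de Broglie wavelength is immeasurably small, which is why wave behavior never shows up at everyday scales.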

Other physicists wondered: what would it mean if all matter could be described as waves? They theorized that the wave properties of matter meant one could never pinpoint exactly where a particle was or how it would arrive at a particular place. As we enter the quantum realm, our understanding of measurement is about to be completely altered. How? By the establishment of a final, inescapable limit on measurement itself.



