
Entropic origins of matter–antimatter asymmetry in a compact reheating universe

 


ABSTRACT


The observed dominance of matter over antimatter in the universe remains one of the most profound open questions in cosmology. Traditional explanations such as baryogenesis and leptogenesis invoke new physics beyond the Standard Model, relying on CP violation and out-of-equilibrium conditions, but often treat entropy as a passive constraint. In this work, we propose a novel mechanism in which entropic and statistical constraints during the reheating phase of the early universe actively drive the emergence of a matter–antimatter asymmetry. As the universe approaches thermal equilibrium and maximal entropy, the production and annihilation dynamics of particles become biased by entropic constraints that take hold during the post-inflationary reheating phase. This scenario operates entirely within the bounds of established thermodynamics and quantum field theory, requiring no new physics. Our approach offers a fresh perspective on the origin of the cosmic matter–antimatter asymmetry and suggests new avenues for connecting statistical mechanics with fundamental cosmological observations.






Inflation 1 is a period of extremely rapid, accelerated expansion of space-time, during which the universe expands much faster than the speed of light (note: this is the expansion of space itself, not objects moving through space).  

- This inflationary phase smooths out the universe and stretches quantum fluctuations to cosmic scales.

- Inflation ends with a process called reheating (including preheating), during which the energy driving inflation is converted into a hot, dense plasma of particles.

- This process of particle production and thermalization marks the beginning of the hot Big Bang phase, from which the universe continues to expand and cool, eventually forming the structures we observe today.


Inflation rapidly expands the universe, ends with the creation and thermalization of particles, and sets the stage for the hot Big Bang.



Quantum vacuum fluctuations stretched by inflation become the seeds for galaxies. The inflaton decays through preheating (non-perturbative, rapid) and reheating (perturbative, slower), producing all Standard Model particles. After reheating, no new particles are produced from the inflaton, and the universe evolves through interactions among existing particles, with structure forming from the original density fluctuations.



 


Quantum Fluctuations and Inflation


   - Tiny quantum fluctuations of the vacuum exist everywhere.

   - During inflation, the universe expands exponentially, stretching these fluctuations to macroscopic (cosmic) scales.

   - As they are stretched beyond the horizon (faster than light due to space expansion), these fluctuations become "classical" and get imprinted as density variations.


End of Inflation: Particle Production


   - Preheating: The inflaton field (which drove inflation) oscillates and decays non-perturbatively, rapidly producing large numbers of particles (including Standard Model particles) through collective effects such as parametric resonance.

   - Reheating: Any remaining inflaton energy decays perturbatively (slower, standard quantum decay), producing more particles and allowing the universe to thermalize into a hot, dense plasma.
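The parametric resonance mentioned above can be illustrated with a toy model. In a common simplification, a field mode coupled to the oscillating inflaton obeys a Mathieu-type equation, x″ + (a − 2q cos 2t)x = 0; for parameters inside an instability band the mode amplitude grows exponentially, which is the essence of explosive particle production during preheating. The sketch below is a minimal illustration, not a cosmological simulation; the values a = 1, q = 0.5 are chosen purely to sit inside the first instability band.

```python
import math

def mathieu_growth(a=1.0, q=0.5, t_end=50.0, dt=1e-3):
    """Integrate x'' + (a - 2q cos 2t) x = 0 and return the peak |x|.

    Inside an instability band the Floquet exponent is positive, so the
    amplitude grows exponentially -- the toy analogue of resonant
    particle production during preheating.
    """
    def acc(x, t):
        return -(a - 2.0 * q * math.cos(2.0 * t)) * x

    x, v, t = 1.0, 0.0, 0.0
    peak = abs(x)
    while t < t_end:
        # classic RK4 step for the first-order system (x, v)
        k1x, k1v = v, acc(x, t)
        k2x, k2v = v + 0.5 * dt * k1v, acc(x + 0.5 * dt * k1x, t + 0.5 * dt)
        k3x, k3v = v + 0.5 * dt * k2v, acc(x + 0.5 * dt * k2x, t + 0.5 * dt)
        k4x, k4v = v + dt * k3v, acc(x + dt * k3x, t + dt)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        t += dt
        peak = max(peak, abs(x))
    return peak

print(f"peak amplitude after t = 50: {mathieu_growth():.3g}")
```

With q = 0 the equation reduces to a simple oscillator and the amplitude stays bounded; switching on the periodic drive inside the resonance band makes the peak grow by several orders of magnitude over a few dozen oscillations.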


After Reheating


   - Once reheating ends, the inflaton has decayed, and no new particles are produced from it.

   - The universe is filled with a hot soup of Standard Model particles.

   - Particle interactions (scattering, annihilation, decay) continue, but these just transform existing particles—they don’t create new ones from the inflaton.


Formation of Structure


   - The density variations ("frozen in" during inflation) serve as seeds for all cosmic structures.

   - Over time, gravity amplifies these small variations, leading to the formation of galaxies and larger structures as mass clusters in the denser regions.


 



Inflation → Reheating → Hot Big Bang (radiation-dominated universe, nucleosynthesis, cosmic microwave background, etc.)



During reheating, the universe reaches a state of thermal equilibrium.


For a system in thermodynamic equilibrium, the entropy is maximized and remains constant for the given constraints 2.


- At equilibrium, all thermodynamic quantities—including entropy—are constant in time.

- The system’s entropy cannot increase further because it has already reached the maximum possible value allowed by the constraints.

- If the entropy could still increase, the system would not be at equilibrium by definition.
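The bullets above can be made concrete with a toy calculation: for two ideal-gas subsystems exchanging energy at fixed total energy, the combined entropy is maximal exactly at the equilibrium split, where the temperatures match. The particle numbers and energies below are arbitrary illustrative values (units with k_B = 1, additive constants dropped).

```python
import math

# Two monatomic ideal-gas subsystems exchanging energy at fixed total E.
# S_i = (3/2) N_i ln(E_i)  (k_B = 1, constants dropped)
N1, N2, E_TOTAL = 100, 300, 1000.0

def total_entropy(e1):
    """Combined entropy for a given split of the total energy."""
    return 1.5 * N1 * math.log(e1) + 1.5 * N2 * math.log(E_TOTAL - e1)

# Scan all splits: entropy peaks exactly where the temperatures equalize,
# i.e. E1/N1 == E2/N2, so E1 = E_TOTAL * N1 / (N1 + N2) = 250.
splits = [E_TOTAL * i / 10000 for i in range(1, 10000)]
s_max, e1_star = max((total_entropy(e1), e1) for e1 in splits)
print(f"entropy-maximizing split: E1 = {e1_star:.1f}")
```

Any other split has strictly lower total entropy, which is precisely why a system at equilibrium cannot increase its entropy further under the given constraints.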


The state of the universe during reheating was therefore one of maximal entropy.



During the reheating phase of the Big Bang, the second law of thermodynamics and thermalization imposed a maximum entropy constraint, driving asymmetric particle production (single particles or antiparticles) over symmetric particle-antiparticle pair production. We postulate that the second law and thermalization drove this asymmetric particle production during reheating, maintaining maximum entropy and ensuring a matter/antimatter excess.


(The fluctuation theorem indicates that local entropy decreases are possible but exponentially unlikely, favoring processes that maintain or increase entropy.)
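The exponential suppression implied by the fluctuation theorem is easy to quantify: the odds of observing an entropy decrease of magnitude s, relative to the matching increase, scale as e^(−s/k_B). A small sketch, working in units of k_B:

```python
import math

# Fluctuation theorem: P(dS = -s) / P(dS = +s) = exp(-s / k_B).
# Working in units of k_B, the odds ratio is exp(-s).
def reverse_to_forward_odds(s_in_kB):
    """log10 of the odds of observing an entropy *decrease* of s."""
    return -s_in_kB / math.log(10)

for s in [1, 10, 100]:
    print(f"dS = -{s} k_B: odds ~ 10^{reverse_to_forward_odds(s):.1f}")

# A macroscopic fluctuation (s ~ 1e23 k_B) has log10-odds ~ -4e22:
# utterly negligible, which is why entropy-decreasing channels freeze out.
```

Even a modest 100 k_B decrease is suppressed by more than forty orders of magnitude, so any macroscopic entropy-decreasing process is statistically excluded.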


Asymmetric production avoids excessive entropy spikes from pair annihilation into photons and can also absorb photons to balance entropy fluctuations. This ensures a matter or antimatter excess persists, preventing a photon-dominated universe. The inflaton’s energy sustains particle production during thermalization, with asymmetric processes aligning with the second law to maintain maximum entropy. 


Post-thermalization, the universe’s entropy continues to increase via expansion and interactions, consistent with thermodynamic principles. This framework explains the observed matter-antimatter asymmetry and guarantees that complete annihilation into photons is avoided, ensuring the presence of matter or antimatter in the universe.



Pair production during the reheating stage was suppressed because annihilation of the pairs produces photons, which have higher entropy than fermions owing to the larger number of microstates available to them.


Why do photons tend to have more microstates?


Bose-Einstein Statistics:


   - Photons are bosons: there's no limit on how many photons can occupy the same quantum state.

   - This leads to many more accessible microstates at a given energy — you can "pile up" photons into the same state, creating a large number of configurations.


Fermi-Dirac Statistics:


   - Baryons are fermions: subject to the Pauli exclusion principle, so no two identical baryons can occupy the same quantum state.

   - This severely restricts the number of microstates available to a system of baryons.
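The contrast between the two statistics can be checked by direct counting. For n indistinguishable particles distributed over g single-particle states, bosons allow C(n + g − 1, n) configurations (the stars-and-bars count), while fermions, limited to occupancies of 0 or 1, allow only C(g, n). A minimal sketch with n = 5 and g = 10, values chosen purely for illustration:

```python
from math import comb, log

# Microstate counts for n indistinguishable particles in g states.
# Bosons: unlimited occupancy -> C(n + g - 1, n)  (stars and bars)
# Fermions: Pauli exclusion   -> C(g, n)
def microstates(n, g, fermionic):
    return comb(g, n) if fermionic else comb(n + g - 1, n)

n, g = 5, 10
w_bose = microstates(n, g, fermionic=False)
w_fermi = microstates(n, g, fermionic=True)
print(f"bosons:   W = {w_bose},  S/k_B = ln W = {log(w_bose):.2f}")
print(f"fermions: W = {w_fermi},  S/k_B = ln W = {log(w_fermi):.2f}")
```

Here the bosonic system has 2002 configurations against 252 for the fermionic one, and the gap widens rapidly as occupancies grow, which is the combinatorial root of the photon gas's large entropy.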



Degrees of Freedom:


   - Photons have 2 polarization states (even though they’re spin-1, they’re transverse), yet they still access far more microstates because arbitrarily many photons can occupy any energy level.

   - Baryons have more internal structure, but due to their mass and quantum restrictions, this doesn’t translate to more thermal microstates at a given energy.


In the early universe:


- Photons outnumbered baryons by 1 billion to 1.

- Most of the universe’s entropy is carried by relativistic particles like photons (and neutrinos), not baryons.



The Big Bang should have produced equal amounts of matter and antimatter. This is clearly not the case: in our universe we observe a dominance of matter over antimatter 3. To date this remains one of the great unsolved problems in physics. Typically, baryogenesis and leptogenesis are invoked to explain why matter dominates over antimatter.



- Baryogenesis refers to the generation of an asymmetry between baryons (particles like protons and neutrons) and antibaryons in the early universe, resulting in the excess of matter over antimatter we observe today 4.

- Leptogenesis specifically refers to the generation of an asymmetry between leptons (such as electrons and neutrinos) and antileptons. In many models, this lepton asymmetry is later partially converted into a baryon asymmetry through processes that violate both baryon and lepton number conservation, such as sphaleron transitions in the Standard Model 4.


Baryogenesis is about the direct creation of more baryons than antibaryons, while leptogenesis creates a lepton asymmetry first, which is then converted into a baryon asymmetry through known particle physics processes 4.



Both baryogenesis and leptogenesis require extensions to the Standard Model (SM) of particle physics. The SM alone does not provide sufficient sources of CP violation or mechanisms to generate the observed matter-antimatter asymmetry in the universe.


- Leptogenesis typically involves extending the SM by introducing heavy right-handed Majorana neutrinos (as in the seesaw mechanism) or other new particles, which decay out of equilibrium and violate lepton number and CP symmetry, generating a lepton asymmetry that is partially converted to a baryon asymmetry via sphaleron processes.

- Baryogenesis also generally requires new physics beyond the SM, such as additional Higgs fields, supersymmetry, or other mechanisms that can provide the necessary out-of-equilibrium conditions and sufficient CP violation to create a baryon asymmetry.


Neither baryogenesis nor leptogenesis can be fully realized within the Standard Model; both require new particles or interactions beyond those currently known.


Neither baryogenesis nor leptogenesis takes the entropy of the universe into account.


During the reheating phase, as the universe reaches thermal equilibrium, asymmetric production could be initiated by the presence of an isolated baryon or lepton (particle pairs move in opposite directions as they are created, to conserve momentum).


This happens for the following reasons:


- As discussed above, bosons have more microstates than fermions.

- Particle–antiparticle pairs tend to annihilate, producing photons. As the universe reaches maximal entropy this outcome becomes less likely.

- A single particle skews production toward more particles of the same kind, suppressing the probability of producing the corresponding antiparticles. For example, an unannihilated electron may lead to the production of more electrons.

- To preserve charge, corresponding amounts of quarks are produced if the initial seed is an electron, or electrons are produced if the initial seed is a quark. The reverse holds if the initial seed is a positron or an antiquark.

- These fermions maintain the maximal entropy by absorbing excess photons.




Mechanism Details


Reheating introduces new thermodynamic constraints that were not active during preheating: the system now has additional entropic constraints to satisfy. Any snapshot of the universe taken at this time would be in local thermodynamic equilibrium and in a state of maximal entropy. The universe continues to expand and increase its entropy, but entropy remains maximised at any instant. This tension leads to asymmetric particle production.


   - Particle-antiparticle pairs are produced as usual during preheating, but their rapid separation (due to momentum conservation) prevents immediate annihilation.


   - Entropic effects that emerge during reheating suppress further pair production, as it would lead to annihilation and photon production, increasing entropy beyond the maximal value allowed in thermal equilibrium. Single particle production is favoured and their conjugate production becomes less likely. 



   - Newly produced particles are energized and occupy high-entropy states, with excess photons absorbed to stabilize entropy if it approaches or exceeds the maximum allowable value.


   - Pair production is disfavored because it leads to instability (via annihilation and photon production), while single-particle production maintains entropy and charge is balanced through production of oppositely charged particles that are not antiparticles.


Distinction from Baryogenesis/Leptogenesis


   - Unlike baryogenesis and leptogenesis, which require new physics (e.g., heavy Majorana neutrinos, extra Higgs fields, or supersymmetry), this model operates entirely within the SM, using thermodynamic and statistical principles.

   - This model tries to explain the matter–antimatter asymmetry “cleanly” by relying on entropy constraints, avoiding the new particles needed in traditional models.


Baryon-to-Photon Ratio:

   - The observed baryon-to-photon ratio (η ≈ 6 × 10⁻¹⁰) results from most particle-antiparticle pairs annihilating into photons, with excess single particles produced to prevent further annihilation and maintain maximal entropy.
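The quoted value of η can be sanity-checked from today's CMB temperature. The photon number density of a blackbody is n_γ = (2ζ(3)/π²)(k_BT/ħc)³ ≈ 411 cm⁻³ at T = 2.725 K; dividing an assumed present-day baryon number density of about 0.25 m⁻³ (a commonly quoted figure, not taken from this article) by it recovers η ≈ 6 × 10⁻¹⁰:

```python
import math

# Rough check of the baryon-to-photon ratio eta = n_b / n_gamma.
# Blackbody photon number density: n_gamma = (2*zeta(3)/pi^2) * (kT/(hbar*c))^3
K_B, HBAR, C = 1.380649e-23, 1.054572e-34, 2.99792458e8
ZETA3 = 1.2020569          # Apery's constant, zeta(3)
T_CMB = 2.725              # present CMB temperature, K

x = K_B * T_CMB / (HBAR * C)               # inverse thermal wavelength, 1/m
n_gamma = 2.0 * ZETA3 / math.pi**2 * x**3  # photons per m^3
print(f"n_gamma ~ {n_gamma * 1e-6:.0f} photons per cm^3")

n_b = 0.25                 # assumed present baryon density, per m^3
eta = n_b / n_gamma
print(f"eta ~ {eta:.1e}")
```

The agreement with the observed η ≈ 6 × 10⁻¹⁰ reflects how overwhelmingly photons outnumber surviving baryons.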


Thermal Equilibrium and Entropy:


   - The universe reaches thermal equilibrium during reheating, implying maximal entropy at that stage. The fluctuation theorem ensures that entropy-decreasing processes (e.g., symmetric pair production) are statistically unlikely.

   - Photon absorption by newly produced particles prevents entropy from exceeding the maximum allowable value, stabilizing the system.



To get a sense of scale, we begin by considering the size of the observable universe today: 93 billion light-years across. At the end of inflation, the scale factor was about 10^-30 of its present value 5,6. Plugging in the numbers gives a size of about 0.88 mm at the end of inflation.


There are an estimated 10^97 particles in the observable universe 7 (some estimates put the number at 10^88). All of these particles had to be constrained in a region just 0.88 mm across. Even taking the subsequent expansion into account, this tiny volume imposes massive entropic constraints on the observable universe at the end of inflation, changing particle-production behaviour. So, during the brief period when the universe was thermalised and reheating was in effect, these entropic constraints drove the particle-production process.
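The 0.88 mm figure and the implied particle density follow from simple arithmetic on the article's own inputs (93 billion light-years, a scale-factor ratio of 10⁻³⁰, and 10⁹⁷ particles); the sketch below just makes the back-of-envelope explicit, treating the region as a sphere:

```python
import math

# Back-of-envelope check of the 0.88 mm figure and the implied density.
LY_M = 9.4607e15                  # metres per light-year
d_now = 93e9 * LY_M               # present diameter of the observable universe, m
scale = 1e-30                     # assumed scale-factor ratio at end of inflation
d_then = d_now * scale
print(f"diameter at end of inflation ~ {d_then * 1e3:.2f} mm")

# Packing an assumed 1e97 particles into that sphere:
r = d_then / 2.0
volume = 4.0 / 3.0 * math.pi * r**3
density = 1e97 / volume
print(f"number density ~ {density:.1e} particles per m^3")
```

The resulting density, of order 10^106 particles per cubic metre, conveys how extreme the packing constraint was during this epoch.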


A comparison of bosonic and fermionic degrees of freedom in a relativistic plasma 8,9, such as at the end of the reheating period, gives fermions an entropy of about 2.701 k_B per particle and bosons an entropy of about 3.602 k_B per particle.


During reheating, entropic constraints in thermal equilibrium favor single-particle production (e.g., an electron) over particle-antiparticle pairs (e.g., electron-positron). Pair production is suppressed because annihilation causes quantized excitations in the electromagnetic (EM) field, producing photons and pushing entropy beyond the maximum allowed value. When an electron is emitted, for example, the system suppresses excitation of the positron field to prevent positron emission, reducing the chance of an annihilation that would excite the EM field and emit photons, thereby stabilizing entropy.



During the reheating phase, photons were continuously absorbed and re-emitted as well as scattered—all these interactions were so frequent that photons couldn’t travel freely, making the universe effectively opaque.


This brings us to another important aspect: charge neutralisation. Why balance charges? Why not simply produce particles of one type, all electrons or all up quarks?


Although all fermions have the same entropy in a relativistic plasma, domination by a single charge would have led to massive electrostatic repulsion, producing photons via bremsstrahlung from electron–electron collisions and increasing entropy. For this reason, charges were balanced by emitting particles of different charges that could not annihilate with one another, for example electrons with quarks.


Single particles absorb excess photons and maintain entropy, leading to the matter–antimatter asymmetry within the Standard Model.


This is also one of the reasons why charged-particle production was favoured during this very brief phase: unlike neutrinos, charged particles have the capacity to absorb photons and maintain entropy. Neutrinos don’t interact with the EM field at all.


The universe was in local thermal equilibrium within the small volume (~0.88 mm), where entropy was maximized for the given energy and constraints, as per the second law and the fluctuation theorem. Note that the reheating timescale is so short (10^-35 to 10^-32 s) that the expansion-driven entropy increase (over 10^-32 to 10^-10 s) is negligible during particle production, allowing the system to prioritize single-particle production and avoid photon-driven entropy spikes.


In the brief period of reheating we got all of the excess matter particles that dominate the present universe. This model suggests that the dominance of matter over antimatter is due not to a finely tuned process but to randomness in a thermal bath: an isolated elementary particle shifted the pair-production process to balance entropic and charge constraints.



CONCLUSION


The persistent mystery of the universe’s matter–antimatter asymmetry has long motivated the search for new physics beyond the Standard Model. In this article, we have introduced a novel perspective: that entropic and statistical constraints during the reheating phase of the early universe may have played a direct, active role in generating the observed matter excess. By considering the maximization of entropy and the dynamics of particle production at thermal equilibrium, we propose that randomness—rather than new physics—could have seeded an initial imbalance, which was then amplified by entropic constraints to favor matter over antimatter. This approach is firmly grounded in established principles of thermodynamics and quantum field theory, and offers a compelling alternative to conventional baryogenesis and leptogenesis scenarios. While further quantitative modeling and comparison with observational data are needed, this entropic mechanism provides a promising new direction for understanding one of cosmology’s deepest puzzles, and highlights the profound interplay between statistical mechanics and the evolution of the early universe.




REFERENCES 


1 Introductory review of cosmic inflation — https://arxiv.org/abs/hep-ph/0304257

2 Entropy (classical thermodynamics) — https://en.m.wikipedia.org/wiki/Entropy_(classical_thermodynamics)

3 Matter and Antimatter in the Universe — https://arxiv.org/abs/1204.4186

4 Baryogenesis and Leptogenesis — https://www.slac.stanford.edu/econf/C040802/papers/L018.PDF

5 Inflation and the cosmic microwave background — https://arxiv.org/PS_cache/astro-ph/pdf/0305/0305179v1.pdf

6 Size of the universe after inflation — https://physics.stackexchange.com/questions/32917/size-of-universe-after-inflation

7 Elementary particle — https://en.m.wikipedia.org/wiki/Elementary_particle

8 On Effective Degrees of Freedom in the Early Universe — https://arxiv.org/pdf/1609.04979

9 Apéry's constant — https://en.m.wikipedia.org/wiki/Ap%C3%A9ry's_constant

 

