From entropy to ZENtropy: a new frontier
Entropy has long stood as one of the most profound and unifying concepts in science. From Clausius’s 19th-century formulation of the second law of thermodynamics to Boltzmann’s statistical interpretation, Gibbs’s classical statistical mechanics, and Shannon’s information theory, entropy has shaped our understanding of order, disorder, and the flow of energy and information. These milestones have influenced physics, chemistry, biology, and modern data science, providing a foundation for analyzing systems from molecules to information networks to black holes.
Clausius introduced entropy as a measure of irreversibility, a principle that governs the direction of natural processes[1]. His insight, that heat flows spontaneously from hot to cold, was not merely a statement about energy but a profound observation about the arrow of time. Boltzmann deepened this understanding by linking entropy to the number of microscopic configurations, giving rise to the famous equation engraved on his tombstone[2]. Gibbs[3] extended these ideas into ensemble theory, laying the groundwork for statistical mechanics and thermodynamics as we know them today[4,5]. Shannon’s leap into information theory transformed entropy into a measure of uncertainty, bridging physics and communication science[6]. These conceptual revolutions made entropy a universal language for complexity, yet they were conceived in an era when systems were relatively simple compared with the intricate networks and quantum-mechanical phenomena we study today[7].
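In symbols (recalled here for orientation rather than as part of the original texts), these milestones correspond to Boltzmann’s relation S = k_B ln W, with W the number of microscopic configurations, and to Shannon’s measure of uncertainty H = −Σ_i p_i log p_i over outcomes with probabilities p_i; the inscription on Boltzmann’s tombstone reads S = k log W.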
Quantum systems exhibit entanglement and coherence that defy traditional thermodynamic intuition. Biological networks adapt and self-organize in ways that cannot be captured by simple ensembles. Artificial intelligence (AI) introduces learned configurations that evolve dynamically, raising questions about how entropy interacts with algorithms and data-driven models. The limitations of classical entropy, applied in isolation, become evident when we attempt to predict emergent behaviors across scales, from atomic vibrations to planetary systems.
This is where zentropy theory enters[8-11]. Zentropy redefines entropy as more than a measure of configuration probability. It introduces a framework where total entropy includes not only the Gibbs and Shannon contributions but also the statistically weighted entropy of each configuration. Density functional theory (DFT)[12,13] defines the ground-state configuration of a system, providing a practical solution of quantum mechanics and enabling zentropy theory to serve as a simplified, practical form of quantum statistical mechanics[14]. Based on DFT, zentropy theory provides an approach for predictive modeling of emergent behaviors through quantum mechanics. When direct quantum treatment becomes intractable because of this complexity, AI based on zentropy-enhanced neural networks (ZENN) enables the learning of configurations[15]. By bridging rigorous theory with computational and data-driven approaches, zentropy offers a pathway to understand complexity as a continuum rather than a collection of isolated phenomena, exemplifying the recursive nature of entropy[16].
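As a schematic sketch of this partitioning (following the formulation of refs[8-11]; the symbols below are shorthand introduced here rather than the journal’s prescribed notation), consider a system that can reside in configurations k with probabilities p_k, each configuration carrying its own entropy S_k and free energy F_k. The total entropy then combines intra- and inter-configuration contributions, S = Σ_k p_k S_k − k_B Σ_k p_k ln p_k, where the second term is the familiar Gibbs-Shannon form; the corresponding free energy is F = Σ_k p_k F_k + k_B T Σ_k p_k ln p_k, with p_k = exp(−F_k/k_B T)/Σ_j exp(−F_j/k_B T).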
The ZENtropy journal is founded to advance this frontier. Its mission is to explore complexity through the lens of entropy and uncover how order emerges from disorder across scales, from atomic vibrations to macroscopic organization. It seeks to illuminate mechanisms of transformation, adaptation, and self-organization that underpin the behavior of materials, chemical systems, Earth systems, coupled human and natural systems, and beyond. What sets ZENtropy apart is its commitment to predictive science and integrative methodologies. Contributions that combine rigorous theory, computational modeling, and experimental validation are encouraged across both physical and social domains. Whether configurations are derived from quantum mechanics, learned through AI, or synthesized from hybrid approaches, the journal seeks work that challenges conventional paradigms and provides actionable insights into emergent phenomena across disciplines.
Entropy’s influence spans an extraordinary range of domains. In physics, it governs phase transitions, quantum entanglement, and cosmological evolution. Black hole entropy, introduced by Bekenstein and Hawking[17-19], revealed that even the most enigmatic objects in the universe obey thermodynamic principles. The detection of gravitational waves confirmed Einstein’s predictions and opened a new observational window into the cosmos, where entropy and information are intertwined with spacetime dynamics. These discoveries underscore entropy’s role as a bridge between fundamental forces and emergent phenomena, linking the microscopic and cosmic scales. They remind us that entropy is not confined to laboratories or equations; it governs the evolution of stars, galaxies, and the universe itself. Recent developments in holographic principles suggest that entropy may hold the key to unifying quantum mechanics and gravity, pointing toward a deeper theory of spacetime.
In materials science, entropy shapes free-energy landscapes, stabilizes complex alloys, and drives transport in energy storage systems[20]. It explains why certain phases emerge under extreme conditions and how disorder can enhance functionality. High-entropy alloys, for example, exploit configurational entropy to achieve remarkable strength and corrosion resistance. In chemistry and biology, entropy governs molecular organization, protein folding, and adaptive networks that sustain life. Drug design leverages entropy-based models to predict binding affinities and optimize therapeutic molecules. In computer science and AI, entropy informs algorithms for uncertainty quantification and guides machine learning models that predict emergent behaviors in complex datasets. Space science explores entropy in gravitational systems, black holes, and the thermodynamics of the universe, offering clues to the ultimate fate of cosmic evolution. Applied fields such as finance and social science use entropy to analyze uncertainty and complexity in human systems, from market dynamics to social networks.
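As a concrete ideal-solution illustration of that configurational contribution (a textbook estimate, not a result specific to any system discussed here), mixing N components with mole fractions x_i contributes S_conf = −R Σ_i x_i ln x_i per mole, which for an equimolar N-component alloy reduces to R ln N; a five-component equimolar alloy thus gains roughly R ln 5 ≈ 13.4 J mol⁻¹ K⁻¹ of configurational entropy relative to a pure element.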
Zentropy’s relevance extends beyond physical and chemical systems into the realm of neuroscience and medicine. Alzheimer’s disease, a complex neurodegenerative disorder, exemplifies the challenges of modeling systems with vast configurational spaces. Traditional approaches struggle to capture the interplay of molecular, cellular, and systemic factors that drive disease progression. Zentropy theory offers a new perspective by incorporating both the entropy of individual configurations and the statistical entropy across configurations, enabling predictive modeling of emergent behaviors in the brain. Complementing the theory, ZENN[15] supports the development of digital twins for personalized Alzheimer’s disease models, paving the way for improved prevention strategies and targeted treatments. By bridging thermodynamics, AI, and neuroscience, ZENtropy demonstrates its potential to transform healthcare through predictive science.
ZENtropy and open science are inseparable[21]. The complexity of modern challenges demands transparency, reproducibility, and global collaboration. ZENtropy aligns with the FAIR principles (Findable, Accessible, Interoperable, and Reusable) and supports open access and dissemination to ensure that knowledge is freely available to all. It embraces the spirit of the United Nations Educational, Scientific and Cultural Organization (UNESCO) recommendations on open science, promoting shared data, interoperable tools, and inclusive participation. Authors are encouraged to publish workflows, datasets, and computational models alongside their research, enabling validation and extension across disciplines. This commitment to openness is not only ethical but essential for accelerating discovery in a world where complexity demands collective effort. ZENtropy’s approach resonates with global initiatives such as Horizon Europe and the International Science Council’s vision for open knowledge, positioning the journal as a leader in collaborative science.
Education and workforce development are central to ZENtropy’s vision. Building global Science, Technology, Engineering, and Mathematics (STEM) capacity requires more than technical training; it demands interdisciplinary thinking and cultural openness. ZENtropy encourages curricula that integrate thermodynamics, data science, and systems theory, preparing students to tackle complexity with predictive tools. It advocates for international partnerships that provide equitable access to resources and mentorship, ensuring that talent from all regions can contribute to and benefit from the ZENtropy movement. By fostering a new generation of scientists and engineers fluent in both theory and computation, ZENtropy aims to create a workforce capable of addressing global challenges collaboratively. This commitment extends to capacity building in emerging economies, where access to advanced tools and knowledge can empower local innovation and contribute to global resilience.
Science today faces unprecedented challenges, from sustainable energy and climate resilience to quantum technologies and intelligent materials. Addressing these requires not only incremental advances but transformative frameworks that unify knowledge across disciplines. ZENtropy aspires to be a catalyst for this transformation. By fostering open collaboration and predictive modeling, it seeks to accelerate breakthroughs that will shape the future of technology, sustainability, and human progress. The vision is a scientific ecosystem where complexity is not a barrier but a source of innovation, a world where entropy-driven insights guide the design of resilient infrastructures, adaptive materials, and intelligent systems that serve humanity. Imagine energy systems optimized through entropy landscapes, quantum devices stabilized by predictive thermodynamics, and AI frameworks that learn from nature’s entropy-driven evolution. These are achievable goals when theory, computation, and experiment converge under a unified paradigm.
Beyond the journal, ZENtropy represents a global movement for open science and interdisciplinary collaboration. It is a call to scientists, engineers, and innovators to break down silos and embrace complexity as a shared frontier. This movement envisions international networks of researchers working together on entropy-driven challenges, leveraging open data and transparent methodologies to accelerate progress. It seeks to democratize knowledge, ensuring that breakthroughs in predictive modeling and emergent behavior are accessible to all, from academic institutions to industry and policymakers. ZENtropy is not just a framework; it is a philosophy that complexity can be understood, predicted, and harnessed for the benefit of humanity.
The societal implications are profound. Entropy-based models can inform climate strategies by predicting tipping points in Earth systems. They can guide the development of sustainable materials for energy storage and conversion, reducing dependence on scarce resources. In medicine, entropy-driven approaches can accelerate drug discovery and improve personalized therapies by modeling the complexity of biological networks. In finance and social science, entropy can help anticipate systemic risks and design resilient economic structures. These applications illustrate that ZENtropy is not confined to academic discourse; it is a practical framework for addressing global challenges.
Looking ahead, ZENtropy has the potential to ignite future scientific revolutions. Just as thermodynamics transformed engineering and information theory reshaped communication, ZENtropy can redefine how we approach complexity in the 21st century and beyond. It offers a foundation for breakthroughs in quantum computing, sustainable technologies, and intelligent systems that adapt and evolve. By uniting theory, computation, and experiment under a common vision, ZENtropy can become a cornerstone of predictive science, a guiding principle for innovation across disciplines and across borders. It is not merely a journal; it is the seed of a paradigm shift that will influence research, education, and policy of the future.
Welcome to ZENtropy. The journey begins here.
DECLARATIONS
Authors’ contributions
Prepared the draft: Liu, Z. K.
All co-authors read, revised and approved the final version of the manuscript.
Availability of data and materials
Not applicable.
Financial support and sponsorship
None.
Conflicts of interest
Liu, Z. K. is Editor-in-Chief of the journal ZENtropy. Ågren, J.; Lu, K.; Navrotsky, A.; Pedrycz, W.; Perdew, J. P.; Xie, J.; Zhang, T. are Advisory Board Members of ZENtropy. Dong, H.; Hao, W.; Iorio, C. S.; Ivory, S. J.; Li, Z.; Jiang, W.; Miao, D.; Sun, J.; Sun, Z.; Xiang, Y.; Xu, W. are Deputy Editors-in-Chief of ZENtropy. Chong, X.; Davey, T.; Wang, W. Y.; Zhong, Y. are Executive Editors of ZENtropy. None of the above-mentioned editors were involved in any part of the editorial process, including reviewer selection, manuscript handling, or decision-making.
Ethical approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Copyright
© The Author(s) 2026.
REFERENCES
1. Clausius, R. Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie. Annalen der Physik 1865, 201, 353-400. (in German).
2. Boltzmann, L. Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Sitzungsberichte der Kais. Akad. der Wissenschaften, Wien 1872, 66, 275-370. (in German). https://books.google.com/books?id=Fmy5PgAACAAJ. (accessed 2025-12-26).
3. Gibbs, J. W. Elementary Principles in Statistical Mechanics: Developed with Especial Reference to the Rational Foundation of Thermodynamics; Charles Scribner’s Sons, 1902.
4. Gibbs, J. W. The Collected Works of J. Willard Gibbs, Vol. II: Elementary Principles in Statistical Mechanics; Yale University Press, 1948. https://www.amazon.com/Collected-Works-Willard-Gibbs-Vol/dp/1528262360. (accessed 2025-12-26).
5. Gibbs, J. W. Elementary Principles in Statistical Mechanics: Developed with Especial Reference to the Rational Foundation of Thermodynamics; Cambridge University Press, 2011.
6. Shannon, C. E. A mathematical theory of communication: Part I and II. Bell Syst. Tech. J. 1948, 27, 379-423.
7. Schrödinger, E. Quantisierung als Eigenwertproblem. Annalen der Physik 1926, 384, 361-76. (in German).
8. Wang, Y.; Hector, L. G.; Zhang, H.; Shang, S. L.; Chen, L. Q.; Liu, Z. K. Thermodynamics of the Ce γ-α transition: density-functional study. Phys. Rev. B 2008, 78, 104113.
9. Wang, Y.; Hector Jr, L. G.; Zhang, H.; Shang, S. L.; Chen, L. Q.; Liu, Z. K. A thermodynamic framework for a system with itinerant-electron magnetism. J. Phys.: Condens. Matter 2009, 21, 326003.
10. Liu, Z. K.; Li, B.; Lin, H. Multiscale entropy and its implications to critical phenomena, emergent behaviors, and information. J. Phase Equilib. Diffus. 2019, 40, 508-21.
11. Liu, Z. K.; Wang, Y.; Shang, S. Zentropy theory for positive and negative thermal expansion. J. Phase Equilib. Diffus. 2022, 43, 598-605.
12. Wang, Y.; Liu, Z. K.; Chen, L. Thermodynamic properties of Al, Ni, NiAl, and Ni3Al from first-principles calculations. Acta Mater. 2004, 52, 2665-71.
13. Shang, S.; Wang, Y.; Kim, D.; Liu, Z. K. First-principles thermodynamics from phonon and Debye model: application to Ni and Ni3Al. Comput. Mater. Sci. 2010, 47, 1040-8.
14. Perdew, J. P. SCAN meta-GGA, strong correlation, symmetry breaking, self-interaction correction, and semi-classical limit in density functional theory: hidden connections and beneficial synergies? APL Computational Physics 2025, 1, 010903.
15. Wang, S.; Shang, S.-L.; Liu, Z. K.; Hao, W. ZENN: a thermodynamics-inspired computational framework for heterogeneous data-driven modeling. Proc. Natl. Acad. Sci. 2025, 123, e2511227122.
16. Myers, L. A.; Hew, N. L. E.; Shang, S.-L.; Liu, Z. K. Recursive entropy in thermodynamics: establishing the statistical-physics basis of the zentropy approach.
17. Bekenstein, J. D. Generalized second law of thermodynamics in black-hole physics. Phys. Rev. D 1974, 9, 3292-300.
20. Liu, Z. K.; Hew, N. L. E.; Shang, S. Zentropy theory for accurate prediction of free energy, volume, and thermal expansion without fitting parameters. Microstructures 2024, 4, 2024009.
21. ZENN on GitHub. https://github.com/WilliamMoriaty/ZENN. (accessed 2025-12-26).