Preprint Article · This version is not peer-reviewed.

Metawar and the Future of Cognitive Sovereignty: Rethinking National Security Beyond Cyberspace

Submitted: 23 June 2025 · Posted: 25 June 2025


Abstract
In the 21st century, warfare has transcended conventional battlefields, evolving into cognitive and perceptual domains that challenge traditional security paradigms. Metawar—a disruptive concept articulated by Pitshou Moleka—refers to the total and adaptive conflict for controlling narratives, perceptions, attention, and social cohesion. This paper positions Metawar as a new domain of warfare, synergizing cyber, information, psychological, and algorithmic operations to dominate cognitive landscapes.Drawing on systems theory, neurocognitive science, and cybernetics, the study argues that national security must incorporate a new axis of cognitive sovereignty. Through comparative case studies—Ethiopia, Ukraine, and Eastern DRC—this work identifies how Metawar is already reshaping conflict dynamics across multiple theatres.It further outlines a doctrinal framework comprised of five pillars: anticipation, strategic narrative protection, coordinated counter-influence, cognitive recovery, and interagency integration. The article concludes with policy recommendations aimed at strengthening national defense architectures in Africa and beyond, including the creation of specialized Metawar units and regional coordination mechanisms.By redefining defense to include mental and informational resiliency, the paper proposes that states which fail to adapt risk being overpowered not by force of arms, but by stealthy, pervasive cognitive operations.

1. Introduction

The nature of warfare is undergoing a profound transformation. While traditional military conflicts were once fought predominantly through kinetic means—land, sea, and air power—contemporary strategic environments increasingly emphasize non-kinetic dimensions. The advent of cyberspace introduced a fifth domain, but today’s conflicts extend beyond infrastructure attacks to target cognitive processes directly—ushering in what Pitshou Moleka (2025) defines as Metawar, or the battle for perception, belief, and reality.
Cognitive warfare, as conceptualized by NATO, seeks not only to influence what populations think, but how they think—targeting reasoning, decision-making, and trust (Bernal et al., 2020; Claverie & du Cluzel, 2022). This approach aligns with emerging U.S. military research into “neurowarfare,” which explores the use of neuropharmacology, brain–computer interfaces, and electromagnetic tools to impair enemy cognition or enhance soldier performance (Giordano et al., 2017; National Research Council, 2009). The DARPA Augmented Cognition program similarly explored real-time monitoring and manipulation of cognitive states (Schmorrow & McBride, 2004).
These developments pose unprecedented strategic challenges. As Zuboff (2019) warns, the rise of “surveillance capitalism” has enabled large-scale psychological profiling and micro-targeted disinformation. AI and deepfake technologies now facilitate personalized narrative distortion, amplifying emotional manipulation via social media algorithms (Pujol, 2025). At the same time, DARPA-funded experiments on neuroweapons reveal a willingness among some actors to weaponize fear, disorientation, and psychological distress (Wired, 2010; Wired, 2012).
Metawar thus transcends cyberwarfare by targeting the cognitive architecture of societies. It manipulates attention, memory, and trust to degrade social cohesion and undermine national resilience (Taddeo & Floridi, 2018). This requires a redefinition of national security, from the protection of physical boundaries to the defense of cognitive sovereignty—a nation’s authority over its collective perception and decision-making processes (Moleka, 2025).

2. From Cyberwar to Metawar—A Conceptual Shift

2.1. The Rise and Limitations of Cyberwarfare

Early scholarship on cyberwarfare—Clarke and Knake’s Cyber War (2010) and Rid’s Cyber War Will Not Take Place (2013)—emphasized kinetic-like attacks in cyberspace: infrastructures, networks, and data systems as primary targets. These thinkers focused on technical capabilities and legal frameworks to define acts of war in the digital domain. However, such a lens falls short of accounting for today’s battles waged over public opinion, narratives, and attention (Clarke & Knake, 2010; Rid, 2013).

2.2. Information Warfare and Cognitive Pre-Conditioning

The field evolved to include information warfare, involving disinformation campaigns and propaganda. NATO’s cognitive warfare framework (Bernal et al., 2020; Claverie & du Cluzel, 2022) underscores that conflict is no longer limited to attacking systems, but extends to “how people think.” However, information warfare often treats cognition as a “target” rather than a theatre of operations, remaining largely reactive and informational rather than adaptive or systemic.

2.3. Introducing Metawar: A Meta-Domain of Strategic Conflict

Metawar, as conceptualized by Moleka (2025), represents a significant conceptual leap. Rather than fragmenting warfare into discrete domains, Metawar is a meta-domain that integrates cyber, information, psychological, algorithmic, and neurocognitive operations into a single, adaptive strategic architecture. It goes beyond attacking systems or content: it restructures perception, reality, and collective meaning. Moleka (2025) argues: "Metawar is the war for the architectures of cognition and narrative, where the mind itself becomes the battlefield."

2.4. Characteristics of Metawar

Moleka’s framework identifies four defining characteristics of Metawar:
  • Simultaneity: Multi-vector campaigns against cognitive, digital, and narrative targets operate in parallel.
  • Adaptivity: AI-driven systems adjust messaging and tactics in real time based on audience response.
  • Opacity: Actors and attribution are obscured through layers of bots, proxies, and false narratives.
  • Totality: Every facet of society—from institutions to individuals—is part of the operational environment (Moleka, 2025).
Together, these features signal a shift away from siloed cyber- or information warfare towards a unified, meta-layer conflict.

2.5. Systems Theory and Complexity in Strategic Warfare

Modern warfare resembles a complex adaptive system, in which relationships are non-linear and small inputs can generate large, unpredictable outcomes. This complexity underscores the allure of Metawar, whose adaptive, multi-domain strategies thrive in such environments. Scholars such as Taddeo and Floridi (2018) and Binnendijk and Marler (2021) argue that effective strategy in this meta-domain must conceptualize conflict as dynamic, emergent, and systemic.

3. The Neurocognitive Foundations of Metawar

The emergence of Metawar is closely tied to breakthroughs in neuroscience and artificial intelligence, which have created highly effective tools for influencing human memory, emotion, and behavior. This section examines how these neurocognitive mechanisms are leveraged as instruments of strategic coercion.

3.1. Neuroweapons and Cognitive Targeting

Giordano et al. (2017) warn of the rise of “neuroweapons”—technologies designed to influence brain function, emotional states, and decision-making processes. Such tools include psychoactive agents, electromagnetic signals, and even app-based cognitive nudges. Although their deployment in real-world conflicts remains speculative, the potential for state actors to exploit these systems as part of Metawar is becoming increasingly plausible.

3.2. AI-Driven Attention Capture and Tailored Disinformation

The algorithms that power social media platforms and news feeds are optimized for engagement, and this dynamic can be weaponized. By delivering highly personalized content, tuned to a user's psychological profile and emotional triggers, platforms can steer attention and perception at scale (Taddeo & Floridi, 2018). This micro-targeting goes far beyond traditional propaganda, creating feedback loops that continuously recalibrate messaging against each individual's observed reactions.

3.3. Emotion-Driven Propaganda and Memory Manipulation

Research shows that emotional and affective content is far more memorable and influential than factual or neutral content (Slater & Sanchez-Vives, 2016; Zuboff, 2019). This insight is exploited in Metawar: armies of bots and coordinated campaigns saturate social media with emotionally evocative material—fear, outrage, hope—to fracture collective memory, sway public opinion, and impair trust in institutions.

3.4. Immersive Technologies as Cognitive Environments

The rise of virtual and augmented reality (VR/AR) represents a new frontier. Slater and Sanchez-Vives (2016) highlight the “presence effect,” whereby immersive simulations elicit authentic emotional and sensory responses. These technologies can be harnessed to create false yet emotionally convincing environments, or to influence perception through subtly manipulated immersive scenarios that embed ideological biases.

3.5. Surveillance Capitalism and Emotional Profiling

Shoshana Zuboff (2019) coined the term “surveillance capitalism” to describe the monetization of personal data and behavioral patterns. This creates a fertile ground for Metawar: corporations and states can jointly build emotional-propensity profiles to target ideological vulnerabilities. When combined with neurocognitive insights, these profiles enable operations designed to disorient and fragment social trust.

4. Case Studies: Ethiopia, Ukraine, and Eastern DRC

To understand Metawar’s real-world dynamics, this section examines three distinct theaters—Ethiopia, Ukraine, and Eastern Democratic Republic of Congo—where cognitive operations have played pivotal roles in conflict dynamics.

4.1. Ethiopia: Disinformation and Psychological Disruption in Tigray

During the Tigray conflict, both Ethiopian federal forces and the Tigray People’s Liberation Front (TPLF) employed wide-reaching disinformation campaigns. The Eritrean and Ethiopian administrations reportedly engaged in state-sponsored efforts to manipulate social media narratives, crafting deepfake content and deploying bot networks that spread fear and confusion (Puybareau, 2022). Civil society testimonies and OSINT investigations revealed targeted WhatsApp messages aimed at sowing discord and demoralizing Tigrayan communities, effectively fragmenting trust in local leadership (Claverie & du Cluzel, 2022).
This distortion of collective narratives had strategic consequences, influencing combatant morale, refugee flows, and international media responses—hallmarks of Metawar as described by Moleka (2025).

4.2. Ukraine: Cognitive Dominance and Narrative Resilience

Since the Russian invasion in 2022, Ukraine has endured sophisticated campaigns of hybrid warfare combining cyberattacks, disinformation, and psychological operations (Rid, 2020). Russian state media and affiliated troll farms bombarded Ukrainian and international social media with false claims designed to weaken national resolve, obscure battlefield realities, and shape external perceptions.
Ukraine responded with robust narrative defense efforts—fact-checking platforms, citizen-first info hubs (e.g., Diia), and the integration of “info-patriots” within military and civil institutions. These measures demonstrated resilience and adaptability, reflecting aspects of a national cognitive sovereignty doctrine.
This engagement aligns with Moleka’s strategic vision: defense operations today must defend a nation’s mind-space, not just its physical boundaries.

4.3. Eastern DRC: Narrative Subversion Amid Armed Conflict

In Eastern DRC, groups like the M23 and their regional backers have systematically used digital media to influence public perception. Dissident narratives—assembled via WhatsApp, Facebook, and radio—solidify support among communities while undermining national authority (Puybareau, 2022). These actors push selective narratives claiming national army (FARDC) abuses and ethnic threats, inflaming tensions and pursuing strategic psychological conflict.
Often, national media and civil society have been slow to mount counter-narratives, allowing M23-aligned messaging to persist and distort civic understanding. This grey-zone information warfare, involving manipulation of real-time events, synthetic visuals, and rumor engineering, fits squarely within the contours of Metawar.

4.4. Comparative Analysis of Metawar in Practice

Each instance demonstrates Metawar’s principles:
  • Multi-vector adaptability
  • Emergence in digitally connected societies
  • Narrative control as strategic leverage
These case studies underscore the potency of cognitive conflict. Ethiopia reveals the use of deep-tech disinformation. Ukraine demonstrates resilience through narrative defense. Eastern DRC illustrates low-tech, yet effective, cognitive subversion within fragile state contexts—all validating Moleka’s conceptualization of Metawar.
Table 1. Comparative overview of Metawar tactics and impacts across the three case studies.

Case        | Actors                | Cognitive Tactics                                         | Impact
Ethiopia    | Federal govt & TPLF   | Deepfakes, bot-driven WhatsApp campaigns                  | Demoralization, trust breakdowns
Ukraine     | Russia vs Ukraine     | Troll farms, combined cyber and disinformation operations | Mobilization, narrative defense in action
Eastern DRC | M23, FARDC, NGO media | Rumor operations, radio/WhatsApp pushes                   | Local legitimacy erosion, institutional fatigue

5. Cognitive Sovereignty and National Security Doctrine

As Metawar expands its reach across digital and psychological spaces, nations must reimagine security beyond kinetic defense. This section explores the concept of cognitive sovereignty, compares Western doctrinal frameworks and Moleka’s strategic model, and proposes a cognitive-first defense doctrine suited to 21st-century threats.

5.1. Defining Cognitive Sovereignty

Cognitive sovereignty refers to a state’s capacity to control and shield its collective cognitive environment—that is, the narratives, perceptions, and decision-making processes of its population. As Moleka (2025) articulates: “Control of a nation’s mind-space is as vital as control of its territory; sovereignty now lies in the cognitive domain.”
This requires state mechanisms for narrative integrity, media hygiene, and public resilience against manipulative operations.

5.2. Western Approaches to Cognitive Warfare Doctrine

NATO’s exploration of cognitive warfare underscores its multidimensional nature, spanning AI, neurotech, and strategic communications (Bernal et al., 2020; Claverie & du Cluzel, 2022). Similarly, Binnendijk and Marler (2021) emphasize cognition as both a target and battlefield within emerging defense architectures. While these doctrines lay a conceptual foundation, they often presume homogeneous, robust societies and do not address vulnerabilities specific to developing states.

5.3. Moleka’s Cognitive Doctrine for Emerging States

Moleka (2025) adapts cognitive warfare principles for contexts characterized by state fragility and sociopolitical diversity. His five-pillar framework includes:
  • Anticipation – Early-warning systems using AI to detect manipulative campaigns.
  • Strategic Narrative Protection – Curating and defending official narratives in media ecosystems.
  • Coordinated Counter-Influence – Rapid, agile response teams aligned across institutions.
  • Cognitive Recovery – Public trust restoration via information correction and psychological support.
  • Interagency Integration – Institutional cooperation across defense, intelligence, and civil society.
This doctrine stresses transparency, ethical boundaries, and civic engagement, aiming to preserve democratic norms while resisting cognitive attacks.

5.4. Institutional Architecture for Cognitive Defense

Practical implementation of cognitive sovereignty requires:
  • A National Cognitive Defense Command – centralizing strategy, communications, and intelligence.
  • Media literacy programs – housed in the education ministry, building resilience from an early age.
  • Live monitoring units – for real-time detection and counter-narrative deployment.
  • Legal mechanisms – to regulate disinformation without suppressing dissent.
Countries such as Lithuania have pioneered innovative approaches to countering hybrid threats (European Parliament, 2020), offering templates for emerging states.

5.5. Ethical, Legal, and Operational Safeguards

Defensive cognitive measures carry risks—censorship, political misuse, or societal manipulation. Moleka (2025) proposes:
  • Strict legal limits, independent oversight (judiciary or civil society bodies)
  • Exemptions for academic and journalistic freedoms
  • Mechanisms ensuring defensive, not offensive, use of cognitive tools
These align with broader principles in AI ethics and democratic governance (Floridi & Cowls, 2019).
By reframing cognitive sovereignty as central to modern defense, this section bridges theoretical insights and practical frameworks. It shows how emerging powers can design proactive, ethical strategies to safeguard national consciousness in the face of Metawar.

6. Operational Framework for Metawar Defense

To effectively counter the multidimensional threats of Metawar, states require a structured defense framework incorporating anticipatory, protective, responsive, and integrative measures. This section outlines a five-pillar doctrine, along with recommended organizational models to operationalize cognitive sovereignty.

6.1. Pillar I: Anticipation—Early-Warning and Detection

Objective: Identify emerging cognitive threats before they escalate.
  • AI-powered monitoring of social media, messaging apps (e.g., WhatsApp), and alternative platforms to detect shifts in narrative sentiment or bot activity (Taddeo & Floridi, 2018).
  • Collaborations with academic institutions and cybersecurity firms to map ideological influence campaigns.
  • Use of sentinel indicators—surges in emotive content, deepfake incidents, or coordinated troll activity—as warning signs (Puybareau, 2022).
Institutional suggestion: A National Cognitive Threat Center under cybersecurity or defense leadership, staffed with data scientists, linguists, and psychologists.
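The sentinel-indicator logic of Pillar I can be illustrated with a minimal sketch: a rolling-baseline detector that flags sudden surges in the hourly volume of emotive or bot-like content. This is not part of Moleka's framework or any deployed system; the window size, z-score threshold, and sample counts are illustrative assumptions only.

```python
from collections import deque

def make_surge_detector(window=24, z_threshold=3.0):
    """Flag hourly counts that spike far above the recent baseline.

    A surge in emotive posts or coordinated bot activity is treated
    as a sentinel indicator (window and threshold are illustrative).
    """
    history = deque(maxlen=window)

    def check(count):
        if len(history) < window:          # still building a baseline
            history.append(count)
            return False
        mean = sum(history) / len(history)
        var = sum((x - mean) ** 2 for x in history) / len(history)
        std = var ** 0.5 or 1.0            # guard against zero variance
        is_surge = (count - mean) / std > z_threshold
        history.append(count)
        return is_surge

    return check

# Hypothetical hourly counts of emotionally charged posts:
detect = make_surge_detector(window=6, z_threshold=3.0)
for c in [100, 104, 98, 101, 99, 103]:     # quiet baseline period
    detect(c)
print(detect(102))   # ordinary hour -> False
print(detect(400))   # campaign-like spike -> True
```

In practice such a signal would be one input among many (deepfake incidence, troll coordination patterns) feeding the analysts of a threat center, not an automated verdict.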

6.2. Pillar II: Strategic Narrative Protection

Objective: Maintain integrity and resonance of state narratives.
  • Regularly updated content validated by fact-checking, open-source intelligence (OSINT), and third-party review.
  • Integrated official messaging across television, radio, social media, and community networks.
  • Strategic partnerships with diaspora journalists and influencers to extend narrative reach (Zuboff, 2019).
Institutional suggestion: A Strategic Narrative Unit within the communications ministry or national broadcasting authority.

6.3. Pillar III: Coordinated Counter-Influence

Objective: Mount active and agile responses to cognitive incursions.
  • Deploy rapid reaction cell(s) with legal authority and interagency support to rebut disinformation in near-real-time.
  • Employ narrative framing, contextual corrections, and emotional resilience campaigns.
  • Launch digital literacy initiatives (e.g., hackathons, civic labs) to inoculate citizens against manipulation (Floridi & Cowls, 2019).
Institutional suggestion: A Counter-Influence Task Force with members from military, intelligence, foreign affairs, academia, civil society, and tech sectors.

6.4. Pillar IV: Cognitive Recovery and Adaptation

Objective: Repair social trust and reinforce resilience post-attack.
  • Psychological support for communities affected by narrative crises (e.g., trauma debriefings).
  • Public transparency reports on disinformation incidents and countermeasures.
  • Continuous evaluation and calibration of narrative strategies—assessing resonance, emotional impact, and effectiveness.
Institutional suggestion: Regional Recovery Cells nested within local governance structures.

6.5. Pillar V: Interagency Integration and Governance

Objective: Ensure cohesive and effective structural response.
  • Establish a National Cognitive Security Council chaired by a senior defense or security official, with cross-ministry representation.
  • Develop a Cognitive Defense Strategy, updated biennially, encompassing legal, ethical, technological, and financial dimensions.
  • Promote international sharing of threat intelligence and best practices through AU, UN, or bilateral mechanisms.
Institutional suggestion: Embed liaison positions in key ministries (defense, intel, media regulation, education) within the Cognitive Security Council.

6.6. Organizational Archetypes

Countries like Lithuania, the UK, and Singapore have pioneered multi-agency structures tackling narrative and influence threats (European Parliament, 2020; Pollitt, 2023). Emerging states may adopt streamlined cabinet-level units or hybrid models tailored to resource constraints and governance systems.
Table 2. Implementation Roadmap (Timeline).

Phase   | Actions                                                    | Duration
Phase 1 | Initial capacity audit; legal framework drafting           | 0–3 months
Phase 2 | Pilot threat monitoring and rapid-cell operation           | 3–9 months
Phase 3 | Scale to national roll-out; interagency doctrine finalized | 9–18 months
Phase 4 | Full operational capability; regional coordination begins  | 18–36 months
By systematizing the five-pillar framework and leveraging contemporary best practices, nations can operationalize Metawar defense—protecting human cognition as a strategic domain. The next section will present policy-level recommendations for embedding this framework into national and regional security architectures.

7. Policy Recommendations for African and Global Defense Systems

A robust defense posture against Metawar requires strategic action across policy, institutional design, education, and international cooperation. These recommendations aim to assist African governments and global partners in operationalizing cognitive sovereignty:

7.1. Institutionalize Metawar Doctrine and Capacity

  • National Cognitive Defense Doctrine: Governments should formalize a doctrine integrating the five-pillar framework (anticipation, narrative protection, counter-influence, recovery, interagency), aligned with security, legal, and ethical norms.
  • Dedicated cyber-cognitive agencies: Establish ministerial-level units with cross-cutting authority, staffed with cybersecurity experts, psychologists, data scientists, linguists, and civil society liaisons.
  • Legislative foundation: Draft legal frameworks specifying permissible measures, oversight structures, and safeguards to prevent misuse, drawing on models such as Lithuania’s Center for Defence of Internet Freedom (European Parliament, 2020).

7.2. Enhance Technological and Human Capabilities

  • AI and neurocognitive research: Fund interdisciplinary centers combining neuroscience, AI, and media studies to refine early-warning systems and resilience interventions.
  • Social media collaboration: Formalize partnerships with platforms (e.g., Meta, X) to share data, co-develop detection protocols, and curate content pipelines for rapid narrative corrections.
  • Training and education: Integrate cognitive warfare modules into military colleges and intelligence academies; support continuous training for journalists and civil society in media verification and counter-messaging.

7.3. Promote Media and Digital Literacy

  • National literacy campaigns: Deploy school-to-community programs teaching citizens how to identify misinformation, algorithmic manipulation, and common cognitive biases.
  • Partnership with educational institutions: Integrate critical thinking and digital literacy into curricula across secondary and tertiary levels, supported by UNESCO’s frameworks (UNESCO, 2021).
  • Public resilience initiatives: Launch citizen-led “digital resilience hubs” to encourage communal fact-checking and evidence-based dialogue.

7.4. Foster Regional and Multilateral Cooperation

  • Regional Metawar coalitions: Through AU or ECOWAS mechanisms, create joint early-warning platforms, shared narrative toolkits, and coordinated rapid-counter cells.
  • Global cognitive security forums: Expand dialogues in the UN, OSCE, and African Union to address cross-border influence, disinformation and cognitive security.
  • Technical partnerships: Encourage Africa–EU–US collaboration on R&D for deepfake detection, adversarial AI, and attribution methods.

7.5. Uphold Ethical, Legal, and Human Rights Standards

  • Surveillance accountability: Mandate transparency measures—with judicial review and civil-society oversight—for all cognitive surveillance activities.
  • Privacy safeguards: Enact data protection regulations limiting data collection and algorithmic profiling, in alignment with international standards like GDPR.
  • Ethical offensive constraints: Prohibit state-sponsored offensive cognitive operations targeting citizens or foreign publics—aligning with emerging cyber norms and Geneva Conventions protocols.

7.6. Invest in Monitoring, Reporting & Evaluation

  • Cognitive incident registry: Centralize documentation of manipulative campaigns, public censorship events, and corrective interventions.
  • Performance dashboards: Employ metrics like misinformation spread, trust indices, and restoration times to evaluate effectiveness.
  • Continuous adaptation: Maintain regular review cycles to incorporate new tactics, address emerging threats, and refine strategic posture.

References

  1. Bernal, A., Carter, C., Singh, I., Cao, K., & Madreperla, O. (2020). Cognitive warfare: An attack on truth and thought. Johns Hopkins University.
  2. Binnendijk, A., & Marler, M. (2021). Cognitive warfare and military strategy. RAND Corporation.
  3. Clarke, R. A., & Knake, R. K. (2010). Cyber war: The next threat to national security and what to do about it. Ecco.
  4. Claverie, B., & du Cluzel, F. (2022). "Cognitive warfare": The advent of the concept of "cognitics" in the field of warfare. In B. Claverie, B. Prébot, N. Buchler, & F. du Cluzel (Eds.), Cognitive warfare: The future of cognitive dominance (pp. 1–7). NATO Collaboration Support Office.
  5. DARPA. (2007). Cognitive Technology Threat Warning Systems (CT2WS) (BAA 07-25).
  6. European Parliament. (2020). Information warfare in cyber domain: A threat to democracy (Study PE 659.034). Brussels: European Parliament.
  7. Floridi, L., & Cowls, J. (2019). A unified framework of five principles for AI in society. Harvard Data Science Review.
  8. Giordano, J., et al. (2017). The neuroweapons threat. Bulletin of the Atomic Scientists, 73(6), 326–334.
  9. Moleka, P. (2025). Métaguerre : Comment gagner les guerres du futur. GRIN Verlag.
  10. National Research Council. (2009). Opportunities in neuroscience for future Army applications. Washington, DC: The National Academies Press.
  11. Pollitt, B. (2023). Nation-state responses to hybrid threats: Emerging practices. Institute for Strategic Studies.
  12. Pujol, I. (2025). Cognitive warfare: The new battlefield exploiting our brains. Polytechnique Insights.
  13. Puybareau, C. (2022). Les nouvelles guerres de l'information. Paris: VA Press.
  14. Rid, T. (2013). Cyber war will not take place. Oxford University Press.
  15. Rid, T. (2020). Active measures: The secret history of disinformation and political warfare. Farrar, Straus and Giroux.
  16. Schmorrow, D., & McBride, D. (2004). Augmented cognition: Improving warfighter information intake under stress. International Journal of Human–Computer Interaction.
  17. Slater, M., & Sanchez-Vives, M. V. (2016). Enhancing our lives with immersive virtual reality. Frontiers in Robotics and AI, 3, 74.
  18. Taddeo, M., & Floridi, L. (2018). How AI can be a force for good. Science, 361(6404), 751–752.
  19. UNESCO. (2021). Education for sustainable development and digital citizenship. Paris: UNESCO.
  20. Wired. (2010, November). Air Force looks to artificially overwhelm enemy cognitive capabilities.
  21. Wired. (2012, December). Hacking the human brain: The next domain of warfare.
  22. Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.