Computer Science and Mathematics

Article
Computer Science and Mathematics
Geometry and Topology

Aymane Touat

Abstract: We study a purely local inverse problem for non-reversible Randers metrics \( F = \|\cdot\|_g + \beta \) defined on smooth oriented surfaces. Using only the lengths of sufficiently small closed curves around a point \( p \), we prove that the exterior derivative \( d\beta(p) \) can be uniquely and stably recovered. Moreover, we establish that \( d\beta(p) \) is the only second-order local invariant retrievable from such local length measurements. Our approach is entirely metric-based, independent of geodesic flows or boundary data, and naturally extends to general curved surfaces.
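The recoverability claim can be sketched with a standard Stokes-type computation (a heuristic only; the paper's uniqueness and stability analysis is not reproduced here). For a small closed curve \( \gamma \) bounding a disk \( D \) around \( p \),
\[
L_F(\gamma) = L_g(\gamma) + \int_\gamma \beta = L_g(\gamma) + \int_D d\beta,
\qquad
L_F(\gamma) - L_F(-\gamma) = 2\int_D d\beta \approx 2\, b(p)\,\mathrm{Area}_g(D),
\]
where \( d\beta = b\,\omega_g \) and \( \omega_g \) is the Riemannian area form; the orientation-antisymmetric part of small-loop lengths therefore isolates \( d\beta(p) \) as the loop shrinks to \( p \).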

Concept Paper
Computer Science and Mathematics
Geometry and Topology

Amir Hameed Mir

Abstract: We present the Atemporal Tablet Framework (ATF), a complete geometric ontology that derives spacetime, quantum mechanics, and gravity from a single mathematical structure. The universe is modeled as a fiber bundle \( \pi : T \to M \), where \( T \) is a static higher-dimensional manifold and \( M \) is emergent 3+1D spacetime. Temporal dynamics arise from projection operators \( \Pi_t : T \to M \) extremizing a projective action \( S_\Pi \). Quantum states are epistemic distributions over fibers, with the Born rule emerging naturally via measure disintegration. Measurement corresponds to topological phase-locking without wavefunction collapse. Einstein’s equations arise as equations of motion for \( \Pi_t \), while quantum fields emerge as fiber vibrations. The framework makes specific testable predictions: sidereal anisotropy in qubit decoherence \( \varepsilon = 1.23 \times 10^{-8} \pm 3 \times 10^{-9} \) (derived from holographic scaling) and modified dispersion relations at scale \( E_P / \sqrt{\varepsilon} \). We prove a reconstruction theorem establishing that spacetime observations can determine the underlying geometry, and demonstrate that Standard Model particle content emerges naturally from the fiber geometry \( F_x \cong \mathbb{CP}^3 \times S^5 / \Gamma \). ATF provides a mathematically rigorous, experimentally falsifiable foundation for quantum gravity that resolves long-standing interpretational issues while making concrete predictions testable with current technology.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Suraj Arya, Nisha Soni, Sahimel Azwal Bin Sulaiman, Dedek Andrian

Abstract: Fruits are an integral part of our diet, supplying a range of proteins and vitamins. The apple is a major fruit consumed globally; it is a multipurpose fruit used in the preparation of various food products as well as in medicines, which makes analyzing its future prices important. India is one of the world's largest producers of apples, so forecasting apple prices in Indian agricultural markets is particularly relevant. Machine learning and deep learning models have not previously been applied to this Indian dataset. Several time series models, including Long Short-Term Memory (LSTM), SARIMA, and ETS, were developed; LSTM performed much better than the other models, with the lowest error rates (MAE of 554.08, RMSE of 752.10, and MAPE of 6.63 percent). The proposed study thus addresses a real-life problem, and its results can ultimately inform agricultural policy making and smart market strategies.
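The abstract does not include implementation details; the following is a minimal Keras sketch of the kind of univariate LSTM price forecaster described, where the file name, the "modal_price" column, and the window length of 12 are all assumptions for illustration.

    # Minimal univariate LSTM forecasting sketch (file/column names and window are hypothetical).
    import numpy as np
    import pandas as pd
    from tensorflow import keras

    def make_windows(series, window=12):
        """Slice a 1-D price series into (samples, window, 1) inputs and next-step targets."""
        X, y = [], []
        for i in range(len(series) - window):
            X.append(series[i:i + window])
            y.append(series[i + window])
        return np.array(X)[..., None], np.array(y)

    df = pd.read_csv("apple_mandi_prices.csv")            # hypothetical file name
    prices = df["modal_price"].to_numpy(dtype="float32")  # hypothetical column name
    X, y = make_windows(prices)
    split = int(0.8 * len(X))

    model = keras.Sequential([
        keras.layers.LSTM(64, input_shape=(X.shape[1], 1)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mae")
    model.fit(X[:split], y[:split], epochs=50, batch_size=32, verbose=0)

    pred = model.predict(X[split:]).ravel()
    mae = np.mean(np.abs(pred - y[split:]))
    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    mape = np.mean(np.abs((pred - y[split:]) / y[split:])) * 100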

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Kieran Greer

Abstract: This paper describes a new auto-associative network called a Unit-Merge Network, so called because novel compound keys are used to link two nodes in one layer with one node in the next layer. Unit nodes at the base store integer values that can represent binary words. The word size is critical and specific to the dataset, and it also provides a first level of consistency over the input patterns. A second cohesion network then links the list of unit nodes through novel compound keys that create layers of decreasing dimension, until the top layer contains only one node for any pattern. A pattern can therefore be found using a search-and-compare technique through the memory network. The Unit-Merge network is compared with a Hopfield network and a Sparse Distributed Memory (SDM). It is shown that the memory requirements are not unreasonable and that it has a much larger capacity than a discrete Hopfield network, for example. It can store sparse data, deal with noisy input, and its O(log n) complexity compares favourably with these networks. This is demonstrated with test results for four benchmark datasets. Apart from the unit size, the rest of the configuration is automatic, and its simple design could make it an attractive option for some applications.
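One possible reading of the merge structure described (two nodes per compound key, one node in the next layer, repeated until a single top node) is sketched below; this is an illustrative interpretation, not the author's implementation, and the carry rule for odd layers is an assumption.

    # Sketch of a merge hierarchy: each layer pairs adjacent node values under a
    # compound (tuple) key mapped to one node in the next layer, until one node remains.
    def build_merge_layers(unit_values):
        layers = [list(enumerate(unit_values))]   # layer 0: (node id, integer unit value)
        key_tables = []                           # one dict of compound keys per merge step
        current = [v for _, v in layers[0]]
        while len(current) > 1:
            table, nxt = {}, []
            for i in range(0, len(current) - 1, 2):
                key = (current[i], current[i + 1])     # compound key over 2 nodes ...
                table.setdefault(key, len(table))      # ... mapped to 1 node in the next layer
                nxt.append(table[key])
            if len(current) % 2:                       # assumed rule: carry an unpaired node up
                nxt.append(current[-1])
            key_tables.append(table)
            layers.append(list(enumerate(nxt)))
            current = nxt
        return layers, key_tables

    layers, tables = build_merge_layers([3, 7, 3, 7, 1, 4])
    print(len(layers[-1]))   # 1: a single top node identifies the stored pattern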

Concept Paper
Computer Science and Mathematics
Computer Science

Abhigyan Mukherjee

Abstract: The increasing adoption of contactless transactions has introduced new security challenges, particularly in protecting data from unauthorized access and fraudulent activities. This study presents a novel authentication framework designed to enhance the security of near-field communication (NFC) transactions while maintaining efficiency for real-time processing. By leveraging a lightweight cryptographic approach, the proposed system mitigates threats such as relay attacks and unauthorized skimming. Experimental analysis demonstrates improvements in authentication reliability and resistance to common attack vectors. This research contributes to the development of more secure and scalable contactless payment solutions.

Concept Paper
Computer Science and Mathematics
Information Systems

Abhigyan Mukherjee

Abstract: With the growing reliance on peer-to-peer (P2P) networks for digital transactions, traditional electronic payment systems require enhancements to ensure security, efficiency, and trust. This study introduces an innovative digital payment framework enabling currency-based exchanges between consumers and vendors within a peer-to-peer environment. The outlined approach is inspired by Millicent’s scrip methodology and leverages digital envelope encryption to bolster protection. Unlike conventional payment methods that heavily rely on financial institutions, the protocol minimizes their involvement, restricting their role to trust establishment and transaction finalization. The system introduces a distributed allocation model, where merchants locally authorize payments, reducing transaction overhead and enhancing scalability. Additionally, the protocol is optimized for repeated payments, making it particularly efficient for recurring transactions between the same buyer and merchant. By integrating cryptographic techniques and decentralizing payment authorization, this protocol presents a secure, efficient, and scalable solution for digital payments in P2P environments.
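The abstract names digital envelope encryption without a concrete construction; below is a minimal hybrid-encryption sketch (an AES-GCM payload key wrapped with the vendor's RSA public key) using the Python cryptography package. Key sizes and the scrip payload format are assumptions, not the protocol's specification.

    # Digital-envelope sketch: a symmetric key encrypts the payment payload,
    # and the vendor's RSA public key wraps that symmetric key.
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    vendor_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    vendor_pub = vendor_priv.public_key()

    def seal(payload: bytes, pub):
        key = AESGCM.generate_key(bit_length=128)
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, payload, None)
        wrapped_key = pub.encrypt(
            key,
            padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
        )
        return wrapped_key, nonce, ciphertext

    def open_envelope(wrapped_key, nonce, ciphertext, priv) -> bytes:
        key = priv.decrypt(
            wrapped_key,
            padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
        )
        return AESGCM(key).decrypt(nonce, ciphertext, None)

    env = seal(b'{"scrip_id": "abc", "amount": 5}', vendor_pub)   # illustrative payload only
    assert open_envelope(*env, vendor_priv) == b'{"scrip_id": "abc", "amount": 5}'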

Concept Paper
Computer Science and Mathematics
Computer Science

Abhigyan Mukherjee

Abstract: Near Field Communication (NFC) technology is increasingly being integrated into mobile devices, enabling applications such as contactless payments and public transportation access. This paper investigates the security architecture of NFC systems, focusing on mobile device implementations and the vulnerabilities they introduce. Various configurations for NFC’s Secure Element (SE), such as SD cards, multiple UICC slots, and shared SIM resources, are discussed, highlighting potential security challenges related to relay attacks, malware distribution, differential power analysis, and denial-of-service attacks. In particular, relay attacks and malware distribution are identified as significant threats that could compromise user security during transactions. The paper further explores countermeasures like two-factor authentication, distance-bounding protocols, and defensive cryptographic techniques to mitigate these risks. Additionally, it emphasizes the complexities introduced by trust issues between Mobile Network Operators (MNOs) and third-party providers in sharing secure resources. Finally, the research suggests that while NFC itself is relatively secure, applications built on top of this infrastructure are more prone to security risks. As NFC technology continues to evolve, ensuring robust security for its applications, particularly in the financial and healthcare sectors, will be critical to its widespread adoption.

Concept Paper
Computer Science and Mathematics
Security Systems

SravanaKumar Nidamanooru

Abstract: Identity and Access Management (IAM) increasingly relies on adaptive controls—step-up challenges, recovery verification, device and behavior signals, and continuous authorization—to reduce account takeover and misuse. At the same time, IAM systems must prepare for post-quantum cryptography (PQC) transitions that affect credentials, signing, and verification paths. These shifts create a practical governance problem: when an identity action is allowed, challenged, denied, or escalated (e.g., passwordless enrollment, recovery credential release, privileged step-up, or machine key rotation), teams must be able to explain why the decision happened, what evidence was considered, and how the decision can be independently verified later. This paper introduces Decision Receipts (DR): a verifiable, privacy-aware record produced at decision time that captures (i) policy context and versioning, (ii) normalized evidence descriptors (not raw personal data), (iii) action outcomes and reason codes, and (iv) cryptographic signatures supporting long-term auditability under PQC. We propose an open receipt schema, canonicalization rules, and verifier workflows using widely deployed identity standards (OAuth 2.0, OpenID Connect, JWT) and modern signing containers (JWS/COSE), with optional anchoring into transparency logs for tamper-evidence. The approach is intentionally IP-safe and adoptable as an audit overlay independent of any specific orchestrator implementation.
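The receipt fields below are illustrative rather than the paper's published schema, and the sketch signs with ES256 via PyJWT (stock PyJWT does not ship post-quantum signatures; a PQC deployment would swap in a PQ signing container as the abstract proposes).

    # Illustrative Decision Receipt signed as a JWS/JWT (hypothetical field names and issuer).
    import time, jwt
    from cryptography.hazmat.primitives.asymmetric import ec

    signing_key = ec.generate_private_key(ec.SECP256R1())

    receipt = {
        "iss": "https://idp.example/decision-engine",    # hypothetical issuer
        "iat": int(time.time()),
        "policy": {"id": "stepup-priv-access", "version": "2026-01-12"},
        "evidence": [                                     # normalized descriptors, no raw PII
            {"type": "device_posture", "result": "managed"},
            {"type": "behavior_score", "bucket": "low_risk"},
        ],
        "action": {"outcome": "challenged", "reason_codes": ["NEW_DEVICE", "PRIV_SCOPE"]},
    }

    token = jwt.encode(receipt, signing_key, algorithm="ES256")
    verified = jwt.decode(token, signing_key.public_key(), algorithms=["ES256"],
                          issuer="https://idp.example/decision-engine")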

Article
Computer Science and Mathematics
Algebra and Number Theory

Li An-Ping

Abstract: Some additional material on the estimation of \( H(n,m) \) has been added in the appendix.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Sijia Li, Yutong Wang, Yue Xing, Ming Wang

Abstract: This work addresses correlation bias and causal effect confounding in advertising recommendation systems and presents a causal learning–based recommendation framework. We first examine the limitations of conventional recommendation algorithms in complex advertising environments, where confounding variables and exposure bias often prevent models from capturing users’ true preferences. To tackle these issues, we design a unified embedding architecture that jointly represents user, advertisement, and contextual features, and incorporates a structural causal graph to explicitly model dependencies among variables. During model training, causal consistency regularization and inverse propensity weighting are integrated to mitigate the impact of biased exposure mechanisms and non-uniform sampling. A joint optimization objective is further formulated to couple click-through rate prediction with causal consistency estimation, enabling robust causal effect learning without sacrificing predictive accuracy. Extensive experiments on large-scale advertising datasets demonstrate that the proposed approach consistently outperforms several representative baselines in terms of Precision@10, Recall@10, NDCG@10, and MAP, while exhibiting strong robustness under multi-dimensional sensitivity analysis. Overall, this study highlights the practical value of causal modeling and consistency-aware learning in advertising recommendation and offers a computationally grounded approach for improving both interpretability and fairness in recommendation systems.
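The abstract couples click-through-rate prediction with inverse propensity weighting and a causal-consistency term; a minimal PyTorch sketch of such a joint objective is given below. The weighting scheme and the regularizer form are assumptions, not the paper's exact losses.

    # Joint objective sketch: IPW-weighted click loss plus an assumed consistency penalty.
    import torch
    import torch.nn.functional as F

    def joint_loss(click_logits, clicks, propensities, factual_pred, counterfactual_pred,
                   lam=0.1, eps=1e-3):
        # Inverse-propensity-weighted binary cross-entropy over observed exposures.
        w = 1.0 / propensities.clamp(min=eps)
        ipw_bce = (w * F.binary_cross_entropy_with_logits(
            click_logits, clicks, reduction="none")).mean()
        # Consistency penalty: predictions under the observed and an intervened
        # (do-operator style) context should not diverge arbitrarily.
        consistency = F.mse_loss(factual_pred, counterfactual_pred)
        return ipw_bce + lam * consistency

    logits = torch.randn(8)
    clicks = torch.randint(0, 2, (8,)).float()
    prop = torch.rand(8) * 0.5 + 0.25
    loss = joint_loss(logits, clicks, prop, torch.rand(8), torch.rand(8))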

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Al Khan

Abstract: The rapid evolution of data-driven fields demands educational paradigms that transition from static analysis to dynamic interaction with live information. This paper presents a novel technical framework, the Dual-Agent Curator-Tutor (DACT), which integrates Artificial Intelligence as a concurrent Real-Time Data Curator and Interactive Tutor within Immersive Analytics (IA) learning environments. The DACT framework features two synergistic AI agents: a Curation Agent that dynamically ingests, filters, and contextualizes live data streams (e.g., IoT, financial feeds) for pedagogical alignment, and a Tutoring Agent that provides adaptive, scaffolded instruction based on multimodal analysis of learner behavior within an immersive visualization space (VR/AR). This creates a closed-loop ecosystem where the data landscape and instructional guidance co-adapt in real-time to the learner’s actions. We detail a modular architecture implementing this model, utilizing perturbation-based learning for adaptive curation—inspired by recent optimization techniques—and a rule-based pedagogical engine. We propose a rigorous quantitative evaluation methodology involving controlled experiments to measure gains in analytical proficiency, cognitive load reduction, and behavioral patterns. The paper argues that this seamless integration of automated data management and personalized tutoring within an immersive context represents a transformative advancement for experiential learning, effectively leveraging technology to offload cognitive overhead and elevate higher-order analytical reasoning skills.

Article
Computer Science and Mathematics
Security Systems

Marco Rinaldi, Elena Conti, Giovanni Ferraro

Abstract: Traditional kernel fuzzers rely on coarse-grained coverage metrics that cannot reflect complex microarchitectural behaviors. We present a hardware-assisted fuzzing framework that leverages branch buffer telemetry from modern CPUs (LBR, BTB sampling) to refine fuzzing feedback. A model-based inference algorithm aggregates branch-data patterns to estimate microarchitectural novelty and guides seed prioritization. Experiments on Intel Ice Lake and AMD Zen 3 systems demonstrate a 27% improvement in unique path coverage, with 11 newly identified concurrency bugs across filesystem and scheduler subsystems. Compared with coverage-only fuzzing, our method reduces time-to-crash by 46% while keeping overhead below 12%. This work shows that microarchitectural-level signals can significantly boost kernel fuzzing’s effectiveness.
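The paper's model-based inference is more involved than shown here; the sketch below only illustrates the scheduling idea of scoring seeds by how many previously unseen branch n-grams their LBR traces contribute (n-gram length and addresses are placeholders).

    # Seed prioritization sketch: rank seeds by the novelty of their branch-telemetry traces.
    from collections import deque

    seen_ngrams = set()

    def novelty(lbr_trace, n=3):
        """Count branch-address n-grams in this trace that no earlier seed produced."""
        window, fresh = deque(maxlen=n), 0
        for addr in lbr_trace:
            window.append(addr)
            gram = tuple(window)
            if len(gram) == n and gram not in seen_ngrams:
                seen_ngrams.add(gram)
                fresh += 1
        return fresh

    def prioritize(seed_traces):
        """Order seed ids by descending microarchitectural novelty of their LBR traces."""
        scores = {sid: novelty(trace) for sid, trace in seed_traces.items()}
        return sorted(scores, key=scores.get, reverse=True)

    queue = prioritize({
        "seed_a": [0x401000, 0x401050, 0x4010a0, 0x401000],
        "seed_b": [0x402000, 0x402040, 0x402080, 0x4020c0, 0x402100],
    })
    print(queue)   # ['seed_b', 'seed_a']: seed_b contributed more unseen branch trigrams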

Article
Computer Science and Mathematics
Information Systems

Hyunwoo Choi, Jisoo Han, Minseo Park

Abstract: This study develops an adaptive workflow allocation mechanism for anti-money laundering (AML) operations, aiming to improve the accuracy and efficiency of suspicious-transaction review. A multi-agent simulation platform was constructed to model transaction flows, alert generation, and analyst decision behaviors. The system integrates model-confidence estimation, analyst-fatigue prediction, and real-time workload signals to dynamically route alerts. Experiments were conducted using 27.3 million historical transactions and 186,000 alerts from a large commercial financial dataset. Compared with fixed allocation rules, the adaptive mechanism increased alert-escalation precision from 0.32 to 0.46 and recall from 0.70 to 0.78, while reducing average handling time by 19.4%. The proportion of high-risk alerts processed within the target time window improved by 23.8%. These results demonstrate that workflow optimization can meaningfully enhance AML performance beyond model-level improvements.
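A minimal sketch of the kind of scoring-based alert router described (combining model confidence, analyst fatigue, and workload) is given below; the weights, thresholds, and escalation label are placeholders, not the study's calibrated values.

    # Adaptive alert-routing sketch: escalate high-confidence alerts, otherwise pick the
    # analyst with the best combined spare-capacity / low-fatigue score.
    from dataclasses import dataclass

    @dataclass
    class Analyst:
        name: str
        fatigue: float    # 0 (fresh) .. 1 (exhausted), e.g. from a fatigue-prediction model
        queue_len: int    # currently open alerts

    def route(alert_confidence: float, analysts, escalate_above=0.9, max_queue=20) -> str:
        if alert_confidence >= escalate_above:
            return "senior-review"                               # placeholder escalation path
        def score(a: Analyst) -> float:
            capacity = 1.0 - min(a.queue_len / max_queue, 1.0)
            return 0.6 * capacity + 0.4 * (1.0 - a.fatigue)      # placeholder weights
        return max(analysts, key=score).name

    team = [Analyst("a1", 0.2, 5), Analyst("a2", 0.7, 2), Analyst("a3", 0.1, 18)]
    print(route(0.46, team))   # "a1": best combined capacity/fatigue score in this example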

Short Note
Computer Science and Mathematics
Computer Vision and Graphics

Marcus Elvain, Howard Pellorin

Abstract: Generative models have achieved remarkable success in producing realistic images and short video clips, but existing approaches struggle to maintain persistent world coherence over long durations and across multiple modalities. We propose Multimodal Supervisory Graphs (MSG), a novel framework for world modeling that unifies geometry (3D structure), identity (consistent entities), physics (dynamic behavior), and interaction (user/agent inputs) in a single abstract representation. MSG represents the environment as a dynamic latent graph, factorized by these four aspects and trained with cross-modal supervision from visual (RGB-D), pose, and audio streams. This unified world abstraction enables generative AI systems to maintain consistent scene layouts, preserve object identities over time, obey physical laws, and incorporate interactive user prompts, all within one model. In our experiments, MSG demonstrates superior long-term coherence and cross-modal consistency compared to state-of-the-art generative video baselines, effectively bridging the gap between powerful short-term video generation and persistent, interactive world modeling. Our framework outperforms prior methods on metrics of identity consistency, physical plausibility, and multi-view geometry alignment, enabling new applications in extended reality and autonomous agent simulation.

Short Note
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Jori Winslett, Taryn Ellsworthy, Callan Everhart

Abstract: Traditional e-commerce visualization relies on static 3D spins or pre-rendered videos, which fail to convey material properties such as stiffness, flexibility, or sole compression. This lack of tactile feedback creates a "trust gap" for online buyers. In this paper, we introduce PhysiGen, a "What If?" viewer that allows users to apply virtual forces to product models. Leveraging a novel Action-Conditional Reconstruction technique, our system utilizes a physics-informed world model to generate short video sequences of deformation (e.g., shoe twisting, foam compression) based on user input. We demonstrate that this approach significantly increases buyer confidence by bridging the "tactile gap" in online shopping, achieving a 45% increase in user engagement compared to static viewers.

Article
Computer Science and Mathematics
Analysis

Branko Sarić

Abstract: On the basis of the isomorphic algebraic structures of the field of complex numbers ℂ and the 2-dimensional Euclidean field of real vectors V₂, expressed through identical geometric products of elements, this paper presents integral identities for scalar and vector fields in V₂ that are vector analogues of the well-known integral identities of complex analysis. In particular, Theorem 1, a generalized fundamental theorem of integral calculus in the field V₂, is the vector analogue of Cauchy's theorem of complex analysis. Special attention is therefore paid to the vector analogue of Cauchy's calculus of residues in the field V₂. Finally, at the very end of the paper, the algebraic structure of the 3D field of vectors V₃ is presented, together with the corresponding fundamental integral identities.
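For orientation (this is the standard complex-analysis side of the correspondence, not necessarily the paper's formulation of Theorem 1): writing \( f = u + iv \) and \( dz = dx + i\,dy \), Cauchy's theorem decomposes into two real line-integral identities,
\[
\oint_\gamma f\,dz \;=\; \oint_\gamma \bigl(u\,dx - v\,dy\bigr) \;+\; i\oint_\gamma \bigl(v\,dx + u\,dy\bigr) \;=\; 0,
\]
and by Green's theorem both real integrals vanish exactly when the Cauchy–Riemann equations \( u_x = v_y \), \( u_y = -v_x \) hold; read in V₂, they are the circulation and the flux of the vector field \( (u, -v) \) around \( \gamma \).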

Short Note
Computer Science and Mathematics
Computer Vision and Graphics

Brennan Sloane, Landon Vireo, Keaton Farrow

Abstract: High-fidelity telepresence requires the reconstruction of photorealistic 3D avatars in real-time to facilitate immersive interaction. Current solutions face a dichotomy: they are either computationally expensive multi-view systems (e.g., Codec Avatars) or lightweight mesh-based approximations that suffer from the "uncanny valley" effect due to a lack of high-frequency detail. In this paper, we propose Mono-Splat, a novel framework for reconstructing high-fidelity, animatable human avatars from a single monocular webcam video stream. Our method leverages 3D Gaussian Splatting (3DGS) combined with a lightweight deformation field driven by standard 2D facial landmarks. Unlike Neural Radiance Fields (NeRFs), which typically suffer from slow inference speeds due to volumetric ray-marching, our explicit Gaussian representation enables rendering at >45 FPS on consumer hardware. We further introduce a landmark-guided initialization strategy to mitigate the depth ambiguity inherent in monocular footage. Extensive experiments demonstrate that our approach outperforms existing NeRF-based and mesh-based methods in both rendering quality (PSNR/SSIM) and inference speed, presenting a viable, accessible pathway for next-generation VR telepresence.

Short Note
Computer Science and Mathematics
Computer Vision and Graphics

Landon Vireo, Brennan Sloane, Arden Piercefield, Greer Holloway, Keaton Farrow

Abstract: Diminished Reality (DR)—the ability to visually remove real-world objects from a live Augmented Reality (AR) feed—is essential for reducing cognitive load and decluttering workspaces. However, existing techniques face a critical challenge: removing an object creates a visual void ("hole") that must be filled with a plausible background. Traditional 2D inpainting methods lack temporal consistency, causing the background to flicker or slide as the user moves. In this paper, we propose Clean-Splat, a novel framework for real-time, multi-view consistent object removal. We leverage 3D Gaussian Splatting (3DGS) for scene representation and integrate a View-Consistent Diffusion Prior to hallucinate occluded background geometry and texture. Unlike previous NeRF-based inpainting which is prohibitively slow, our method updates the 3D scene representation in near real-time, enabling rendering at >30 FPS on consumer hardware. Extensive experiments on real-world cluttered scenes demonstrate that Clean-Splat achieves state-of-the-art perceptual quality (LPIPS) and temporal stability compared to existing video inpainting approaches.

Article
Computer Science and Mathematics
Software

Michael Dosis, Antonios Pliatsios

Abstract: This paper presents Sem4EDA, an ontology-driven and rule-based framework for automated fault diagnosis and energy-aware optimization in Electronic Design Automation (EDA) and Internet of Things (IoT) environments. The escalating complexity of modern hardware systems, particularly within IoT and embedded domains, presents formidable challenges for traditional EDA methodologies. While EDA tools excel at design and simulation, they often operate as siloed applications, lacking the semantic context necessary for intelligent fault diagnosis and system-level optimization. Sem4EDA addresses this gap by providing a comprehensive ontological framework developed in OWL 2, creating a unified, machine-interpretable model of hardware components, EDA design processes, fault modalities, and IoT operational contexts. We present a rule-based reasoning system implemented through SPARQL queries, which operates atop this knowledge base to automate the detection of complex faults such as timing violations, power inefficiencies, and thermal issues. A detailed case study, conducted via a large-scale trace-driven co-simulation of a smart city environment, demonstrates the framework’s practical efficacy: by analyzing simulated temperature sensor telemetry and Field-Programmable Gate Array (FPGA) configurations, Sem4EDA identified specific energy inefficiencies and overheating risks, leading to actionable optimization strategies that resulted in a 23.7% reduction in power consumption and 15.6% decrease in operating temperature for the modeled sensor cluster. This work establishes a foundational step towards more autonomous, resilient, and semantically-aware hardware design and management systems.
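The abstract describes SPARQL rules over an OWL 2 ontology; a sketch of how one such rule could be run with rdflib in Python follows. The namespace, property names, and the 85 °C threshold are assumptions for illustration and may differ from the published Sem4EDA ontology.

    # Illustrative overheating-detection rule over a hypothetical Sem4EDA-style graph.
    from rdflib import Graph, Literal, Namespace, RDF

    EDA = Namespace("http://example.org/sem4eda#")   # hypothetical namespace
    g = Graph()
    g.add((EDA.sensor42, RDF.type, EDA.TemperatureSensor))
    g.add((EDA.sensor42, EDA.reportsTemperature, Literal(91.5)))
    g.add((EDA.sensor42, EDA.attachedTo, EDA.fpgaClusterA))

    OVERHEAT_RULE = """
    PREFIX eda: <http://example.org/sem4eda#>
    SELECT ?sensor ?temp ?board WHERE {
        ?sensor a eda:TemperatureSensor ;
                eda:reportsTemperature ?temp ;
                eda:attachedTo ?board .
        FILTER (?temp > 85.0)
    }
    """

    for sensor, temp, board in g.query(OVERHEAT_RULE):
        print(f"Overheating risk: {sensor} on {board} at {temp} degC")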

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Tekin Ahmet Serel, Esin Merve Koç, Oğuz Uğur Aydın, Eda Uysal Aydın, Furkan Umut Kılıç

Abstract: Placental abruption is detachment of the placenta from the implantation site before delivery and can develop into a life-threatening emergency with acute clinical symptoms. The multifactorial nature of this disorder, and the absence of laboratory tests or procedures that can diagnose placental abruption, make it difficult to predict. Artificial intelligence (AI) and machine learning (ML) have the potential to enhance clinical decision-making and enable precise assessments. This study aimed to develop 15 predictive ML models for placental abruption, highlighting input characteristics, performance metrics, and validation. The medical records of 564 patients from 2021 to 2025 were analyzed to develop AI-based predictive models for placental abruption. Findings were analyzed with Python and the PyCaret library. The models integrated five variables (features) for the prediction. Among the 15 machine learning algorithms, logistic regression was chosen as the best model, with the following performance metrics: accuracy of 0.85, AUC of 0.91, recall of 0.85, precision of 0.85, and F1 score of 0.85. In the ranking of feature importance in the classification model, gestational age at delivery had the highest importance. Twenty-eight unseen cases were used as an additional validation step; the model achieved high accuracy on this set, with 21 cases correctly predicted. The 15 ML models presented in our study showed significant accuracy in predicting placental abruption, but they require further development before they can be applied in a clinical setting.
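The abstract names the PyCaret library for comparing the 15 classifiers; a minimal sketch of that workflow is shown below. The file and column names are hypothetical, since the study's five features are not listed in the abstract.

    # PyCaret classification workflow sketch matching the described pipeline (hypothetical data).
    import pandas as pd
    from pycaret.classification import setup, compare_models, finalize_model, predict_model

    df = pd.read_csv("abruption_cohort.csv")          # hypothetical file: 5 features + label

    setup(data=df, target="placental_abruption", session_id=42)
    best = compare_models()                            # ranks the available classifiers
    final = finalize_model(best)                       # refit the chosen model on all training data

    unseen = pd.read_csv("unseen_28_cases.csv")        # hypothetical hold-out file
    preds = predict_model(final, data=unseen)          # adds prediction_label / prediction_score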
