Engineering


Article
Engineering
Telecommunications

Stefano Cunietti

,

Víctor Monzonís Melero

,

Chiara Sammarco

,

Ilaria Ferrando

,

Domenico Sguerso

,

Juan V. Balbastre

Abstract: In urban environments, the accuracy of traditional Global Navigation Satellite System (GNSS) positioning can be compromised by signal occlusion and multipath interference, particularly during critical operational phases such as drone take-off and landing. This study seeks to enhance drone positioning accuracy by integrating 4G and 5G communication antennas and applying multilateration (MLAT) techniques based on Time-Difference-of-Arrival (TDOA) and Angle-of-Arrival (AOA) measurements. The research focuses on a real-world case study in the metropolitan area of Valencia, Spain, where extensive mobile network data were analysed using the Cramér-Rao Lower Bound (CRLB) to identify zones with minimal positioning errors and, separately, optimal coverage for connectivity. The results suggest that integrating terrestrial antennas could enhance drone navigation; however, the approach's applicability currently remains limited to urban areas.
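The CRLB computation behind this kind of error mapping can be sketched as follows, assuming independent TDOA measurements referenced to a common anchor with equal timing noise (a simplification: the antenna layout, noise level, and reference choice are illustrative, not the Valencia data set):

```python
import numpy as np

C = 3e8  # speed of light, m/s

def tdoa_crlb(anchors, target, sigma_t):
    """CRLB on 2-D position error for TDOA multilateration.

    anchors : (M, 2) antenna positions; anchor 0 is the TDOA reference.
    target  : (2,) position where the bound is evaluated.
    sigma_t : timing-noise standard deviation (s), assumed i.i.d. per TDOA.
    Returns sqrt(trace) of the position-error covariance bound, in metres.
    """
    anchors = np.asarray(anchors, float)
    target = np.asarray(target, float)
    diff = target - anchors
    dist = np.linalg.norm(diff, axis=1)
    u = diff / dist[:, None]            # unit vectors anchor -> target
    # Jacobian of TDOA measurements t_i = (d_i - d_0) / c, i = 1..M-1.
    J = (u[1:] - u[0]) / C
    # Fisher information under i.i.d. Gaussian timing noise.
    fim = J.T @ J / sigma_t**2
    return float(np.sqrt(np.trace(np.linalg.inv(fim))))

# Four hypothetical base stations around a 1 km x 1 km urban cell.
bs = [(0, 0), (1000, 0), (0, 1000), (1000, 1000)]
err = tdoa_crlb(bs, target=(400, 600), sigma_t=10e-9)  # 10 ns timing noise
print(f"CRLB position error bound: {err:.1f} m")
```

Evaluating this bound over a grid of candidate positions is what produces the "zones with minimal positioning errors" the abstract refers to; the i.i.d.-noise assumption ignores the correlation TDOAs inherit from sharing a reference anchor.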

Article
Engineering
Telecommunications

Anoush Mirbadin

Abstract: This paper aims to maximize the information transmission rate by eliminating channel redundancy while still enabling reliable recovery of uncoded data. It is shown that parallel message-passing decoders can recover uncoded transmitted bits by increasing only the receiver-side computational complexity. In the proposed architecture, the $k$ transmitted information bits are embedded into a higher-dimensional linear block code at the receiver, and appropriately valued log-likelihood ratios (LLRs) are assigned to the parity positions. One-shot parallel decoding is performed across all hypotheses in the codebook, and the final decision is obtained by minimizing an orthogonality-based energy criterion between the decoded vector and the complement of the code space. For a fixed $(8,24)$ linear block code, the decoding behavior is investigated as a function of the parity-bit LLR magnitude. Increasing the parity LLR magnitude introduces an artificial reliability that improves hypothesis separation in the code space and yields a sharper waterfall region in the bit-error-rate (BER) curves. This increase in parity LLR also induces a systematic rightward shift of the BER curves, which does not correspond to a physical noise reduction and must therefore be compensated for fair performance comparison. After proper compensation, it is observed that increasing the parity LLR improves decoding performance up to a point where it can surpass the performance of conventional LDPC decoding with iterative processing. In principle, arbitrarily strong decoding performance can be approached by increasing the parity LLR magnitude; however, the maximum usable value is limited by numerical instabilities in practical message-passing implementations. 
Overall, the results demonstrate that strong decoding performance can be achieved without transmitting redundancy or employing high-dimensional coding at the transmitter, relying instead on receiver-side processing and controlled parity reliability over an additive white Gaussian noise (AWGN) channel.

Article
Engineering
Telecommunications

Xiuxia Cai

,

Chenyang Diwu

,

Ting Fan

,

Wenjing Wang

,

Jinglu He

Abstract: Remote sensing image super-resolution (RSISR) aims to reconstruct high-resolution images from low-resolution observations to enhance the visual quality and usability of remote sensing data. Real-world RSISR is challenging owing to diverse degradations such as blur, noise, compression, and atmospheric distortions. We propose a hierarchical multi-task super-resolution framework comprising degradation-aware modeling, dual-decoder reconstruction, and static regularization-guided generation. Specifically, the degradation-aware module adaptively characterizes multiple types of degradation and provides effective conditional priors for reconstruction. The dual-decoder design incorporates both convolutional and Transformer branches to balance local detail preservation with global structural consistency. Moreover, the static regularization-guided generation introduces prior constraints such as total variation and gradient consistency to improve robustness to varying degradation levels. Extensive experiments on two public remote sensing datasets show that our method achieves performance that is robust against varying degradation conditions.
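As a small illustration of the static priors mentioned above, the (anisotropic) total-variation penalty on an image is the sum of absolute horizontal and vertical neighbor differences; this is a generic sketch, not the authors' implementation:

```python
import numpy as np

def total_variation(img):
    """Anisotropic total variation: sum of absolute neighbor differences.
    Flat regions contribute nothing; noise and sharp texture raise the score,
    which is why TV works as a smoothness regularizer in reconstruction."""
    img = np.asarray(img, float)
    dv = np.abs(np.diff(img, axis=0)).sum()  # vertical gradients
    dh = np.abs(np.diff(img, axis=1)).sum()  # horizontal gradients
    return dv + dh

flat = np.full((16, 16), 0.5)
noisy = flat + 0.1 * np.random.default_rng(0).standard_normal((16, 16))
print(total_variation(flat), total_variation(noisy))
```

Adding such a term to a reconstruction loss penalizes noisy outputs while leaving smooth regions untouched.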

Article
Engineering
Telecommunications

Yasir Al-Ghafri

,

Hafiz M. Asif

,

Zia Nadir

,

Naser G. Tarhuni

Abstract: In this paper, a wireless network architecture is considered that combines double Intelligent Reflecting Surfaces (IRSs), Energy Harvesting (EH), and Non-Orthogonal Multiple Access (NOMA) with cooperative relaying (C-NOMA), to enhance the performance of Non-Line-of-Sight (NLoS) communication and incorporate energy efficiency into next-generation networks. To optimize the phase shifts of both IRSs, we employ a machine learning model that offers a low-complexity alternative to traditional optimization methods. This lightweight learning-based approach is introduced to predict effective IRS phase configurations without relying on solver-generated labels or repeated iterations. The model learns from channel behaviour and system observations, which allows it to react rapidly under dynamic channel conditions. Numerical analysis demonstrates the validity of the proposed architecture in providing considerable improvements in terms of spectral efficiency and service reliability through the integration of energy harvesting and relay-based communication, compared to conventional systems, thereby facilitating green communication systems.

Article
Engineering
Telecommunications

Giuseppina Rizzi

,

Vittorio Curri

Abstract: The constant growth of IP data traffic, driven by sustained annual increases surpassing 26%, is pushing current optical transport infrastructures towards their capacity limits. Since the deployment of new fiber cables is economically demanding, ultra-wideband transmission is emerging as a promising cost-effective solution, enabled by multi-band amplifiers and transceivers spanning the entire low-loss window of standard single-mode fibers. In this scenario, accurate modeling of the frequency-dependent fiber parameters is essential to reliably predict optical signal propagation. In particular, the combined impact of attenuation slope and inter-channel stimulated Raman scattering (SRS) fundamentally shapes the power evolution of wide wavelength division multiplexing (WDM) combs and directly affects nonlinear interference (NLI) generation. In this work, a set of analytical approximations for the frequency-dependent attenuation and Raman gain coefficient is presented, providing an effective balance between computational efficiency and physical fidelity. Through extensive simulations covering C, C+L, and ultra-wideband U-to-E transmission scenarios, the accuracy of the proposed approximations in reproducing the power evolution and NLI profiles of fully numerical SRS models is demonstrated.

Article
Engineering
Telecommunications

Rafe Alasem

,

Mahmud Mansour

Abstract: Vehicle Ad-Hoc Networks (VANETs) face critical challenges in trust management, privacy preservation, and scalability, particularly with the integration of 5G networks in Intelligent Transportation Systems (ITS). Traditional centralized trust models present single points of failure and privacy concerns that compromise network security and user anonymity. This paper presents a novel decentralized trust model leveraging blockchain technology, Interplanetary File System (IPFS) integration, and post-quantum cryptographic algorithms to address these limitations. Our proposed TrustChain-VANETs framework implements advanced privacy-preserving encryption techniques including threshold and homomorphic encryption, geographical sharding for scalability, and edge-assisted consensus mechanisms. Performance evaluation demonstrates significant improvements: 40% reduction in authentication latency (90-120 ms vs. 150-300 ms), 90% malicious node detection rate (+15% improvement), 300% increase in transaction throughput (2000-2150 TPS), and 100% scalability enhancement supporting up to 5000 nodes. The system integrates seamlessly with 5G network slicing (URLLC, eMBB, mMTC) while maintaining quantum resistance through CRYSTALS-Dilithium, KYBER, and FALCON algorithms. Real-world deployment considerations including OBU computational constraints, standardization gaps, and energy efficiency are comprehensively analyzed. Results indicate that the proposed decentralized approach provides robust security, enhanced privacy, and improved scalability for next-generation vehicular networks, making it suitable for large-scale ITS deployment.

Article
Engineering
Telecommunications

Rafe Alasem

Abstract: The rapid proliferation of smart cities and Intelligent Transportation Systems (ITS) demands revolutionary approaches to vehicular communications that can simultaneously address energy efficiency, security, and quality of service requirements. This paper presents GreenFlow VANET, a novel 5G-enabled secure and energy-efficient routing protocol specifically designed for smart city deployments. Our approach leverages a three-tier architecture integrating Vehicle Ad-Hoc Networks (VANETs) with 5G Ultra-Reliable Low-Latency Communication (URLLC), enhanced Mobile Broadband (eMBB), and massive Machine Type Communication (mMTC) network slices. The GreenFlow Secure Routing Protocol (GF-5G-SRP) introduces MEC-assisted route discovery, multi-criteria next-hop selection incorporating 5G quality metrics, and adaptive energy management techniques. Our security framework employs ECC-256 cryptography, ChaCha20-Poly1305 encryption, and distributed trust management to ensure robust protection against vehicular network threats while preserving location privacy through SUPI/SUCI mechanisms. Extensive simulations using NS-3 with 5G-LENA and SUMO mobility models demonstrate that GreenFlow VANET achieves 96.8% packet delivery ratio, 59% energy reduction compared to traditional approaches, and 81% improvement in network lifetime while maintaining sub-millisecond latency for safety-critical communications. The proposed solution effectively addresses the scalability challenges of dense urban vehicular networks with up to 1000 vehicles while providing comprehensive security with 97.8% attack detection rates and minimal false positives.

Review
Engineering
Telecommunications

Panagiotis K. Gkonis

,

Anastasios Giannopoulos

,

Nikolaos Nomikos

,

Lambros Sarakis

,

Vasileios Nikolakakis

,

Gerasimos Patsourakis

,

Panagiotis Trakadas

Abstract: The goal of the study presented in this work is to analyze recent advances in the context of the computing continuum and meta-operating systems (meta-OSs). The term continuum covers a variety of diverse hardware and computing elements as well as network protocols, ranging from lightweight Internet of Things (IoT) components to more complex edge or cloud servers. The rapid penetration of IoT technology in modern networks, along with the associated applications, poses new challenges for efficient application deployment over heterogeneous network infrastructures. These challenges involve, among others, the interconnection of a vast number of IoT devices and protocols, proper resource management, and threat protection and privacy preservation. Hence, unified access mechanisms, data management policies, and security protocols are required across the continuum to support the vision of seamless connectivity and diverse device integration. This task becomes even more important as discussions on sixth-generation (6G) networks, which are envisaged to coexist with IoT applications, are already taking place. Therefore, in this work the most significant technological approaches to satisfying the aforementioned challenges and requirements are presented and analyzed. A proposed architectural approach is also presented and discussed, which takes into consideration all key players and components in the continuum. In the same context, indicative use cases and scenarios enabled by a meta-OS in the computing continuum are discussed as well. Finally, open issues and related challenges are also discussed.

Article
Engineering
Telecommunications

Galia Marinova

,

Edmond Hajrizi

,

Besnik Qehaja

,

Vassil Guliashki

Abstract: This study presents the development of a smart microgrid control framework. The goal is to achieve optimal energy management and maximize photovoltaic (PV) generation utilization through a combination of optimization and reinforcement learning techniques. A detailed Simulink model is developed in MATLAB to represent the dynamic behavior of the microgrid, including load variability, temperature profiles, and solar radiation. Initially, a genetic algorithm (GA) is used to perform static optimization and parameter tuning, identifying optimal battery charging/discharging schedules and balancing power flow between buildings in the microgrid to minimize main grid dependency. After that, a Soft Actor-Critic (SAC) reinforcement learning agent is trained to perform real-time maximum power point tracking (MPPT) for the PV system under different environmental (weather) and load conditions. The SAC agent learns from multiple (eight) simulated PV generation scenarios and demand profiles, optimizing the duty cycle of the DC-DC converter to adaptively maintain maximum energy yield. The combined GA-SAC approach is validated on a university campus microgrid consisting of four interconnected buildings with heterogeneous loads, including computer labs that generate both active and reactive power demands. The results show improved efficiency, reduced power losses, and improved energy autonomy of the microgrid, illustrating the potential of AI-driven control strategies for sustainable smart energy systems.
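The static GA stage can be sketched in miniature: a real-coded GA searches for an hourly battery schedule that minimizes energy drawn from the main grid. The PV and load profiles, battery power limit, and fitness function below are made-up toy values, not the campus microgrid model:

```python
import numpy as np

rng = np.random.default_rng(42)
HOURS = 24
# Toy profiles: a daytime PV bell curve (kW) and a noisy base load (kW).
pv = np.clip(np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, HOURS)), 0, None) * 50
load = 30 + 10 * rng.random(HOURS)
P_BATT = 20.0  # max charge/discharge power, kW

def grid_import(schedule):
    """Energy drawn from the grid; schedule > 0 means battery discharge.
    A penalty keeps the schedule roughly energy-neutral over the day."""
    net = load - pv - schedule
    return np.clip(net, 0, None).sum() + abs(schedule.sum())

def ga_minimize(fitness, dim, pop_size=40, gens=200):
    pop = rng.uniform(-P_BATT, P_BATT, (pop_size, dim))
    for _ in range(gens):
        cost = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(cost)[: pop_size // 2]]  # truncation selection
        # Blend crossover between random elite parents + Gaussian mutation.
        pa, pb = (elite[rng.integers(len(elite), size=pop_size)] for _ in range(2))
        w = rng.random((pop_size, 1))
        pop = np.clip(w * pa + (1 - w) * pb
                      + rng.normal(0, 0.5, (pop_size, dim)), -P_BATT, P_BATT)
        pop[0] = elite[0]  # elitism: never lose the best individual
    cost = np.array([fitness(ind) for ind in pop])
    return pop[cost.argmin()], cost.min()

best, best_cost = ga_minimize(grid_import, HOURS)
print(f"grid import with GA schedule: {best_cost:.1f} kWh vs "
      f"no battery: {grid_import(np.zeros(HOURS)):.1f} kWh")
```

The GA ends up charging during the midday PV surplus and discharging at night, which is the qualitative behavior the static optimization stage is meant to find.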

Article
Engineering
Telecommunications

Anastasia Daraseliya

,

Eduard Sopin

,

Julia Kolcheva

,

Vyacheslav Begishev

,

Konstantin Samouylov

Abstract: Modern 5G-grade low-power wide-area network (LPWAN) technologies such as Narrowband Internet-of-Things (NB-IoT) operate using a multi-channel slotted ALOHA algorithm in the random access phase. As a result, the random access phase in such systems is characterized by relatively low throughput and is highly sensitive to traffic fluctuations that could push the system outside of its stable operational regime. Although theoretical results specifying the optimal transmission probability that maximizes the successful preamble transmission probability have long been known, the lack of knowledge about the current offered traffic load at the base station (BS) makes maintaining the optimal throughput a challenging task. In this paper, we propose and analyze a new reactive access barring scheme for NB-IoT systems based on machine learning (ML) techniques. Specifically, we first demonstrate that knowing the number of user equipment (UE) devices experiencing a collision at the BS is sufficient to draw conclusions about the current offered traffic load. Then, we show that utilizing ML-based techniques, one can safely differentiate between events in the Physical Random Access Channel (PRACH) at the BS side based only on the signal-to-noise ratio (SNR). Finally, we mathematically characterize the delay experienced under the proposed reactive access barring technique. In our numerical results, we show that by utilizing modern machine learning approaches, such as the XGBoost classifier, one can precisely differentiate between events on the PRACH channel with accuracy reaching 0.98 and then associate them with the number of UEs competing at the random access phase. Our simulation results show that the proposed approach can keep the successful preamble transmission probability constant at approximately 0.3 in overloaded conditions, whereas for conventional NB-IoT access this value is less than 0.05.
The proposed scheme thus achieves near-optimal throughput in multi-channel slotted ALOHA by adjusting the transmission probability dynamically in response to the estimated traffic load. This congestion control keeps the access delay controlled and bounded even under offered loads that would otherwise push the system beyond its stable operating regime.
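The classical slotted-ALOHA result underlying the optimal transmission probability can be checked numerically: with N contending UEs each transmitting with probability p, the per-slot success probability N·p·(1-p)^(N-1) peaks at p = 1/N and approaches 1/e ≈ 0.37 for large N (the UE count below is an arbitrary illustration):

```python
import numpy as np

def slot_success(p, n):
    """P(exactly one of n contending UEs transmits): n * p * (1-p)^(n-1)."""
    return n * p * (1 - p) ** (n - 1)

n = 50                                   # contending UEs (illustrative)
ps = np.linspace(1e-3, 0.5, 5000)
best_p = ps[np.argmax(slot_success(ps, n))]
peak = slot_success(1 / n, n)
print(f"optimal p ~= {best_p:.4f} (1/n = {1 / n}), peak success ~= {peak:.3f}")
```

This is why estimating the number of contending UEs matters: an access-barring scheme that throttles the effective transmission probability toward 1/N can hold the success probability near the ~0.37 ceiling, whereas uncontrolled access collapses once N grows.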

Article
Engineering
Telecommunications

Alessandro Vizzarri

Abstract: The Satellite Internet of Things (IoT) sector is undergoing rapid transformation, driven by breakthroughs in satellite communications and the pressing need for seamless global coverage, especially in remote and poorly connected regions. In locations where terrestrial infrastructure is limited or non-existent, Low Earth Orbit (LEO) satellites are proving to be a game-changing solution, delivering low-latency and high-throughput links well-suited for IoT deployments. While North America currently dominates the market in terms of revenue, the Asia-Pacific region is projected to lead in growth rate. Nevertheless, the development of satellite IoT networks still faces hurdles, including spectrum regulation and international policy alignment. In this evolving landscape, the LoRa and LoRaWAN protocols have been enhanced to support direct communication with LEO satellites, typically operating at altitudes between 500 km and 2,000 km. This paper offers a comprehensive review of current research on LoRa/LoRaWAN technologies integrated with LEO satellite systems, also providing a performance assessment of this combined architecture in terms of theoretical achievable bitrate, Bit Error Rate (BER), and path loss.
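For a rough sense of the link budgets such LoRa-to-LEO studies evaluate, free-space path loss in dB is 20·log10(d_km) + 20·log10(f_MHz) + 32.45; the EU868 carrier and 550 km slant range below are illustrative choices, not figures from the paper:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss (Friis formula) in dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

# LoRa EU868 uplink to a LEO satellite at an assumed 550 km slant range.
loss = fspl_db(550, 868)
print(f"FSPL: {loss:.1f} dB")
```

A path loss around 146 dB is roughly 35 dB more than a long terrestrial LoRa link, which is why satellite LoRa work leans on the protocol's very low receiver sensitivity and high spreading factors.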

Article
Engineering
Telecommunications

Alex T. de Cock Buning

,

Ivan Vidal

,

Francisco Valera

Abstract: Connecting distributed applications across multiple cloud-native domains is growing in complexity. Applications have become containerized and fragmented across heterogeneous infrastructures, such as public clouds, edge nodes and private data centers, including emerging IoT-driven environments. Existing networking solutions like CNI plugins and service meshes have proven insufficient for providing isolated, low-latency and secure multi-cluster communication. By combining SDN control with Kubernetes abstractions, we present L2S-CES, a Kubernetes-native solution for multi-cluster layer-2 network slicing that offers flexible isolated connectivity for microservices while maintaining performance and automation. In this work, we detail the design and implementation of L2S-CES, outlining its architecture and operational workflow. We experimentally validate L2S-CES against state-of-the-art alternatives and show superior isolation, reduced setup time, native support for broadcast and multicast, and minimal performance overhead. By addressing the current lack of native link-layer networking capabilities across multiple Kubernetes domains, L2S-CES provides a unified and practical foundation for deploying scalable, multi-tenant, and latency-sensitive cloud-native applications.

Article
Engineering
Telecommunications

Charalampos Papapavlou

,

Konstantinos Paximadis

,

Dan M Marom

,

Ioannis Tomkos

Abstract: Emerging services such as artificial intelligence (AI), 5G, the Internet of Things (IoT), cloud data services and teleworking are growing exponentially, pushing bandwidth needs to the limit. Space Division Multiplexing (SDM) in the spatial domain, along with Ultra-Wide Band (UWB) transmission in the spectral domain, represent two degrees of freedom that will play a crucial role in the evolution of backbone optical networks. SDM and UWB technologies necessitate the replacement of conventional Wavelength-Selective-Switch (WSS)-based architectures with innovative optical switching elements capable of handling both higher port counts and flexible switching across various granularities. In this work, we introduce a novel Photonic Integrated Circuit (PIC)-based switching element called the flex-Waveband Selective Switch (WBSS), designed to provide flexible band switching across the UWB spectrum (~21 THz). The proposed flex-WBSS supports a hierarchical three-layered Multi-Granular Optical Node (MG-ON) architecture incorporating optical switching across granularities ranging from entire fibers and flexibly defined bands down to individual wavelengths. To evaluate its performance, we develop a custom network simulator, enabling a thorough analysis of all the node's critical performance metrics. Simulations are conducted over an existing network topology evaluating three traffic-oriented switching policies: Full Fiber Switching (FFS), Waveband Switching (WBS), and Wavelength Switching (WS). Simulation results reveal high Optical Signal-to-Noise Ratio (OSNR) and low Bit Error Rate (BER) values, particularly under the FFS policy. In contrast, the WBS policy bridges the gap between existing WSS-based and future FFS-based architectures, relaxing bandwidth demands and mitigating capacity bottlenecks, thereby enabling rapid and scalable network upgrades in existing infrastructures.
Additionally, we propose a probabilistic framework to evaluate the node's bandwidth utilization and scaling behavior, exploring tradeoffs among scalability, component count, and complexity. The proposed framework can be easily adapted for the design of future optical transport networks. Finally, we perform a SWaP-C (Size, Weight, Power and Cost) analysis to underscore the advantages of the proposed architecture. Our results confirm that the proposed node incorporating PIC-based flex-WBSSs is a cost- and power-efficient solution compared to its counterparts, as it reduces initial network deployment and upgrade costs while minimizing power resource requirements.

Article
Engineering
Telecommunications

Burke Geceyatmaz

,

Fatma Tansu Hocanın

Abstract: This research presents a proposed hybrid routing protocol for Vehicular Ad-hoc Networks (VANETs), designed to address the performance trade-offs inherent in purely reactive Ad hoc On Demand Distance Vector (AODV) and proactive Optimized Link State Routing (OLSR) routing paradigms. The purpose of the research is to seamlessly integrate the strengths of AODV and OLSR into a single, context-aware framework. A significant finding is the co-design of a dynamic transmission power control mechanism that works in concert with the routing logic to adapt to fluctuating vehicle densities in real-time, effectively mitigating intermittent connectivity and the high latency characteristic of large-scale Intelligent Transportation Systems (ITS). Rigorous evaluation under high-fidelity mobility scenarios (using NS-3 and SUMO with real-world traffic patterns) confirms the protocol's efficacy. The significant findings demonstrate substantial performance enhancements over established baseline protocols, consistently achieving a Packet Delivery Ratio (PDR) exceeding 90%, maintaining an end-to-end delay below the critical 40 ms threshold, and realizing per-node energy savings of up to 60%. The conclusion is that this work provides a validated foundation for a highly reliable and efficient communication fabric, significantly enhancing the dependability of mission-critical ITS services and promoting the development of scalable, sustainable next-generation transportation networks.

Article
Engineering
Telecommunications

Dipon Saha

,

Illani Mohd Nawi

Abstract: The rapidly growing number of electric vehicles (EVs) and autonomous vehicles is driving demand for the development of 5G and connected vehicle technologies. However, the design of compact, multi-band vehicular antennas supporting multiple communication standards is complex. Traditional experience-based and parameter-sweeping approaches to antenna optimization are often inefficient and limited in scalability, while machine learning-based methods require extensive datasets and are computationally intensive. This study proposes a microstrip planar Yagi antenna optimized for Sub-6 GHz 5G and C-V2X communication. To approach antenna optimization with lower computing cost and less data, a hybrid optimization strategy is presented that combines parametric analysis with curve-fitting-based data visualization. The proposed antenna exhibits reflection coefficients of -31.68 dB and -29.36 dB with 700 MHz and 900 MHz bandwidths at 3.5 GHz and 5.9 GHz, respectively. Moreover, the proposed antenna exhibits a peak gain of 7.55 dB with a size of 0.44 × 0.64 λ², while achieving a peak efficiency of 90.1%. The antenna has been integrated and simulated in a model Mini Cooper to test its effectiveness for vehicular communication.
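A standard RF identity puts the quoted matching figures in context: a reflection coefficient in dB converts to a linear magnitude and VSWR as below (a generic conversion, not code from the study):

```python
import math

def s11_to_vswr(s11_db):
    """Convert a reflection coefficient from dB to linear |Gamma| and VSWR."""
    gamma = 10 ** (s11_db / 20)          # linear reflection magnitude
    vswr = (1 + gamma) / (1 - gamma)
    return gamma, vswr

for s11 in (-31.68, -29.36):             # values reported at 3.5 GHz and 5.9 GHz
    gamma, vswr = s11_to_vswr(s11)
    print(f"S11 = {s11} dB -> |Gamma| = {gamma:.4f}, VSWR = {vswr:.3f}")
```

Both quoted S11 values correspond to a VSWR below 1.1, i.e. a well-matched antenna at both operating bands.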

Review
Engineering
Telecommunications

Mohammed Zaid

,

Rosdiadee Nordin

,

Ibraheem Shayea

Abstract: The rapid integration of unmanned aerial vehicles (UAVs) into next-generation wireless systems demands seamless and reliable handover (HO) mechanisms to ensure continuous connectivity. However, frequent topology changes, high mobility, and dynamic channel variations make traditional HO schemes inadequate for UAV-assisted 6G networks. This paper presents a comprehensive review of existing HO optimization studies, emphasizing artificial intelligence (AI) and machine learning (ML) approaches as enablers of intelligent mobility management. The surveyed works are categorized into three main scenarios: non-UAV HOs, UAVs acting as aerial base stations, and UAVs operating as user equipment, each examined under traditional rule-based and AI/ML-based paradigms. Comparative insights reveal that while conventional methods remain effective for static or low-mobility environments, AI- and ML-driven approaches significantly enhance adaptability, prediction accuracy, and overall network robustness. Emerging techniques such as deep reinforcement learning and federated learning (FL) demonstrate strong potential for proactive, scalable, and energy-efficient HO decisions in future 6G ecosystems. The paper concludes by outlining key open issues and identifies future directions toward hybrid, distributed, and context-aware learning frameworks for resilient UAV-enabled HO management.

Review
Engineering
Telecommunications

Shujat Ali

,

Asma Abu-Samah

,

Mohammed H. Alsharif

,

Rosdiadee Nordin

,

Nauman Saqib

,

Mohammed Sani Adam

,

Umawathy Techanamurthy

,

Manzareen Mustafa

,

Nor Fadzilah Abdullah

Abstract: The next generation of wireless communication, 6G, promises a leap beyond the advances of 5G, aiming not only to increase speed but also to redefine how people, machines, and environments interact. This paper examines the evolution from 5G Advanced to 6G through a detailed review of 3GPP Releases 15-20, outlining the progression from enhanced mobile broadband to intelligent services supporting holographic communication, remote tactile interaction, and immersive XR applications. Three foundational service pillars are identified in this evolution: immersive communication, everything connected, and high-precision positioning. These advances enable transformative use cases such as virtual surgery, cooperative drone swarms, and AI-driven agriculture, demanding innovations in spectrum utilization (including sub-THz bands), AI-native network architectures, and energy-efficient device ecosystems. Future networks are expected to deliver peak data rates up to 1 Tbps, localization accuracy below 10 cm, and device densities reaching 10 million devices per km², while sustaining end-to-end latency under 1 ms. Across Releases 15-20, 3GPP has progressively standardized capabilities for XR, positioning, scheduling, and sustainability, while initiatives such as RedCap, Ambient IoT, and NTN extend connectivity toward global, low-power, and cost-effective coverage. Supported by programs like Hexa-X and the Next G Alliance, 6G is positioned as a fundamental redesign of wireless communication centered on intelligence, adaptability, inclusivity, and sustainability.

Article
Engineering
Telecommunications

Spyridon Louvros

,

AnupKumar Pandey

,

Brijesh Shah

,

Yashesh Buch

Abstract: This paper introduces a novel methodology that integrates 6G wireless Federated Edge Learning (FEEL) frameworks with optimization strategies spanning from the MAC layer to the Physical layer. In the context of mobile edge computing, ensuring robust channel estimation within the 6G network slicing paradigm presents critical challenges, particularly in managing data retransmissions. Inaccurate updates from distributed 6G devices can undermine the reliability of Federated Learning (FL), affecting its overall performance. To address this, we propose an AI/ML-assisted algorithm for global optimization in FL-based 6G networks, where the decision-making process leverages radial basis functions (RBF) to assess options based on learned preferences rather than relying on direct evaluations of the objective function.
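The role of the radial basis functions can be illustrated with a plain Gaussian RBF surrogate that scores candidate options from previously observed samples instead of evaluating the true objective directly. The 1-D toy objective, length scale, and sample count below are illustrative stand-ins, not the paper's actual setup:

```python
import numpy as np

def fit_rbf(x_train, y_train, length=1.0):
    """Fit a Gaussian RBF interpolant: weights w solve Phi w = y."""
    d = x_train[:, None] - x_train[None, :]
    phi = np.exp(-(d / length) ** 2)
    # Tiny diagonal jitter keeps the kernel system numerically solvable.
    w = np.linalg.solve(phi + 1e-10 * np.eye(len(x_train)), y_train)

    def predict(x):
        return np.exp(-((x[:, None] - x_train[None, :]) / length) ** 2) @ w

    return predict

# Pretend each training point is one expensive objective evaluation.
x = np.linspace(0, 2 * np.pi, 15)
surrogate = fit_rbf(x, np.sin(x))
grid = np.linspace(0, 2 * np.pi, 500)
best = grid[np.argmin(surrogate(grid))]  # option the surrogate prefers
print(f"surrogate minimizer: {best:.3f} (true minimizer: {3 * np.pi / 2:.3f})")
```

The surrogate is cheap to query everywhere once fitted, which is the point of RBF-based decision-making when each true objective evaluation (here, a full FL round) is expensive.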

Review
Engineering
Telecommunications

Evelio Astaiza Hoyos

,

Héctor Fabio Bermúdez-Orozco

,

Jorge Alejandro Aldana-Gutierrez

Abstract: The advent of sixth-generation (6G) communications envisions a paradigm of ubiquitous intelligence and seamless physical–digital fusion, demanding unprecedented performance from the optical transport infrastructure. Achieving terabit-per-second capacities, microsecond latency, and nanosecond synchronisation precision requires a convergent, flexible, open, and AI-native x-Haul architecture that integrates communication with distributed edge computing. This study conducts a systematic literature review of recent advances, challenges, and enabling optical technologies for intelligent and autonomous 6G networks. Using the PRISMA methodology, it analyses sources from IEEE, ACM, and major international conferences, complemented by standards from ITU-T, 3GPP, and O-RAN. The review examines key optical domains including Coherent PON (CPON), Spatial Division Multiplexing (SDM), Hollow-Core Fibre (HCF), Free-Space Optics (FSO), Photonic Integrated Circuits (PICs), and reconfigurable optical switching, together with intelligent management driven by SDN, NFV, and AI/ML. The findings reveal that achieving 6G transport targets will require synergistic integration of multiple optical technologies, AI-based orchestration, and nanosecond-level synchronisation through Precision Time Protocol (PTP) over fibre. However, challenges persist regarding scalability, cost, energy efficiency, and global standardisation. Overcoming these barriers will demand strategic R&D investment, open and programmable architectures, early AI-native integration, and sustainability-oriented network design to make optical fibre a key enabler of the intelligent and autonomous 6G ecosystem.

Article
Engineering
Telecommunications

Steven O. Awino

,

Bakhe Nleya

Abstract: The indoor low-voltage power line network is characterized by highly irregular interferences, where background noise coexists with bursty impulsive noise originating from household appliances and switching events. Traditional monofractal noise models often fail to reproduce the clustering, intermittency, and long-range dependence seen in measurement data. In this paper, a Multifractal Random Walk (MRW) framework tailored for Power Line Communication (PLC) noise modelling is developed. MRW is the continuous-time limit of discrete-time random walks with stochastic log-normal variance. As such, the formulated MRW framework introduces a stochastic volatility component that modulates Gaussian increments, thus generating heavy-tailed statistics and multifractal scaling laws which are consistent with the measured PLC noise data. Empirical validation is done through structure function analysis and covariance of log-amplitudes, both of which reveal scaling characteristics that align well with MRW simulated predictions. The proposed model captures both the bursty nature and correlation structure of impulsive PLC noise more effectively than conventional monofractal approaches, thereby providing a mathematically grounded framework for accurate noise generation and robust system-level performance evaluation of PLC networks.
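A minimal sketch of the volatility-modulated construction follows: Gaussian increments are multiplied by the exponential of an approximately log-correlated field. Here that field is built as a dyadic cascade, which is only a discrete surrogate for the true MRW limit, and the depth and intermittency parameters are illustrative:

```python
import numpy as np

def mrw_like_noise(n_levels=11, lam=0.2, seed=0):
    """Heavy-tailed, clustered noise: eps_k * exp(omega_k), where omega is an
    approximately log-correlated Gaussian field built as a dyadic cascade
    (independent Gaussian offsets added over coarser and coarser blocks)."""
    rng = np.random.default_rng(seed)
    n = 2 ** n_levels
    omega = np.zeros(n)
    for j in range(n_levels):
        # One Gaussian value per block of length 2**j, repeated across the block.
        omega += np.repeat(rng.normal(0.0, lam, n // 2 ** j), 2 ** j)
    eps = rng.standard_normal(n)
    # Subtracting the field's variance keeps the overall amplitude scale tame.
    return eps * np.exp(omega - omega.var())

x = mrw_like_noise()
kurt = np.mean(x ** 4) / np.mean(x ** 2) ** 2
print(f"sample kurtosis: {kurt:.1f} (a Gaussian would give 3)")
```

The cascade makes nearby samples share volatility, producing the clustered bursts and heavy tails that monofractal Gaussian models miss; the measured kurtosis well above 3 is the simplest signature of that.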



Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2025 MDPI (Basel, Switzerland) unless otherwise stated