Articles for NET
- High Performance, Low Energy, and Trustworthy Blockchains Using Satellites (October 24, 2023 at 10:00 pm)
Abstract: Blockchains are meant to provide an append-only sequence (ledger) of transactions. Security commonly relies on a consensus protocol in which forks in the sequence are either prevented completely or are exponentially unlikely to last more than a few blocks. This monograph proposes the design of algorithms and a system to achieve high performance (transactions enter the blockchain within a few seconds of initiation), the absence of forks, and a very low energy cost (a per-transaction cost that is a factor of a billion or more lower than Bitcoin's).

The foundational component of this setup is a group of satellites whose blockchain protocol code can be verified and burned into read-only memory. Because such satellites can perhaps be destroyed but cannot be captured (unlike even fortified terrestrial servers), a reasonable assumption is that the blockchain protocol code in the satellites may fail to make progress, either permanently or intermittently, but will not be traitorous.

A second component of this setup is a group of terrestrial sites whose job is to broadcast information about blocks and to summarize the blockchain ledger. These can be individuals eager to earn a fee for service. Even if many of them behave traitorously (against their interests as fee collectors), a small number of honest ones is sufficient to ensure safety and liveness.

A third component of this setup is a Mission Control entity which acts very occasionally to assign roles to terrestrial sites and time slots to satellites. These assignments are multisigned using the digital signatures of a widely distributed group of human governors. A reasonable assumption about Mission Control is that, for reputational reasons, it will not send any signed message that would either contradict a previous message or attest to an incorrect affirmation. Because Mission Control needs to act very infrequently (to a first approximation, only when satellites fail), its actions can be carefully and publicly scrutinized.

Given these components and these reasonable assumptions, our protocol, called Bounce, achieves ledger functionality for arbitrarily sized blocks at under five seconds per block (based on experiments done with the International Space Station) and at negligible energy cost.

This monograph discusses the overall architecture and algorithms of such a system, the assumptions it makes, and the guarantees it gives.

Suggested Citation: Dennis Shasha, Taegyun Kim, Joseph Bonneau, Yan Michalevsky, Gil Shotan and Yonatan Winetraub (2023), "High Performance, Low Energy, and Trustworthy Blockchains Using Satellites", Foundations and Trends® in Networking: Vol. 13: No. 4, pp 252-325. http://dx.doi.org/10.1561/1300000070
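For intuition about the fork-free, quorum-governed design described above, the sketch below shows an append-only, hash-chained ledger together with a k-of-n quorum check on a Mission Control assignment. It is a hypothetical illustration, not the authors' Bounce implementation: the governor count, the quorum of three, and the use of HMAC as a stand-in for the governors' real digital signatures are all assumptions made for the example.

```python
# Hypothetical sketch (not the Bounce protocol itself): an append-only,
# hash-chained ledger in which a new block always extends the current tip,
# plus a quorum check on a multisigned Mission Control message.
import hashlib
import hmac
import json

GOVERNOR_KEYS = {f"gov{i}": f"secret-{i}".encode() for i in range(5)}  # assumed 5 governors
QUORUM = 3  # assumed signature threshold

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def sign(gov: str, payload: bytes) -> str:
    # HMAC stands in for a real digital signature purely for illustration.
    return hmac.new(GOVERNOR_KEYS[gov], payload, hashlib.sha256).hexdigest()

def quorum_ok(payload: bytes, sigs: dict) -> bool:
    valid = sum(
        1 for gov, s in sigs.items()
        if gov in GOVERNOR_KEYS and hmac.compare_digest(sign(gov, payload), s)
    )
    return valid >= QUORUM

class Ledger:
    def __init__(self):
        self.chain = [{"height": 0, "prev": "0" * 64, "txs": []}]  # genesis block

    def append(self, txs: list) -> dict:
        tip = self.chain[-1]
        block = {"height": tip["height"] + 1, "prev": block_hash(tip), "txs": txs}
        self.chain.append(block)
        return block

    def verify(self) -> bool:
        # Every block must link to its predecessor; a fork breaks the hash chain.
        return all(
            self.chain[i]["prev"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

if __name__ == "__main__":
    payload = b"assign slot 17 to satellite SAT-3"        # hypothetical assignment
    sigs = {g: sign(g, payload) for g in ["gov0", "gov1", "gov4"]}
    assert quorum_ok(payload, sigs)                        # Mission Control message accepted
    ledger = Ledger()
    ledger.append(["tx-a", "tx-b"])
    assert ledger.verify()
```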
- Contagion Source Detection in Epidemic and Infodemic Outbreaks: Mathematical Analysis and Network Algorithms (July 3, 2023 at 10:00 pm)
Abstract: The rapid spread of infectious diseases and that of online rumors share similarities in terms of speed, scale, and patterns of contagion. Although these two phenomena have historically been studied separately, the COVID-19 pandemic has highlighted the devastating consequences that simultaneous crises of epidemics and misinformation can have on the world. Soon after the outbreak of COVID-19, the World Health Organization launched a campaign against the COVID-19 Infodemic, which refers to the dissemination of pandemic-related false information online that causes widespread panic and hinders recovery efforts. Undoubtedly, nothing spreads faster than fear.

Networks serve as a crucial platform for viral spreading, as the actions of highly influential users can quickly render others susceptible to the same. The potential for contagion in epidemics and rumors hinges on the initial source, underscoring the need for rapid and efficient digital contact tracing algorithms to identify superspreaders or Patient Zero. Similarly, detecting and removing rumor mongers is essential for preventing the proliferation of harmful information in online social networks. Identifying the source of large-scale contagions requires solving complex optimization problems on expansive graphs. Accurate source identification and understanding of the dynamic spreading process require a comprehensive understanding of surveillance in massive networks, including topological structures and spreading veracity. Ultimately, the efficacy of algorithms for digital contact tracing and rumor source detection relies on this understanding.

This monograph provides an overview of the mathematical theories and computational algorithm design for contagion source detection in large networks. By leveraging network centrality as a tool for statistical inference, we can accurately identify the source of contagions, trace their spread, and predict future trajectories. This approach provides fundamental insights into surveillance capability and the asymptotic behavior of contagion spreading in networks. Mathematical theory and computational algorithms are vital to understanding contagion dynamics, improving surveillance capabilities, and developing effective strategies to prevent the spread of infectious diseases and misinformation.

Suggested Citation: Chee Wei Tan and Pei-Duo Yu (2023), "Contagion Source Detection in Epidemic and Infodemic Outbreaks: Mathematical Analysis and Network Algorithms", Foundations and Trends® in Networking: Vol. 13: No. 2-3, pp 107-251. http://dx.doi.org/10.1561/1300000068
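As a concrete illustration of centrality-based source inference, the sketch below estimates the origin of a spread on a small infection tree using rumor centrality, R(v) = n!/∏_u T_u(v), where T_u(v) is the size of the subtree rooted at u when the tree is rooted at v. This is a minimal example in the spirit of the monograph, not its full algorithm; the edge list and node labels are invented for the demonstration.

```python
# A minimal sketch: rumor-centrality-style source estimation on a tree of
# infected nodes. The node maximizing the (log) rumor centrality is returned
# as the estimated source.
import math
from collections import defaultdict

def subtree_sizes(tree, root):
    """Iterative post-order computation of subtree sizes with the tree rooted at `root`."""
    size, stack = {}, [(root, None, False)]
    while stack:
        node, parent, processed = stack.pop()
        if processed:
            size[node] = 1 + sum(size[c] for c in tree[node] if c != parent)
        else:
            stack.append((node, parent, True))
            for child in tree[node]:
                if child != parent:
                    stack.append((child, node, False))
    return size

def log_rumor_centrality(tree, v):
    sizes = subtree_sizes(tree, v)
    n = len(sizes)
    # log R(v) = log(n!) - sum_u log T_u(v); logs avoid huge factorials.
    return math.lgamma(n + 1) - sum(math.log(s) for s in sizes.values())

def estimate_source(edges):
    tree = defaultdict(list)
    for a, b in edges:
        tree[a].append(b)
        tree[b].append(a)
    return max(tree, key=lambda v: log_rumor_centrality(tree, v))

if __name__ == "__main__":
    # A small hypothetical infection tree: node 2 sits at the center of the spread.
    edges = [(0, 2), (1, 2), (2, 3), (3, 4), (2, 5)]
    print("estimated source:", estimate_source(edges))
```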
- In-Band Full-Duplex Radios in 6G Networks: Implementation and Applications (June 5, 2023 at 10:00 pm)
Abstract: Sixth-generation (6G) wireless communication networks will transform the connected things of 5G into connected intelligence. The networks can have human-like cognition capabilities by enabling many potential services, such as high-accuracy localization and tracking, augmented human sense, gesture and activity recognition, etc. For this purpose, many emerging applications in 6G have stringent requirements on transmission throughput and latency. With the explosion of devices in the connected intelligence world, spectrum utilization has to be enhanced to meet these stringent requirements. In-band full-duplex (IBFD) has been reported as a promising technique to enhance spectral efficiency and reduce end-to-end latency. However, simultaneous transmission and reception over the same frequency introduce additional interference compared to conventional half-duplex (HD) radios. The receiver is exposed to the transmitter of the same node operating in IBFD mode, causing self-interference (SI), which can be more than 100 dB stronger than the signal of interest from other nodes because of the proximity of the transceiver. Given this significant power difference between the SI and the signal of interest (SoI), the SI must be effectively suppressed to benefit from IBFD operation. In addition to SI, uplink users will interfere with downlink users within range, which is known as co-channel interference (CCI). This interference can be significant in cellular networks, so it has to be appropriately processed to maximize the IBFD gain.

The objective of this monograph is to present a timely overview of self-interference cancellation (SIC) techniques and to discuss the challenges and possible solutions for implementing effective SIC in 6G networks. We then investigate beamforming to manage the complex interference and maximize the IBFD gain in cellular networks. Furthermore, we give deep insight into the benefits of IBFD operation for various emerging applications, e.g., integrated access and backhaul (IAB) networks, integrated sensing and communications (ISAC), and physical layer security (PLS).

Suggested Citation: Haifeng Luo, Abhijeet Bishnu and Tharmalingam Ratnarajah (2023), "In-Band Full-Duplex Radios in 6G Networks: Implementation and Applications", Foundations and Trends® in Networking: Vol. 13: No. 1, pp 1-105. http://dx.doi.org/10.1561/1300000067
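To make the self-interference cancellation problem concrete, the toy simulation below applies a purely digital canceller: an LMS adaptive filter estimates the SI channel from the node's own (known) transmit samples and subtracts the reconstructed SI from the received signal. The four-tap channel, step size, and roughly 40 dB SI-to-SoI gap are assumed values chosen for illustration, not figures or methods taken from the monograph.

```python
# Toy digital SIC sketch (assumed parameters): cancel self-interference with an
# LMS adaptive filter and report the achieved cancellation in dB.
import numpy as np

rng = np.random.default_rng(0)
N, L, mu = 20_000, 4, 0.01            # samples, filter taps, LMS step size

x = rng.standard_normal(N)            # own transmitted baseband samples (known)
h = np.array([0.9, -0.4, 0.2, 0.05])  # unknown self-interference channel (assumed)
soi = 0.01 * rng.standard_normal(N)   # signal of interest, ~40 dB below the SI
rx = np.convolve(x, h)[:N] + soi      # received = SI + SoI

w = np.zeros(L)                       # adaptive estimate of the SI channel
residual = np.zeros(N)
for n in range(L, N):
    x_win = x[n - L + 1:n + 1][::-1]  # most recent L transmit samples
    si_hat = w @ x_win                # reconstructed self-interference
    e = rx[n] - si_hat                # residual after cancellation
    residual[n] = e
    w += mu * e * x_win               # LMS weight update

tail = slice(N // 2, N)               # measure after convergence
sic_db = 10 * np.log10(np.mean(rx[tail] ** 2) / np.mean(residual[tail] ** 2))
print(f"digital SIC achieved: {sic_db:.1f} dB")
```

In practice the digital stage is only one part of a SIC chain; the residual floor here is set by the injected SoI and the LMS misadjustment, which is why the cancellation figure stops well short of the full SI power.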
- Distributed Coding in A Multiple Access Environment (June 4, 2018 at 10:00 pm)
Abstract: With the fast expansion of communication networks and the increasingly dynamic nature of wireless communication activities, a significant proportion of messages in wireless networks are being transmitted using distributed protocols that feature opportunistic channel access without full user coordination. This challenges the basic assumption of long message transmissions among coordinated users in classical channel coding theory. In this monograph, we introduce channel coding theorems for the distributed communication model in which users choose their channel codes individually. We show that, although reliable message recovery is not always guaranteed in distributed communication systems, the notion of a fundamental limit still exists and can indeed be viewed as an extension of its classical counterpart.

Because wireline networks were developed first, network architectures tend to achieve system modularity by compromising communication and energy efficiency. Such a choice is reasonable for wireline systems but can be disastrous for wireless radio networks. Therefore, to reduce efficiency loss, large-scale communication networks often adopt wireless communication only at the last hop. Because of this special structure, architectural inefficiency in the wireless part of the network can be mitigated by enhancing the interface between the physical and data link layers. The enhanced interface, to be proposed, provides each link layer user with multiple transmission options and supports efficient distributed networking by enabling advanced communication adaptation at the data link layer. In this monograph, we focus on the introduction of distributed channel coding theory, which serves as the physical layer foundation for the enhanced physical-link layer interface. Nevertheless, early research results at the data link layer for the enhanced interface are also presented and discussed.

Suggested Citation: Yanru Tang, Faeze Heydaryan and Jie Luo (2018), "Distributed Coding in A Multiple Access Environment", Foundations and Trends® in Networking: Vol. 12: No. 4, pp 260-412. http://dx.doi.org/10.1561/1300000063
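The following sketch illustrates, in highly simplified form, what "multiple transmission options" at the physical-link interface could look like: a link layer user picks the coding-rate option that maximizes its expected throughput given only a belief about how many other users share the slot, with decoding modeled as succeeding whenever the sum rate stays below a capacity bound. The capacity value, rate options, and activity belief are assumptions made for this example, not the monograph's actual interface specification.

```python
# Simplified rate-option selection at a hypothetical physical-link interface.
CAPACITY = 2.0                         # assumed sum-rate limit of the slot (bits/use)
RATE_OPTIONS = [0.25, 0.5, 1.0, 1.5]   # transmission options exposed by the PHY (assumed)

def success_prob(my_rate, other_rate, activity_belief):
    """P(decoding succeeds) under a belief over the number of other active users."""
    return sum(
        p for k, p in activity_belief.items()
        if my_rate + k * other_rate <= CAPACITY
    )

def best_option(activity_belief, other_rate=0.5):
    # Expected throughput = rate * P(success); ties are broken toward the lower rate.
    return max(RATE_OPTIONS,
               key=lambda r: (r * success_prob(r, other_rate, activity_belief), -r))

if __name__ == "__main__":
    # Belief: one other active user with prob 0.6, two with 0.3, three with 0.1.
    belief = {1: 0.6, 2: 0.3, 3: 0.1}
    print("chosen rate:", best_option(belief))
```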
- Age of Information: A New Concept, Metric, and Tool (November 27, 2017 at 11:00 pm)
Abstract: Age of information (AoI) was introduced in the early 2010s as a notion to characterize the freshness of the knowledge a system has about a process observed remotely. AoI was shown to be a fundamentally novel metric of timeliness, significantly different from existing ones such as delay and latency. The importance of such a tool is paramount, especially in contexts other than the transport of information, since communication also takes place to control, to compute, or to infer, and not just to reproduce the messages of a source. This volume presents and discusses the first body of works on AoI and points to future directions that could yield more challenging and interesting research.

Suggested Citation: Antzela Kosta, Nikolaos Pappas and Vangelis Angelakis (2017), "Age of Information: A New Concept, Metric, and Tool", Foundations and Trends® in Networking: Vol. 12: No. 3, pp 162-259. http://dx.doi.org/10.1561/1300000060
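Since AoI is commonly defined as the instantaneous age Δ(t) = t − u(t), where u(t) is the generation time of the freshest update delivered so far, a short numerical sketch may help: the function below integrates the resulting sawtooth to obtain the time-average AoI for a given sequence of (generation time, delivery time) pairs. The update times in the example are invented for illustration and are not taken from the volume.

```python
# Time-average AoI from the sawtooth age process (illustrative sketch).
def average_aoi(updates, t_end, initial_age=0.0):
    """updates: list of (generation_time, delivery_time) pairs, sorted by delivery time."""
    area, t_prev, age_prev = 0.0, 0.0, initial_age
    for gen, deliv in updates:
        # Age grows linearly until the delivery instant ...
        age_at_deliv = age_prev + (deliv - t_prev)
        area += (age_prev + age_at_deliv) / 2 * (deliv - t_prev)  # trapezoid area
        # ... then drops to the age of the just-delivered update.
        t_prev, age_prev = deliv, deliv - gen
    age_end = age_prev + (t_end - t_prev)                         # tail segment
    area += (age_prev + age_end) / 2 * (t_end - t_prev)
    return area / t_end

if __name__ == "__main__":
    # Hypothetical updates generated at t = 0, 2, 5 and delivered at t = 1, 4, 6.
    print(average_aoi([(0, 1), (2, 4), (5, 6)], t_end=8.0))       # prints 2.25
```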