Generative Diffusion Models for Network Optimization
Special Presentation by Dr. Mérouane Debbah (Khalifa University, UAE)
Hosted by the Future Networks Artificial Intelligence & Machine Learning (AIML) Working Group

Date/Time: Thursday, January 16, 2025, 12:00 UTC
Topic: Generative Diffusion Models for Network Optimization
Virtual: https://events.vtools.ieee.org/m/453702

Abstract: Network optimization is a fundamental challenge in Internet-of-Things (IoT) networks, which are often characterized by complex features that make these problems difficult to solve. Recently, generative diffusion models (GDMs) have emerged as a promising new approach to network optimization, with the potential to address these optimization problems directly. However, the application of GDMs in this field is still in its early stages, and there is a noticeable lack of theoretical research and empirical findings. In this study, we first explore the intrinsic characteristics of generative models. Next, we provide a concise theoretical proof and an intuitive demonstration of the advantages of generative models over discriminative models in network optimization. Building on this exploration, we implement GDMs as optimizers that learn high-quality solution distributions for given inputs and sample from these distributions during inference to approximate or achieve optimal solutions. Specifically, we utilize denoising diffusion probabilistic models (DDPMs) and employ a classifier-free guidance mechanism to condition generation on the input parameters. We conduct extensive experiments across three challenging network optimization problems. By investigating various model configurations and the principles of GDMs as optimizers, we demonstrate the ability to overcome prediction errors and validate the convergence of generated solutions to optimal solutions.

Speaker: Dr. Mérouane Debbah is a Professor at Khalifa University of Science and Technology in Abu Dhabi and founding Director of the KU 6G Research Center. He is a frequent keynote speaker at international events in the fields of telecommunications and AI. His research lies at the interface of fundamental mathematics, algorithms, statistics, and information and communication sciences, with a special focus on random matrix theory and learning algorithms. In the communications field, he has been at the heart of the development of small cells (4G), Massive MIMO (5G), and Large Intelligent Surfaces (6G). In the AI field, he is known for his work on Large Language Models, distributed AI systems for networks, and semantic communications. He has received multiple prestigious distinctions, prizes, and best paper awards (more than 40 IEEE best paper awards) for his contributions to both fields, and according to research.com he is ranked as the best scientist in France in the field of Electronics and Electrical Engineering. He is an IEEE Fellow, a WWRF Fellow, a EURASIP Fellow, an AAIA Fellow, an Institut Louis Bachelier Fellow, an AIIA Fellow, and a Membre émérite of the SEE. He chairs the IEEE Large Generative AI Models in Telecom (GenAINet) Emerging Technology Initiative and is a member of the Marconi Prize Selection Advisory Committee.

Co-sponsored by: Artificial Intelligence & Machine Learning (AIML) Working Group
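For readers who want a concrete picture of the approach summarized in the abstract, the sketch below shows how a conditional DDPM with classifier-free guidance can be set up as a solution generator for an optimization problem. It is a minimal illustration, not the speaker's implementation: the network (SolutionDenoiser), the noise schedule, the guidance weight, and the zero vector used as the "unconditional" token are all assumptions, and a real system would train on pairs of problem parameters and high-quality solutions obtained from a conventional solver.

```python
# Minimal sketch (not the speaker's code) of a DDPM whose denoiser is
# conditioned on problem parameters and sampled with classifier-free
# guidance, so generated vectors approximate good solutions. All names and
# hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

T = 200                                    # number of diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)      # linear noise schedule
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)  # cumulative product \bar{alpha}_t

class SolutionDenoiser(nn.Module):
    """Predicts the noise added to a candidate solution x_t, given the
    timestep t and the problem parameters c (e.g. channel gains, demands)."""
    def __init__(self, sol_dim, cond_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(sol_dim + cond_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, sol_dim),
        )
    def forward(self, x_t, t, c):
        t_feat = (t.float() / T).unsqueeze(-1)   # simple scalar time embedding
        return self.net(torch.cat([x_t, c, t_feat], dim=-1))

def train_step(model, opt, x0, c, p_drop=0.1):
    """One DDPM training step with classifier-free guidance: the condition c
    is randomly replaced by zeros so the same network also learns the
    unconditional noise predictor."""
    b = x0.shape[0]
    t = torch.randint(0, T, (b,))
    noise = torch.randn_like(x0)
    a_bar = alpha_bars[t].unsqueeze(-1)
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise   # forward diffusion
    c_used = torch.where(torch.rand(b, 1) < p_drop, torch.zeros_like(c), c)
    loss = ((model(x_t, t, c_used) - noise) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

@torch.no_grad()
def sample(model, c, sol_dim, guidance_weight=2.0):
    """Reverse diffusion with classifier-free guidance:
    eps = eps_uncond + w * (eps_cond - eps_uncond)."""
    x = torch.randn(c.shape[0], sol_dim)
    for i in reversed(range(T)):
        t = torch.full((c.shape[0],), i, dtype=torch.long)
        eps_c = model(x, t, c)
        eps_u = model(x, t, torch.zeros_like(c))
        eps = eps_u + guidance_weight * (eps_c - eps_u)
        a, a_bar = alphas[i], alpha_bars[i]
        mean = (x - (1 - a) / (1 - a_bar).sqrt() * eps) / a.sqrt()
        x = mean + betas[i].sqrt() * torch.randn_like(x) if i > 0 else mean
    return x  # candidate solution(s) for the problem instance(s) c
```

In this reading of the abstract, training data would consist of (problem parameters, near-optimal solution) pairs, and at inference time the sampler is run for a new set of parameters to draw candidate solutions from the learned distribution; the guidance weight trades off diversity against fidelity to the conditioning input.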