Wireless networks are the backbone of modern connectivity, serving billions of 5G users through millions of cell sites globally. The opportunities and benefits of AI-RAN are driving the transformation of telecom networks and ecosystems toward AI-native wireless networks. The mission is to create an intelligent network fabric that connects hundreds of billions of AI-powered endpoints such as…
The wireless industry stands at the brink of a transformation, driven by the fusion of AI with advanced 5G and upcoming 6G technologies that promise unparalleled speeds, ultra-low latency, and seamless connectivity for billions of AI-powered endpoints. 6G specifically will be AI-native, enabling integrated sensing and communications, supporting immersive technologies like extended reality and…
Discover how telcos use AI to transform networks and customer experiences, and to open new business opportunities.
NVIDIA 6G Developer Day 2024 brought together members of the 6G research and development community to share insights and learn new ways of engaging with NVIDIA 6G research tools. More than 1,300 academic and industry researchers from across the world attended the virtual event. It featured presentations from NVIDIA, ETH Zürich, Keysight, Northeastern University, Samsung, SoftBank…
AI is transforming industries, enterprises, and consumer experiences in new ways. Generative AI models are moving toward reasoning, agentic AI is enabling new outcome-oriented workflows, and physical AI is enabling endpoints like cameras, robots, drones, and cars to make decisions and interact in real time. The common glue between all these use cases is the need for pervasive, reliable…
5G global connections numbered nearly 2 billion earlier this year, and are projected to reach 7.7 billion by 2028. While 5G has delivered faster speeds, higher capacity, and improved latency, particularly for video and data traffic, the initial promise of creating new revenues for network operators has remained elusive. Most mobile applications are now routed to the cloud. At the same time…
Inferencing for generative AI and AI agents will drive the need for AI compute infrastructure to be distributed from edge to central clouds. IDC predicts that "Business AI (consumer excluded) will contribute $19.9 trillion to the global economy and account for 3.5% of GDP by 2030." 5G networks must also evolve to serve this new incoming AI traffic. At the same time, there is an opportunity…
The journey to 6G has begun, offering opportunities to deliver a network infrastructure that is performant, efficient, resilient, and adaptable. 6G networks will be significantly more complex than their predecessors and will rely on a variety of new technologies, especially AI and machine learning (ML). To advance these new technologies and optimize network performance and efficiency…
Today's 5G New Radio (5G NR) wireless communication systems rely on highly optimized signal processing algorithms to reconstruct transmitted messages from noisy channel observations in mere microseconds. This remarkable achievement is the result of decades of relentless effort by telecommunications engineers and researchers, who have continuously improved signal processing algorithms to meet the…
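One of the classical building blocks the post alludes to is pilot-based channel estimation. The sketch below is a deliberately simplified, hypothetical illustration (all values and names are invented for this example): a least-squares estimate of a single flat-fading channel tap from known QPSK pilots, nothing like the optimized kernels a real 5G NR receiver runs.

```python
import numpy as np

# Illustrative only: least-squares channel estimation from known pilots.
rng = np.random.default_rng(0)

pilots = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # known QPSK pilots
h_true = 0.8 * np.exp(1j * 0.3)                                     # flat channel tap
noise = 0.01 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))

received = h_true * pilots + noise

# Least-squares estimate: average the per-pilot ratios y/x
h_est = np.mean(received / pilots)

print(abs(h_est - h_true) < 0.05)  # the estimate lands close to the true tap
```

In a real receiver this step runs per subcarrier over an OFDM resource grid and feeds the equalizer; here it is collapsed to one tap to keep the idea visible.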
Wireless networks are the essential infrastructure that enables seamless connectivity. To ensure the best performance, whether in a single building or a whole city, such networks are optimized using radio maps, which show, among other information, the received signal strength across a large area. Until now, the creation of radio maps was believed to be time-consuming, limiting the utility of…
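To make the notion of a radio map concrete, here is a toy sketch (not the method the post describes): a received-signal-strength grid around a single transmitter using the free-space path-loss model, FSPL(dB) = 20·log10(d) + 20·log10(f) − 147.55. Carrier frequency, transmit power, and grid extent are arbitrary illustrative choices.

```python
import numpy as np

# Toy radio map: free-space path loss on a grid around a transmitter at the origin.
f_hz = 3.5e9                       # 3.5 GHz carrier (mid-band 5G, illustrative)
tx_dbm = 30.0                      # transmit power in dBm
x = np.linspace(-500, 500, 101)    # metres
y = np.linspace(-500, 500, 101)
xx, yy = np.meshgrid(x, y)
d = np.maximum(np.hypot(xx, yy), 1.0)  # distance to transmitter, clipped at 1 m

fspl_db = 20 * np.log10(d) + 20 * np.log10(f_hz) - 147.55
radio_map = tx_dbm - fspl_db       # received signal strength in dBm

# Signal strength decays with distance: grid centre beats the far corner
print(radio_map[50, 50] > radio_map[0, 0])
```

Real radio maps additionally account for buildings, reflections, and diffraction, which is exactly why producing them has traditionally been slow.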
The pace of 6G research and development is picking up as the 5G era crosses the midpoint of the decade-long cellular generation time frame. In this blog post, we highlight how NVIDIA is playing an active role in the emerging 6G field, enabling innovation and fostering collaboration in the industry. NVIDIA is not only delivering AI-native 6G tools but is also working with partners and…
Aerial CUDA-Accelerated radio access network (RAN) enables acceleration of telco workloads, delivering new levels of spectral efficiency (SE) on a cloud-native accelerated computing platform, using CPU, GPU, and DPU. NVIDIA MGX GH200 for Aerial is built on the state-of-the-art NVIDIA Grace Hopper Superchips and NVIDIA BlueField-3 DPUs. It is designed to accelerate 5G wireless networks end-to…
6G will make the telco network AI-native for the first time. To develop 6G technologies, the telecom industry needs a whole new approach to research. The world of wireless communication is on the verge of a major transformation with the advent of 6G technology. 6G, the upcoming sixth-generation wireless network, is expected to provide extremely high-performance interconnections…
Hear from Amdocs, Indosat, KT, NTT, ServiceNow, Singtel, SoftBank, and Verizon, plus a special address from NVIDIA at GTC. Explore AI transforming customer service, network operations, sovereign AI factories, and AI-RAN.
In the 3GPP fifth-generation (5G) cellular standard, layer 1 (L1), or the physical layer (PHY), is the most compute-intensive part of the radio access network (RAN) workload. It involves some of the most complex mathematical operations with sophisticated algorithms like channel estimation and equalization, modulation/demodulation, and forward error correction (FEC). These functions require high compute…
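Of the PHY functions named above, modulation and demodulation are the easiest to show in a few lines. The sketch below is a minimal, hypothetical example (function names are invented): Gray-mapped QPSK modulation followed by hard-decision demodulation, verified with a noiseless round trip.

```python
import numpy as np

def qpsk_modulate(bits):
    """Map bit pairs to Gray-coded QPSK symbols with unit average power."""
    b = np.asarray(bits).reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def qpsk_demodulate(symbols):
    """Hard-decision demapping: sign of real/imaginary part recovers each bit."""
    s = np.asarray(symbols)
    bits = np.empty((s.size, 2), dtype=int)
    bits[:, 0] = (s.real < 0).astype(int)
    bits[:, 1] = (s.imag < 0).astype(int)
    return bits.ravel()

tx_bits = np.array([0, 0, 0, 1, 1, 0, 1, 1])
symbols = qpsk_modulate(tx_bits)
rx_bits = qpsk_demodulate(symbols)       # noiseless round trip
print(np.array_equal(tx_bits, rx_bits))  # True
```

Production L1 stacks use soft-decision (log-likelihood ratio) demapping feeding the FEC decoder, which is far more compute-intensive than this hard-decision toy.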
NVIDIA, working with Fujitsu and Wind River, has enabled NTT DOCOMO to launch the first GPU-accelerated commercial Open RAN 5G service in its network in Japan. This makes it the first-ever telco in the world to deploy a GPU-accelerated commercial 5G network. The announcement is a major milestone as the telecom industry strives to address the multi-billion-dollar problem of driving…
NVIDIA is driving fast-paced innovation in 5G software and hardware across the ecosystem with its OpenRAN-compatible 5G portfolio. Accelerated computing hardware and NVIDIA Aerial 5G software are delivering solutions for key industry stakeholders such as telcos, cloud service providers (CSPs), enterprises, and academic researchers. TMC recently named the NVIDIA MGX with NVIDIA Grace Hopper…
Wireless technology has evolved rapidly, and 5G deployments have made good progress around the world. Until recently, wireless RAN was deployed using closed-box appliance solutions from traditional RAN vendors. This closed-box approach is not scalable, underuses the infrastructure, and does not deliver optimal RAN TCO. We have come to realize that such closed-box…
The pace of 5G investment and adoption is accelerating. According to the GSMA Mobile Economy 2023 report, nearly $1.4 trillion will be spent on 5G CapEx between 2023 and 2030. The radio access network (RAN) may account for over 60% of that spend. Increasingly, CapEx spend is moving from the traditional approach with proprietary hardware to virtualized RAN (vRAN) and Open RAN architectures…
At this year's Mobile World Congress (MWC), NVIDIA showcased a neural receiver for a 5G New Radio (NR) uplink multi-user MIMO scenario, which could be seen as a blueprint for possible 6G physical-layer architectures. For the first time, NVIDIA demonstrated a research prototype of a trainable neural network-based receiver that learns to replace significant parts of the classical physical…
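The neural-receiver idea can be caricatured in a few lines. The sketch below is a toy stand-in, not NVIDIA's prototype: a single logistic neuron trained by plain gradient descent learns the BPSK bit decision from noisy received samples, i.e., it learns what a classical demapper would compute. All hyperparameters are arbitrary illustrative choices.

```python
import numpy as np

# Toy "learned demapper": logistic regression on noisy BPSK samples.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 2000)
x = (1 - 2 * bits) + 0.3 * rng.standard_normal(2000)  # BPSK through AWGN

w, b = 0.0, 0.0
for _ in range(200):                        # plain gradient descent on logistic loss
    p = 1 / (1 + np.exp(-(w * x + b)))      # predicted P(bit = 1)
    w -= 1.0 * np.mean((p - bits) * x)
    b -= 1.0 * np.mean(p - bits)

pred = (1 / (1 + np.exp(-(w * x + b))) > 0.5).astype(int)
accuracy = np.mean(pred == bits)
print(accuracy > 0.95)  # the neuron recovers the classical sign decision
```

The actual research prototype replaces whole chains of blocks (channel estimation, equalization, demapping) with a deep network trained end to end over realistic MIMO channels; the shared principle is simply that the decision rule is learned from data rather than hand-derived.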
NVIDIA introduced Aerial Research Cloud, the first fully programmable 5G and 6G network research sandbox, which enables researchers to rapidly simulate, prototype, and benchmark innovative new software deployed through over-the-air networks. The platform democratizes 6G innovations with a full-stack, C-programmable 5G network, and jumpstarts ML in advanced wireless communications using NVIDIA…
NVIDIA BlueField-3 data processing units (DPUs) are now in full production, and have been selected by Oracle Cloud Infrastructure (OCI) to achieve higher performance, better efficiency, and stronger security, as announced at NVIDIA GTC 2023. As a 400 Gb/s infrastructure compute platform, BlueField-3 enables organizations to deploy and operate data centers at massive scale.
Applications are increasingly real-time and delay-sensitive, from distributed databases and 5G radio access networks (RANs) to gaming, video streaming, high-performance computing (HPC), and the metaverse. Nanosecond-resolution time sync augments conventional computing in many ways. Through a series of collaborations, NVIDIA, Meta, and others in the Open Compute Project Time…
5G deployments have been accelerating around the globe. Many telecom operators have already rolled out 5G services and are expanding rapidly. In addition to the telecom operators, there is significant interest among enterprises in using 5G to set up private networks leveraging higher bandwidth, lower latency, network slicing, mmWave, and CBRS spectrum. The 5G ramp comes at an interesting…
The telecom industry has been pivotal in driving digital transformation across society. For over a century, and from fixed to mobile communications, the industry has incubated and nurtured the technologies that provide the connectivity fabric for people across the globe. In the 5G era, this pivotal role now includes providing untethered and ubiquitous high-speed data connectivity for a multitude…
The role of artificial intelligence (AI) in boosting performance and energy efficiency in cellular network operations is rapidly becoming clear. This is especially the case for radio access networks (RANs), which account for over 60% of industry costs. This post explains how AI is transforming the 5G RAN, improving energy and cost efficiency while supporting better use of RAN computing…
There is an ongoing demand for servers with the ability to transfer data from the network to a GPU at ever faster speeds. As AI models keep getting bigger, the sheer volume of data needed for training requires techniques such as multinode training to achieve results in a reasonable timeframe. Signal processing for 5G is more sophisticated than previous generations, and GPUs can help increase the…
The current distribution of extended reality (XR) experiences is limited to desktop setups and local workstations, which contain the high-end GPUs necessary to meet computing requirements. For XR solutions to scale past their currently limited user base and support higher-end functionality such as AI services integration and on-demand collaboration, we need a purpose-built platform.
The modern data center is becoming increasingly difficult to manage. There are billions of possible connection paths between applications and petabytes of log data. Static rules are insufficient to enforce security policies for dynamic microservices, and the sheer magnitude of log data is impossible for any human to analyze. AI provides the only path to the secure and self-managed data…
NVIDIA Fleet Command is a cloud service that securely deploys, manages, and scales AI applications across distributed edge infrastructure. Since Fleet Command launched in July, several significant milestones have been achieved and are showcased at NVIDIA GTC. New features are constantly added to Fleet Command. In addition to making the platform more robust and secure…
Extended reality (XR) has changed the way we enjoy entertainment, interact with friends, and get our jobs done. NVIDIA continues to be at the forefront of delivering groundbreaking solutions for augmented reality (AR) and virtual reality (VR), and you can experience it all at NVIDIA GTC, which runs from November 8–11. This year, GTC features several sessions covering topics across…
This post was originally published on the Mellanox blog. Wireless carriers have been hyping the next-generation cellular technology of 5G for years, but the reality of it is certain to start rolling out this year. Wireless networks are always evolving, but this is more than a cellular upgrade. 5G not only increases speeds but also offers enhancements in latency. This vastly improves responsiveness…
SoftBank is a global technology player that aspires to drive the Information Revolution. The company operates in broadband, fixed-line telecommunications, ecommerce, information technology, finance, media, and marketing. To improve their users' communication experience and overcome 5G capacity and coverage issues, SoftBank has used NVIDIA Maxine GPU-accelerated SDKs with state-of-the-art AI…
Edge computing has been around for a long time, but has recently become a hot topic because of the convergence of three major trends: IoT, 5G, and AI. IoT devices are becoming smarter and more capable, increasing the breadth of applications that can be deployed on them and the environments they can be deployed in. Simultaneously, recent advancements in 5G capabilities give confidence that…
Many people believed delivering extended reality (XR) experiences from cloud computing systems was impossible until now. Join our webinar to learn how NVIDIA CloudXR can be used to deliver limitless virtual and augmented reality over networks (including 5G) to low-cost, low-powered headsets and devices, while maintaining the high-quality experience traditionally reserved for high-end headsets that…
Imagine a future where ultra-high-fidelity simulation and training applications are deployed over any network topology from a centralized secure cloud or on-premises infrastructure. Imagine that you can stream graphical training content from the data center to remote end devices ranging from a single flat screen or synchronized displays to AR/VR/MR head-mounted displays.
The nexus of 5G, IoT, and edge computing is turbocharging network performance. NVIDIA is working with the world's leading telecommunications companies to build software-defined infrastructure that can meet the demand for real-time data processing at the edge for a variety of smart services. Today, we announced the first NVIDIA Aerial Developer Kit. Designed to jump-start the execution of 5G…
For the past decade and a half, an increasing number of businesses have moved their traditional IT applications from on-premises to public clouds. This first phase of the revolution, which you could call "enterprise cloud transformation," has allowed enterprises to reap the benefits of the scale, expertise, and flexibility of the cloud. The second phase of this revolution, the "edge AI cloud…
Over the last five years, compute and storage technologies have achieved substantial performance increases. At the same time, they've been hampered by PCI Express Gen3 (PCIe Gen3) bandwidth limitations. AMD is the first x86 processor company to release support for the PCIe fourth-generation bus (PCIe Gen4) with the AMD EPYC 7002 Series processor. This is the second-generation AMD EPYC processor…
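The Gen3-to-Gen4 jump is easy to quantify: usable per-lane throughput is the raw transfer rate times the 128b/130b encoding efficiency both generations share. The back-of-the-envelope check below shows the familiar doubling for an x16 link (the helper function name is ours, not a real API).

```python
def pcie_bandwidth_gbs(gt_per_s, lanes):
    """Usable link bandwidth in GB/s for PCIe Gen3/Gen4 (128b/130b encoding)."""
    return gt_per_s * (128 / 130) * lanes / 8  # GT/s per lane -> GB/s per link

gen3_x16 = pcie_bandwidth_gbs(8.0, 16)   # PCIe Gen3: 8 GT/s per lane
gen4_x16 = pcie_bandwidth_gbs(16.0, 16)  # PCIe Gen4: 16 GT/s per lane
print(round(gen3_x16, 2), round(gen4_x16, 2))  # ~15.75 GB/s vs ~31.51 GB/s
```

That extra ~15.75 GB/s per x16 slot is what lets 200 Gb/s NICs and modern GPUs run without the link itself becoming the bottleneck.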
5G CloudRAN is the cloud-native architecture that supports PHY-layer processing for high-speed, low-latency, software-defined network applications. The Aerial SDK provides libraries and functions to implement L1 processing with LDPC optimization and other features. With O-RAN fronthaul and Aerial SDK PHY capabilities, seamless packet flow can be achieved from edge application to COTS…
NVIDIA Mellanox 5T for 5G technology provides a real-time and high-performance solution for building an efficient, time-synchronized CloudRAN infrastructure. Achieving time synchronization and high time accuracy for network traffic between O-RAN 7.2x-compliant fronthaul, midhaul, and backhaul components in a cloud-native RAN (CloudRAN) environment has always been a challenge. Further…
Fifth-generation networks (5G) are ushering in a new era in wireless communications that delivers 1000X the bandwidth and 100X the speed at 1/10th the latency of 4G. 5G also allows for millions of connected devices per square km and is being deployed as an alternative to WiFi at edge locations like factories and retail stores. These applications demand a new network architecture that is fully…
The article below is a guest post by Deepwave Digital, a technology company working to incorporate AI into radio frequency and wireless technology. In the post below, Deepwave engineers describe how they developed the first deep learning-based sensor for a 5G network. By John D. Ferguson, Peter Witkowski, William Kirschner, Daniel Bryant. Deepwave Digital is using the Artificial…
The next-generation 5G New Radio (NR) cellular technology design supports extremely diverse use cases, such as broadband human-oriented communications and time-sensitive applications with ultra-low latency. 5G NR operates on a very broad frequency spectrum (from sub-GHz to 100 GHz). NR employs multiple different OFDM numerologies in the physical air interface to enable such diversity…
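The numerologies mentioned above follow a simple scaling rule from 3GPP TS 38.211: subcarrier spacing is 15 kHz × 2^μ, and the slot duration shrinks by the same factor, so each 1 ms subframe holds 2^μ slots. A small sketch (helper name is ours) tabulates the defined values:

```python
def numerology(mu):
    """5G NR numerology mu -> (subcarrier spacing in kHz, slot duration in ms)."""
    scs_khz = 15 * 2 ** mu               # subcarrier spacing scales as 2**mu
    slot_ms = 1.0 / 2 ** mu              # 2**mu slots per 1 ms subframe
    return scs_khz, slot_ms

for mu in range(5):
    scs, slot = numerology(mu)
    print(f"mu={mu}: {scs} kHz SCS, {slot} ms slot")
```

This single parameter is how NR serves both sub-GHz broadband (small spacing, long symbols that tolerate delay spread) and mmWave, latency-critical links (wide spacing, very short slots) within one air interface.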
]]>