To monitor the traffic environment effectively, autonomous vehicles rely on the collaboration of several perception hardware components. Because their functions differ, these components receive very different levels of attention within the autonomous driving industry. Cameras and LiDAR, as the primary perception hardware, have long been the focal points of industry debate. In contrast, sensors like millimeter-wave radar and ultrasonic radar typically play supporting roles in making autonomous driving work. So, what exactly is the role of millimeter-wave radar in autonomous driving?

What is Millimeter-Wave Radar?
Millimeter-wave radar, as the name implies, is a sensor that uses radio waves with millimeter-scale wavelengths to "see" objects on and around the road. The common automotive operating bands are 24GHz (historically) and the now-mainstream 77–79GHz. The radar emits a controlled series of electromagnetic waves, typically a frequency-modulated continuous wave (FMCW), known in engineering as a "chirp." When these waves reflect off objects and return, the radar processes the frequency, phase, and time differences between the echo and the emitted signal, converting them into readable data. From these, the target's distance, relative speed (from the Doppler shift), and direction (angle of arrival, determined from phase differences across the antenna array or by multi-antenna synthesis) can be estimated. This pipeline, from transmission through digital signal processing to final output, gives radar inherent advantages in detection range, speed accuracy, and resilience to harsh weather.
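For readers who want the arithmetic, the textbook FMCW relations behind those three quantities can be written compactly as follows (a sketch in standard notation; the symbols are generic and not taken from any particular radar's documentation):

```latex
% Chirp slope S = B / T_c (bandwidth B swept over chirp duration T_c)
R      = \frac{c \, f_b}{2S}                                    % range from beat frequency f_b
v      = \frac{\lambda \, f_d}{2}                               % radial velocity from Doppler shift f_d
\theta = \arcsin\!\left(\frac{\lambda \, \Delta\varphi}{2\pi d}\right)
                                                                % angle of arrival from the phase
                                                                % difference across antennas spaced d apart
```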
How Does It Measure Distance and Speed?
In simple terms, millimeter-wave radar sweeps its frequency uniformly from low to high, creating a chirp. When the echo returns, the received signal differs in frequency from the signal currently being emitted. This "beat frequency" is proportional to the round-trip propagation time, which can be converted into distance. If the target is moving, the frequency of the reflected wave shifts due to the Doppler effect, directly providing the radial velocity (the velocity component along the radar's line of sight). By combining multiple chirps with the spatial layout of the antennas (multiple transmitters and receivers), the radar can use Fourier transforms to convert time and frequency information into distance and velocity axes. Combined with array signal processing to estimate angles, this yields a three-dimensional map over distance, angle, and velocity (sometimes described as the raw data of "4D radar" or "imaging radar"). Many automotive radars now integrate the RF front end and digital processing on a single chip, making them both inexpensive and compact.
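To make this concrete, here is a minimal NumPy sketch of the standard two-FFT (range, then Doppler) processing chain applied to a simulated single-target beat signal. All parameters (77GHz carrier, 1GHz sweep, 256 samples per chirp, 128 chirps) are illustrative assumptions rather than values from any specific automotive radar.

```python
# Minimal range-Doppler sketch with NumPy. Parameters are illustrative only.
import numpy as np

c = 3e8                       # speed of light (m/s)
fc = 77e9                     # carrier frequency (Hz)
B = 1e9                       # swept bandwidth per chirp (Hz)
Tc = 50e-6                    # chirp duration (s)
S = B / Tc                    # chirp slope (Hz/s)
n_samples, n_chirps = 256, 128
fs = n_samples / Tc           # ADC sample rate (Hz)
lam = c / fc                  # wavelength (m)

# Simulate the de-chirped (beat) signal for one point target.
R, v = 30.0, -8.0             # target 30 m away, closing at 8 m/s
t = np.arange(n_samples) / fs
beat = np.zeros((n_chirps, n_samples), dtype=complex)
for k in range(n_chirps):
    r_k = R + v * k * Tc                  # target range at the start of chirp k
    f_beat = 2 * S * r_k / c              # beat frequency encodes range
    phase = 4 * np.pi * r_k / lam         # chirp-to-chirp phase encodes Doppler
    beat[k] = np.exp(1j * (2 * np.pi * f_beat * t + phase))

# Range FFT along fast time, Doppler FFT along slow time (chirp index).
range_fft = np.fft.fft(beat, axis=1)
rd_map = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)

# Convert the strongest cell back to physical units.
dop_bin, rng_bin = np.unravel_index(np.argmax(np.abs(rd_map)), rd_map.shape)
rng_est = rng_bin * c / (2 * B)                                   # range bin size = c / (2B)
vel_est = (dop_bin - n_chirps // 2) * lam / (2 * n_chirps * Tc)   # velocity bin size
print(f"estimated range ~{rng_est:.2f} m, radial velocity ~{vel_est:.2f} m/s")
```

Run as-is, it should report roughly 30 m and about -8 m/s, recovering the simulated target from nothing but the beat signal.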
What Kind of Data Does Millimeter-Wave Radar Produce?
Unlike LiDAR, which provides a dense point cloud, or cameras, which produce color images, traditional millimeter-wave radar outputs a relatively sparse set of "hot spots" or echo clusters. Each echo carries distance, radial velocity, and echo intensity, and the angle of arrival can sometimes be estimated from phase information. Early radars output discrete target lists (tracks), while some modern imaging radars produce heatmaps or dense range-velocity maps; overall resolution (especially in azimuth and cross-range) still lags behind LiDAR. Millimeter-wave radar also has a distinctive output called "micro-Doppler," which reflects internal vibration or rotation of a target, such as a pedestrian's arm swing or a wheel's rotation. This is highly useful for distinguishing target types.
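The sketch below shows a minimal, hypothetical representation of what such a classical radar exposes each measurement cycle: a short list of detections rather than a dense point cloud. The field names and example values are illustrative, not a real sensor interface.

```python
# Hypothetical per-cycle output of a classical automotive radar: a detection list.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RadarDetection:
    range_m: float              # radial distance to the reflection point
    azimuth_rad: float          # angle of arrival, estimated from array phase differences
    radial_velocity_mps: float  # Doppler-derived velocity along the line of sight
    rcs_dbsm: float             # echo strength (radar cross-section estimate)

@dataclass
class RadarFrame:
    timestamp_s: float
    detections: List[RadarDetection] = field(default_factory=list)

# A classical radar frame may hold only a few dozen detections, e.g. one strong
# return from the car ahead and a couple of guardrail reflections.
frame = RadarFrame(
    timestamp_s=0.05,
    detections=[
        RadarDetection(range_m=42.3, azimuth_rad=0.01, radial_velocity_mps=-3.2, rcs_dbsm=12.0),
        RadarDetection(range_m=18.7, azimuth_rad=0.35, radial_velocity_mps=0.0, rcs_dbsm=-2.5),
    ],
)
```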
The distance resolution of millimeter-wave radar is directly related to its available bandwidth; wider bandwidth allows clearer separation of closely spaced targets. Shorter wavelengths (higher frequencies) allow smaller antenna arrays with higher angular resolution. This is why the industry has shifted from 24GHz to 77/79GHz: the 77GHz band offers wider permitted bandwidth, improving distance resolution, and enables more compact antenna layouts (making it easier to integrate many transmit-receive channels on a vehicle for high angular resolution). Many mainstream semiconductor manufacturers have made 77–79GHz their primary product line, with automotive-grade single-chip radars being introduced, driving radar adoption in vehicles.
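The back-of-the-envelope numbers behind that shift are easy to check. In the sketch below, the bandwidth figures are typical, approximate allocations quoted for illustration; exact limits depend on the applicable regulations.

```python
# Rough resolution arithmetic behind the move from 24GHz to 77/79GHz.
import math

C = 3e8  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest range separation at which two targets fall into distinct bins."""
    return C / (2 * bandwidth_hz)

def angular_resolution_deg(freq_hz: float, aperture_m: float) -> float:
    """Rough Rayleigh-style beamwidth for a given physical or virtual aperture."""
    wavelength = C / freq_hz
    return math.degrees(wavelength / aperture_m)

print(range_resolution_m(200e6))          # ~0.75 m with a ~200 MHz sweep (typical at 24GHz)
print(range_resolution_m(4e9))            # ~0.04 m with up to ~4 GHz available around 77-81GHz
print(angular_resolution_deg(77e9, 0.10)) # ~2.2 deg for a ~10 cm (virtual) aperture at 77GHz
```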

The Role, Strengths, and Weaknesses of Millimeter-Wave Radar
What Roles Does Radar Play in the Autonomous Driving Industry?
Its most fundamental function is to act as a "goalkeeper" for distance and speed, supporting features like adaptive cruise control, blind spot monitoring, lane change assistance, and collision warning. Millimeter-wave radar can directly and reliably provide the relative speed and distance of vehicles ahead, to the side, and behind, especially in conditions where vision is degraded, such as rain, snow, fog, and nighttime. It also has more advanced applications, such as improving resolution through larger arrays and advanced processing (e.g., MIMO synthetic apertures, time-frequency imaging, phase compensation) to generate richer representations that support target classification and pose estimation. This is why "imaging radar" or "4D radar" has entered the conversation in recent years, with the aim of turning radar from a coarse-grained motion sensor into one capable of semantic perception.
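To make the "goalkeeper" role concrete, below is a deliberately simplified sketch of a constant-time-gap following policy driven only by the radar's range and range-rate measurements. The gains, limits, and time gap are invented for illustration; a production ACC adds filtering, target selection, actuator constraints, and safety logic on top.

```python
# Toy constant-time-gap follower driven by radar range and range rate.
# Gains, limits, and the time gap are invented for illustration only.

def acc_acceleration(range_m: float, range_rate_mps: float, ego_speed_mps: float,
                     time_gap_s: float = 1.8, standstill_m: float = 5.0,
                     k_gap: float = 0.25, k_rate: float = 0.6) -> float:
    """Return a commanded acceleration (m/s^2), clipped to a comfortable band."""
    desired_gap = standstill_m + time_gap_s * ego_speed_mps
    gap_error = range_m - desired_gap          # positive -> we are further back than desired
    accel = k_gap * gap_error + k_rate * range_rate_mps
    return max(-3.5, min(2.0, accel))          # clip to comfort/safety limits

# Example: lead vehicle 40 m ahead, closing at 2 m/s, ego travelling at 25 m/s (90 km/h).
print(acc_acceleration(range_m=40.0, range_rate_mps=-2.0, ego_speed_mps=25.0))  # -> braking
```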
What Are Radar's Strengths?
Its resilience to harsh weather is a key advantage. Electromagnetic waves in the millimeter-wave band penetrate rain, snow, and some dust better than visible light. LiDAR and cameras degrade significantly in heavy rain, snow, dust storms, backlighting, or nighttime, while millimeter-wave radar usually remains stable. Additionally, millimeter-wave radar can "natively" provide speed (Doppler) information, which is difficult for cameras alone (though optical flow can estimate speed, it is greatly affected by occlusion and texture). Millimeter-wave radar is also relatively cost-effective, has a long lifespan, and is easy to conceal and protect (can be installed behind bumpers), all of which are important engineering attributes for mass-produced vehicles.
Given These Strengths, Why Has Millimeter-Wave Radar Remained in the Second Tier of Autonomous Driving Perception Hardware?
Its weaknesses are also apparent and limit its ability to handle all autonomous driving perception tasks alone. The core issue lies in resolution and semantic capability. The wavelength and available bandwidth of millimeter-wave radar determine its angular and range resolution, and traditional radar struggles to distinguish small, adjacent targets (e.g., a small plastic bag versus a small stone in the middle of the road). Reflection intensity is also strongly affected by the target's material, shape, and angle of incidence, making weak reflectors such as plastic and cloth difficult to detect. In contrast, LiDAR provides denser and more intuitive geometric point clouds, while cameras offer rich color and texture for semantic understanding (e.g., pedestrians, traffic signs, curbs, lane lines). Furthermore, multipath and sidelobe effects can create ghost targets, leading to false alarms or missed detections unless the signal processing and algorithms are sufficiently sophisticated. In short, "radar is good at measuring distance and speed but not so good at telling stories (semantics)."
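To illustrate why ghosting needs algorithmic care, the sketch below implements one naive multipath heuristic: a double-bounce ghost often appears at roughly twice the range (and roughly twice the radial velocity) of a strong real target at a similar azimuth. The thresholds and the rule itself are our own illustrative simplification; real radars combine many such cues inside the tracker.

```python
# Naive double-bounce ghost flagging; the rule and thresholds are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Det:
    range_m: float
    azimuth_rad: float
    radial_velocity_mps: float
    rcs_dbsm: float

def flag_double_bounce_ghosts(dets: List[Det], strong_rcs_dbsm: float = 10.0,
                              rel_tol: float = 0.1, az_tol_rad: float = 0.05) -> List[bool]:
    """Return one flag per detection; True means 'suspected multipath ghost'."""
    flags = [False] * len(dets)
    strong = [d for d in dets if d.rcs_dbsm >= strong_rcs_dbsm]
    for i, d in enumerate(dets):
        for s in strong:
            if d is s:
                continue
            near_double_range = abs(d.range_m / max(s.range_m, 1e-6) - 2.0) < 2.0 * rel_tol
            similar_azimuth = abs(d.azimuth_rad - s.azimuth_rad) < az_tol_rad
            if abs(s.radial_velocity_mps) > 0.5:
                vel_consistent = abs(d.radial_velocity_mps / s.radial_velocity_mps - 2.0) < 2.0 * rel_tol
            else:
                vel_consistent = True
            if near_double_range and similar_azimuth and vel_consistent:
                flags[i] = True
    return flags
```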

Can Millimeter-Wave Radar Replace LiDAR?
This naturally leads us to a core topic of industry debate: can radar replace LiDAR? We've discussed this separately before (related reading: Can millimeter-wave radar replace LiDAR given that both produce point cloud data?). Here's a brief recap. The short answer is that radar is unlikely to completely replace LiDAR at present. A more accurate statement is that radar is becoming more like LiDAR (with improved resolution and imaging capability), but their physical characteristics mean they excel in different areas, and the more realistic short-term path is sensor fusion rather than single-sensor replacement. In many products and road tests, technical solutions combine cameras, radar, and LiDAR to cover each other's weaknesses. Companies like Waymo and Cruise, which operate robotaxi services, have used LiDAR and radar together across vehicle generations, supplemented by numerous cameras for semantic discrimination, demonstrating the value of multi-sensor fusion for robustness in extreme scenarios. On the other hand, players like Tesla, which emphasize a "vision + radar" or even pure-vision approach, argue that software and large-scale data annotation can compensate for hardware gaps. The industry remains sharply divided on this issue.
Since radar cannot fully replace LiDAR, can "imaging radar" technology narrow the gap enough for a radar-plus-camera system to approach the performance of a LiDAR-equipped one? Through MIMO (multiple-input multiple-output) antennas, sparse reconstruction, Doppler beam sharpening, synthetic aperture radar (SAR) concepts, and deep-learning-driven post-processing, radar can evolve from "outputting only a few targets" to "generating denser angle-range images." Recent papers and industry research show that more complex spatiotemporal signal processing and wider bandwidth can significantly improve lateral resolution and sidelobe suppression. This lets radar resolve previously indistinguishable details in specific scenarios, while micro-Doppler information provides new clues for classification. However, these solutions demand more computational resources, greater algorithmic robustness, and larger labeled datasets, and they still face challenges in extremely complex scenarios (e.g., occlusion, low-reflectivity small objects).
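As a rough intuition for how MIMO buys angular resolution, the snippet below compares the virtual aperture and beamwidth of a modest 3-transmit, 4-receive radar with a larger, purely illustrative imaging-radar layout, assuming the textbook half-wavelength element spacing.

```python
# Why MIMO boosts angular resolution: Nt transmitters x Nr receivers behave like
# a virtual array of Nt*Nr elements, enlarging the effective aperture (and so
# narrowing the beam) without physically adding that many channels.
import math

C = 3e8
FC = 77e9
LAM = C / FC

def virtual_aperture_m(n_tx: int, n_rx: int, spacing_m: float = LAM / 2) -> float:
    return (n_tx * n_rx - 1) * spacing_m

def beamwidth_deg(aperture_m: float) -> float:
    return math.degrees(LAM / aperture_m)

# A modest 3 Tx x 4 Rx radar vs. a larger (illustrative) imaging-radar layout.
for n_tx, n_rx in [(3, 4), (12, 16)]:
    ap = virtual_aperture_m(n_tx, n_rx)
    print(f"{n_tx}x{n_rx}: virtual elements={n_tx*n_rx}, "
          f"aperture~{ap*100:.1f} cm, beamwidth~{beamwidth_deg(ap):.1f} deg")
```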
From an engineering perspective, achieving LiDAR-like explicit reconstruction of spatial structure with imaging radar involves three bottlenecks. First is hardware: wider bandwidth, denser antenna arrays, and higher-linearity RF chains. Second is signal processing and algorithms: handling massive raw echoes and performing precise phase correction, motion compensation, and beamforming. Third is data and validation: large amounts of real-world data under varying weather and road conditions are needed to train and validate models and avoid edge-case failures. While a few manufacturers' demos have shown significant progress, achieving mass production, automotive-grade compliance, low power consumption, low cost, and stable operation across millions of vehicles remains a major engineering challenge.

Future Development Trends of Millimeter-Wave Radar
From an industry perspective, millimeter-wave radar has several clear development directions. First, frequency bands and bandwidth are converging toward 77/79GHz, where the industry's mainstream device ecosystem has matured. Second, hardware integration is increasing, with radio frequency front-ends, AD/DA converters, and DSPs being integrated into single chips or compact modules, reducing costs and size. Third, MIMO, beamforming, and multi-frequency multi-mode designs are becoming standard means to improve angular resolution. Fourth, software-defined radar (SDR) and more flexible waveform designs are gaining attention to address multi-sensor electromagnetic coexistence and anti-jamming requirements. Fifth, deep fusion of radar with other sensors is becoming a mass production route, evident in the strategies of many high-level autonomous driving companies and traditional Tier 1 suppliers.
So, how should product engineers and system designers approach the rapidly advancing millimeter-wave radar technology when designing sensor stacks? We at "Intelligent Driving Frontier" offer some insights. Radar can be positioned as a "stable motion and hazard warning layer," serving as the primary sensor in low-visibility or high-relative-speed scenarios, while cameras handle semantic discrimination (identifying pedestrians, traffic signs, lane lines), and LiDAR supplements radar's weaknesses in complex geometric scenarios, fine localization, and dense mapping. For cost-sensitive passenger vehicle platforms, manufacturers may weigh the use of LiDAR (currently still expensive for high-performance models) and consider stronger imaging radar + cameras as an alternative. For fully autonomous vehicle fleets or robotaxis requiring extremely high safety margins and redundancy, retaining LiDAR in the short term is advisable for more reliable geometric perception. Sensor selection must align with a company's positioning of autonomous driving capabilities, task boundaries, and acceptable risk models.
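One hypothetical way to make that role split explicit is a declarative sensor-stack configuration like the sketch below. The role names and the two example stacks are illustrative, not a recommendation for any specific platform.

```python
# Hypothetical sensor-stack configuration expressing the role split discussed above.
from dataclasses import dataclass
from enum import Enum, auto
from typing import List

class Role(Enum):
    MOTION_AND_HAZARD = auto()  # stable range/velocity, bad-weather fallback
    SEMANTICS = auto()          # pedestrians, signs, lane lines
    GEOMETRY = auto()           # dense 3D structure, fine localization, mapping

@dataclass
class Sensor:
    name: str
    roles: List[Role]
    degrades_in: List[str]      # conditions under which its output is down-weighted

cost_sensitive_stack = [
    Sensor("imaging_radar", [Role.MOTION_AND_HAZARD, Role.GEOMETRY], ["dense clutter"]),
    Sensor("camera", [Role.SEMANTICS], ["heavy rain", "fog", "backlight", "night"]),
]

robotaxi_stack = cost_sensitive_stack + [
    Sensor("lidar", [Role.GEOMETRY], ["heavy rain", "snow", "dust"]),
]
```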

Final Thoughts
Millimeter-wave radar is an extremely important foundational sensor in autonomous driving perception systems, providing "skeletal" spatial perception through stable distance and speed measurement, excellent resilience to harsh weather, and mass production-friendly engineering characteristics. Imaging radar and algorithmic advancements are narrowing the gap with LiDAR in resolution and semantic capabilities, but the physical characteristics of the two sensors remain complementary. In the short term, a more realistic path is "fusion rather than replacement."
-- END --