IEICE TRANSACTIONS on Communications

Open Access
Estimation of Drone Payloads Using Millimeter-Wave Fast-Chirp-Modulation MIMO Radar

Kenshi OGAWA, Masashi KUROSAKI, Ryohei NAKAMURA


Summary

With the development of drone technology, concerns have arisen about the possibility of drones being equipped with threat payloads for terrorism and other crimes. A drone detection system that can detect drones carrying payloads is therefore needed. Because a drone's propeller rotation frequency increases with payload weight, estimating the propeller rotation frequency is an effective way to detect the presence and weight of a payload. In this paper, we propose a method for classifying the payload weight of a drone by estimating its propeller rotation frequency from radar images obtained using a millimeter-wave fast-chirp-modulation multiple-input and multiple-output (MIMO) radar. For each drone model, the proposed method requires a pre-prepared reference dataset that establishes the relationship between payload weight and propeller rotation frequency. Two experimental measurement cases were conducted to investigate the effectiveness of our proposal. In case 1, we assessed four drones (DJI Matrice 600, DJI Phantom 3, DJI Mavic Pro, and DJI Mavic Mini) to determine whether the propeller rotation frequency of each drone could be correctly estimated. In case 2, experiments were conducted on a hovering Phantom 3 drone carrying several payloads in a stable position to calculate the accuracy of the payload weight classification. The experimental results indicated that the proposed method could estimate the propeller rotation frequency of every tested drone and classify payloads in 250 g steps with high accuracy.

Publication
IEICE TRANSACTIONS on Communications Vol.E107-B No.5 pp.419-428
Publication Date
2024/05/01
Online ISSN
1745-1345
DOI
10.23919/transcom.2023EBP3104
Type of Manuscript
PAPER
Category
Sensing

1.  Introduction

In recent years, drones have advanced rapidly and are widely used in various fields, such as security, surveying, delivery, photography, disaster response, and agriculture [1]. However, along with their growing use, concerns have arisen about the possibility that drones can be equipped with payloads of explosives, biological and chemical weapons, or illicit materials for terrorism and other crimes [2], [3]. Therefore, anti-drone systems must be able to detect the presence or absence of a payload and deal with payload-carrying drones on a priority basis. Drone detection technologies, including cameras, microphones, and radars, are being actively studied and developed. Radars are attracting significant attention as an effective drone detection technology because, unlike cameras and microphones, they are not affected by weather conditions [4].

Most studies on drone detection using radars rely on the micro-Doppler signatures generated by the rotation of drone propellers [5]-[9]; in these cited studies, drone models were classified based on differences in their micro-Doppler signatures. In addition, several studies have recently been conducted on the detection of drones carrying payloads using micro-Doppler signatures [10]-[13]. In [10], the micro-Doppler signatures of two types of drones with different payloads were obtained using W-band, C-band, and S-band frequency-modulated continuous-wave (FMCW) radars. Different micro-Doppler signatures were observed as the payload weight increased, and a payload weight classification algorithm based on micro-Doppler signatures was proposed. In particular, the W-band was found to be the preferred frequency band for payload classification using the FMCW radar. In [11], the micro-Doppler signatures of a drone with a payload were obtained using an S-band multistatic pulsed Doppler radar. In [12], a convolutional neural network was applied to the data acquired in [11], and payload weights were classified well. Of particular interest is a study on drones equipped with heavy payloads and dynamic payloads generating inertial forces, such as guns [13]. In that study, the micro-Doppler signatures of two types of drones were obtained using a K-band FMCW radar and a W-band continuous-wave radar. The authors discussed the effects of payloads on micro-Doppler signatures and showed that these signatures were inconsistent and not unique to the drones carrying the target payloads. Like [10] and [11], [12] used micro-Doppler signatures to achieve a highly accurate payload classification. In contrast, [13] reported that no unique micro-Doppler signatures could clearly distinguish between drones with and without a payload; hence, robust discrimination between payload and no payload is challenging. These results show that, depending on the radar specifications and measurement environment, payload estimation using micro-Doppler signatures may be difficult. Therefore, methods for estimating payload weights that do not rely on micro-Doppler signatures should be explored. References [13] and [14] revealed that the rotation frequency of a propeller increases with the payload weight due to the need for additional thrust, and that this increasing trend depends on the drone model. Therefore, combined with existing algorithms for classifying drone models, such trends can be used as a reference dataset for estimating payload weights.

In this paper, we propose a method for classifying the payload weight of a drone by estimating its propeller rotation frequency from radar images obtained using a millimeter-wave fast-chirp-modulation multiple-input and multiple-output (mmW FCM MIMO) radar. The proposed method requires a pre-prepared reference dataset that relates the payload weight to the propeller rotation frequency for each drone model. To the best of our knowledge, this is the first report of a radar-based payload estimation method that does not rely on micro-Doppler signatures. We previously studied the radar imaging of a drone using an mmW FCM MIMO radar in [15]; the results showed that the propeller rotation produced periodic variations in the signal intensity of the pixels corresponding to the propeller in radar images. The sampling period of an mmW FCM MIMO radar is short enough to observe a drone's propeller rotation. As revealed in [10], a millimeter-wave radar in the W-band is the preferred choice for payload estimation, and owing to its short wavelength it can capture reflections from small components such as drone propellers. The rotation frequency of a propeller can be estimated by applying a fast Fourier transform (FFT) to the signal intensity variations. To demonstrate the estimation of the propeller rotation frequency, we conducted measurement experiments on four drones: DJI Matrice 600, DJI Phantom 3, DJI Mavic Pro, and DJI Mavic Mini. Additionally, we performed experiments on a drone carrying several payloads in a stable position to investigate the effectiveness of the proposed method for estimating payload weights from the estimated rotation frequencies.

The rest of this paper is organized as follows. Section 2 explains the mmW FCM MIMO radar, radar imaging, and the payload weight estimation method. Section 3 presents our measurement results and discusses the effectiveness of our proposal. Finally, we summarize this paper in Sect. 4.

2.  Payload Weight Estimation

2.1  mmW FCM MIMO Radar

Figure 1 shows a diagram of the mmW FCM MIMO radar. The FCM radar transmits and receives a sinusoidal signal called a chirp, whose frequency is modulated over an ultrawide bandwidth with time. The modulation and observation times of a chirp are called fast time and slow time, respectively. A received chirp is mixed with a transmitted chirp to obtain the intermediate-frequency (IF) signal. The IF signal is sampled using an analog-to-digital converter for each receive antenna and stored in memory as multiple-input multiple-output (MIMO) channel data. The MIMO channel data, consisting of the IF signals of the radio channels between the transmit and receive antennas, are reconstructed into single-input multiple-output channel data of a contiguous virtual array (MIMO virtual array) [16]. The received matrix \(\boldsymbol{R}(n,m,l)\) obtained using the radar is a 3D data matrix (fast time \(\times\) MIMO virtual array \(\times\) slow time) that includes the propagation delay time, direction of arrival (DOA), and Doppler frequency. Here, \(N\) is the number of fast-time samples (\(n=1, 2, \cdots, N\)), \(M\) is the number of MIMO virtual array elements (\(m=1, 2, \cdots, M\)), and \(L\) is the number of slow-time samples (\(l=1, 2, \cdots, L\)).

Fig. 1  mmW FCM MIMO radar.

2.2  Radar Imaging Procedure

Figure 2 shows the signal processing flow for 2D radar image generation. A 2D FFT process is performed on the received matrix \(\boldsymbol{R}(n,m,l)\) to generate a 2D radar image (range-angle map). The distance from the radar to the object is estimated by performing FFT (range FFT) on the IF signal obtained by each element constituting the MIMO virtual array. The data matrix \(\boldsymbol{R}_{range}(r,m,l)\) after range FFT is as follows:

\[\begin{equation*} \boldsymbol{R}_{range}(r,m,l) = \frac{1}{N }\sum_{n=1}^N \boldsymbol{R}(n,m,l) e^{ -j2\pi f_n \frac{2r}{c} }, \tag{1} \end{equation*}\]

where \(r\) and \(c\) denote the range bin and the speed of light, respectively, and \(f_n\) is the frequency of the Fourier transform kernel.

Fig. 2  Flow of digital signal processing.
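As an illustration, Eq. (1) is a discrete Fourier transform of the IF signal over fast time, so it can be sketched with a standard FFT. The following minimal NumPy sketch is our own illustration, not the authors' implementation; the cube dimensions and data are placeholders.

```python
import numpy as np

# Hypothetical data-cube dimensions (not the values used in the paper).
N, M, L = 128, 12, 256   # fast-time samples, virtual array elements, slow-time samples

# Placeholder received matrix R(n, m, l): complex IF samples.
rng = np.random.default_rng(0)
R = rng.standard_normal((N, M, L)) + 1j * rng.standard_normal((N, M, L))

# Eq. (1): an FFT over fast time (axis 0) maps each IF signal to range bins;
# the 1/N factor matches the normalization in Eq. (1).
R_range = np.fft.fft(R, axis=0) / N   # shape: (range bins, M, L)
```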

A drone has many scattering points from its components, such as its body and propellers. The spatial resolution must be improved to obtain clear radar images. As shown in Fig. 2, we apply the Khatri-Rao (KR) product virtual array processing to the MIMO virtual array elements in each range bin [17]-[19] to improve the angular resolution. Here, assuming that \(K\) waves are observed using \(M\) uniform linear array (ULA) elements, the MIMO virtual array data \(\boldsymbol{R}_{range}(r_b,m,l)\) in a certain range bin \(r_b\) are as follows:

\[\begin{eqnarray*} \boldsymbol{R}_{range}(r_b,m,l) &=& \sum_{k=1}^K \boldsymbol{a}(\theta_{k})s_{k}(l)+\boldsymbol{n}(l) \nonumber\\ &=& \boldsymbol{As}(l)+\boldsymbol{n}(l) \tag{2} \\ \boldsymbol{A} &=& [\boldsymbol{a}(\theta_{1}), \boldsymbol{a}(\theta_{2}), \cdots, \boldsymbol{a}(\theta_{K})] \tag{3} \\ \boldsymbol{s}(l) &=& [s_{1}(l), s_{2}(l), \cdots, s_{K}(l)]^T, \tag{4} \end{eqnarray*}\]

where \(\boldsymbol{a}(\theta_{k})\in\mathbb{C}^M\) and \(s_{k}(l)\) denote the mode vector and complex amplitude of the \(k\)-th wave, respectively; \(\boldsymbol{A}\in\mathbb{C}^{M \times K}\) is the mode matrix; and \(\boldsymbol{n}(l)\) is the noise vector. The correlation matrix \(\boldsymbol{R}_C\) of the MIMO virtual array data \(\boldsymbol{R}_{range}(r_b,m,l)\) is as follows:

\[\begin{eqnarray*} \boldsymbol{R}_C &=& E[\boldsymbol{R}_{range}(r_b,m,l) \boldsymbol{R}^{H}_{range}(r_b,m,l)] \nonumber\\ &=& \boldsymbol{AS}\boldsymbol{A}^H+\boldsymbol{R}_N, \tag{5} \end{eqnarray*}\]

where \(E[]\) and \(^H\) denote ensemble averaging and the complex conjugate transpose, respectively; \(\boldsymbol{S}\) is the source correlation matrix; and \(\boldsymbol{R}_N\) is the noise correlation matrix. We also apply spatial smoothing processing (SSP) to this correlation matrix before the KR product virtual array processing to suppress the signal coherence of incident waves [20] because the correlation of incident waves leads to errors in virtual array signals [21]. The vectorization \(\boldsymbol{y}\) of the spatially smoothed correlation matrix \(\overline{\boldsymbol{R}_C}\) is as follows:

\[\begin{eqnarray*} \boldsymbol{y} &=& vec[\overline{\boldsymbol{R}_C}] \nonumber\\ &=& vec[\boldsymbol{A} \bar{\boldsymbol{S}} \boldsymbol{A}^H]+vec[\bar{\boldsymbol{R}}_N] \nonumber\\ &=& (\boldsymbol{A}^* \odot \boldsymbol{A})\boldsymbol{s}'+ vec[\bar{\boldsymbol{R}}_N], \tag{6} \end{eqnarray*}\]

where \(vec[]\) and \(^*\) are the vectorization operator and the complex conjugate, respectively; \(\odot\) denotes the KR product operator; \(\boldsymbol{s}' \in \mathbb{C}^K\) is the diagonal element of \(\bar{\boldsymbol{S}}\); and \((\boldsymbol{A}^* \odot \boldsymbol{A}) \in \mathbb{C}^{M^2 \times K}\) is the KR product virtual array response matrix. The vector \(\boldsymbol{y}\) contains repeated elements that do not contribute to increasing the aperture length. The nonrepeating elements of \(\boldsymbol{y}\) are extracted to obtain the KR virtual array data of \(2M-1\) elements, so the aperture length is virtually increased.
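As a concrete illustration of Eqs. (5) and (6), the sketch below builds a sample correlation matrix, applies forward-only SSP, and extracts the nonrepeating KR virtual array elements. It assumes a half-wavelength ULA and replaces the ensemble average with a sample average; all sizes and names are our own illustrative choices, not the authors' code.

```python
import numpy as np

def correlation_matrix(X):
    """Sample estimate of Eq. (5); X has shape (M, L): M elements, L snapshots."""
    M, L = X.shape
    return X @ X.conj().T / L

def spatial_smoothing(X, M_sub):
    """Forward SSP: average the correlation matrices of overlapping subarrays."""
    P = X.shape[0] - M_sub + 1
    return sum(correlation_matrix(X[p:p + M_sub]) for p in range(P)) / P

def kr_virtual_array(Rc):
    """Nonrepeating elements of y = vec(Rc) (Eq. (6)). For a ULA, entries of Rc
    with equal index difference correspond to the same virtual element (lag),
    so repeated entries are averaged per lag, yielding 2M-1 virtual elements."""
    M = Rc.shape[0]
    return np.array([np.mean(np.diag(Rc, -d)) for d in range(-(M - 1), M)])

# Example: K = 2 waves impinging on an M = 10 element half-wavelength ULA.
rng = np.random.default_rng(0)
M, L, angles = 10, 256, np.deg2rad([-20.0, 15.0])
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))   # mode matrix
S = rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))
X = A @ S + 0.1 * (rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L)))

Rc_bar = spatial_smoothing(X, M_sub=8)   # spatially smoothed correlation matrix
y_kr = kr_virtual_array(Rc_bar)          # 2*8 - 1 = 15 virtual array elements
```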

The DOA of reflected signals is estimated by performing a second FFT (angle FFT) over the indexes of the KR virtual array elements on all range bins of the data matrix \(\boldsymbol{R}_{KR}(r, m', l)\) after KR product virtual array processing. The radar image at the \(l\)-th slow time \(Image(r, a, l)\) generated after angle FFT is as follows:

\[\begin{equation*} \mathit{Image}(r, a, l)=\frac{1}{2M-1}\sum_{m'=1}^{2M-1} \boldsymbol{R}_{KR}(r, m', l) e^{-j\frac{2\pi(m'-1)}{2M-1}a}, \tag{7} \end{equation*}\]

where \(a\) is the angle bin and \(m'(=1, 2, \cdots, 2M-1)\) is the index of the virtual antennas after KR product virtual array processing.
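Continuing the sketch, Eq. (7) is again a standard FFT, this time over the virtual-element axis; applying it to every range bin yields the range-angle map. The array and its dimensions below are placeholders, under the same assumptions as above.

```python
import numpy as np

# Placeholder KR virtual array data R_KR(r, m', l): (range bins, 2M-1, slow time).
n_range, n_virt, L = 64, 19, 256
rng = np.random.default_rng(0)
R_KR = rng.standard_normal((n_range, n_virt, L)) \
       + 1j * rng.standard_normal((n_range, n_virt, L))

# Eq. (7): FFT over the virtual-element axis (axis 1) gives the angle bins.
image = np.fft.fft(R_KR, axis=1) / n_virt                 # Image(r, a, l)
power_db = 20 * np.log10(np.abs(image[:, :, 0]) + 1e-12)  # one slow-time frame
```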

2.3  Proposed Method

We investigated the effect of payload weight on the propeller rotation frequency of a drone (Sect. 2.3.1) and developed a payload weight estimation method using the results of this investigation (Sect. 2.3.2).

2.3.1  Reference Dataset for Payload Weight Estimation

The proposed method requires a reference dataset of the relationship between payload weight and propeller rotation frequency. Therefore, to show an example, we created a reference dataset for a hovering Phantom 3.

Figure 3 shows the environment for measuring the rotation frequency of the drone's propeller. The hovering Phantom 3 drone was suspended in the air using guide ropes and connected to a spring scale. Because the tension between the drone and the spring scale increases with the propeller rotation frequency, the spring-scale reading was used to emulate a payload weight applied to the drone. We measured the rotation frequency of the drone using a digital tachometer for 10 s when the spring scale read 0, 250, 500, 750, and 1000 g. In this study, we consider that estimating a rough weight is sufficient for detecting a threatening payload; therefore, measurement data were collected in 250 g steps.

Fig. 3  Measurement environment for generating reference dataset.

Figure 4 shows the measured relationship between the payload weight and the rotation frequency of the Phantom 3. The figure indicates that the propeller rotation frequency increases with the payload. Focusing on each payload, the frequency is not constant but fluctuates over a range of 16 to 17 Hz due to the drone's attitude control. These frequency variations do not overlap between payloads in 250 g steps, indicating that the payload can be uniquely determined if the rotation frequency is estimated using the radar. However, the variations overlap for steps below 250 g and may cause errors in the payload estimation. The measurement results obtained in 250 g steps were defined as the reference dataset for the payload weight estimation in this study.

Fig. 4  Relationship between payload weight and rotation frequency of Phantom 3.
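To make the lookup concrete, a minimal sketch of classifying an estimated rotation frequency against such a reference dataset is given below. The frequency bands are invented for illustration only and do not correspond to the measured values in Fig. 4.

```python
# Hypothetical reference dataset for one drone model: payload weight (g) ->
# non-overlapping band of propeller rotation frequencies (Hz). Invented values.
REFERENCE = {
    0:    (178.0, 190.0),
    250:  (192.0, 205.0),
    500:  (207.0, 219.0),
    750:  (221.0, 233.0),
    1000: (235.0, 247.0),
}

def classify_payload(freq_hz):
    """Return the payload weight whose band contains freq_hz, or None ("Other")."""
    for weight, (low, high) in REFERENCE.items():
        if low <= freq_hz <= high:
            return weight
    return None

print(classify_payload(197.0))   # -> 250
```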

2.3.2  Signal Processing

Figure 5 shows a flowchart of our proposed payload weight estimation method. The basis of this method is to find a pixel in a drone’s radar image that corresponds to the propeller and analyze the temporal variation of its signal intensity.

Fig. 5  Flowchart of payload weight estimation method.

First, a 2D radar image of a drone is acquired by the mmW FCM MIMO radar. As an example, the 2D radar image of the Phantom 3 is shown in Fig. 6. The characteristic shape of the drone could be imaged; specifically, the maximum peak at (0.1, 1.8 m) was an echo from the drone's body, and the peaks at (\(-0.2\), 1.7 m) and (0.15, 1.6 m) were echoes from the left and right propellers, respectively. Thus, a drone's propeller is the strongest reflection point after the body. Therefore, the initial sampling point \((r_p, a_p)\) for the propeller is the pixel of the second-largest peak intensity in the radar image, and the second and subsequent sampling points are obtained from the same coordinates. Since the reflection intensity of the pixel corresponding to the propeller fluctuates periodically with the propeller rotation, the propeller rotation frequency is estimated by performing FFT on this intensity fluctuation. With the propeller position coordinates in the radar image denoted as \((r_p, a_p)\), the frequency spectrum \(\boldsymbol{F}(r_p, a_p, f)\) used for the estimation is as follows:

\[\begin{equation*} \boldsymbol{F}(r_p, a_p, f)=\frac{1}{L} \sum_{l=1}^L Image(r_p, a_p, l) e^{-j\frac{2\pi(l-1)}{L}f}, \tag{8} \end{equation*}\]

where \(f\) is the frequency. The propeller rotation frequency must exceed a certain threshold for a drone to take off, so a frequency gate is set on the FFT-calculated frequency spectrum to estimate the propeller rotation frequency. The propeller rotation frequency at takeoff differs among drones due to differences in their specifications, such as drone weight and motor power. Therefore, the frequency gate depends on the drone model and should be adjusted appropriately for each drone. For example, in the case of the Phantom 3, the frequency gate was set to 150 Hz or higher because its takeoff requires a propeller rotation frequency of at least 150 Hz. Within this gate, the dominant frequency component is due to the propeller rotation, and the peak frequency is sequentially stored in memory as a provisional estimate of the propeller rotation frequency. Next, since the propeller rotation frequency varies with time due to disturbances, these provisional estimates are evaluated using a histogram of 300 samples, and the mode of the histogram is used as the final estimate of the drone's propeller rotation frequency. A small sample size is preferred for the histogram because a large number of samples lengthens the observation time, allowing disturbances to distort the distribution; therefore, the sample size was set to an empirically derived value of 300. Finally, the payload weight is estimated by comparing the estimated propeller rotation frequency with the reference dataset.

Fig. 6  Example of 2D radar image of Phantom 3.
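A minimal sketch of this per-window provisional estimation (Eq. (8) followed by the frequency gate and peak search) is shown below. The gate value follows the Phantom 3 example above; the function itself is our illustrative reconstruction, not the authors' code.

```python
import numpy as np

def provisional_estimate(intensity, t_pri, f_gate):
    """Provisional propeller rotation frequency from one slow-time FFT window.

    intensity: real-valued signal intensity |Image(r_p, a_p, l)| over the window
    t_pri:     pulse repetition interval in seconds (slow-time sampling period)
    f_gate:    lower frequency gate in Hz (e.g., 150 Hz for the Phantom 3)
    """
    spectrum = np.abs(np.fft.rfft(intensity - np.mean(intensity)))  # DC removed
    freqs = np.fft.rfftfreq(len(intensity), d=t_pri)
    gated = freqs >= f_gate                      # frequency gate
    return freqs[gated][np.argmax(spectrum[gated])]

# Example: a 197 Hz tone sampled at a 0.97 ms interval over 256 samples.
t_pri = 0.97e-3
l = np.arange(256)
sig = 1.0 + 0.5 * np.cos(2 * np.pi * 197.0 * l * t_pri)
print(provisional_estimate(sig, t_pri, f_gate=150.0))   # ~197 Hz (4 Hz bins)
```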

3.  Experimental Setup and Results

3.1  Experimental Setup

We measured propeller rotation frequencies in two experimental measurement cases using an mmW FCM MIMO radar module. Case 1 involved four drones (Matrice 600, Phantom 3, Mavic Pro, and Mavic Mini) without payloads. Case 2 involved a Phantom 3 with several payload weights. Table 1 shows the specifications of the mmW FCM MIMO radar module. The MIMO radar, which is composed of a 3\(\times\)4 ULA as shown in Fig. 7, provides a MIMO virtual array of 12 elements. Subarrays of 10 elements \((=M)\) were selected from the MIMO virtual array and used for SSP to suppress the coherence of the echoes from each target. The application of KR product virtual array processing increased the number of virtual elements to 19 \((=2M-1)\), giving an angular resolution of 6.0 degrees. The frequency bandwidth was 3.44 GHz, resulting in a range resolution of 4.4 cm. The number of slow-time samples was 256 (0.25 s), resulting in a frequency resolution of 4 Hz. The pulse repetition interval was set to 0.97 ms, which was fast enough for observing the propeller rotation.

Table 1  Specifications of mmW FCM MIMO radar module.

Fig. 7  MIMO radar.
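The derived values quoted above can be checked from the stated parameters. The short sketch below reproduces them; the 6.0 degree figure assumes a half-wavelength virtual ULA of 19 elements, which is our reading of the setup.

```python
import numpy as np

c = 3.0e8            # speed of light, m/s
bandwidth = 3.44e9   # chirp frequency bandwidth, Hz
t_pri = 0.97e-3      # pulse repetition interval, s
n_slow = 256         # slow-time samples per FFT window
n_virt = 19          # KR virtual array elements (2M - 1)

range_res = c / (2 * bandwidth)                 # ~0.044 m  (4.4 cm)
freq_res = 1 / (n_slow * t_pri)                 # ~4.0 Hz   (window of ~0.25 s)
angle_res = np.degrees(np.arcsin(2 / n_virt))   # ~6.0 deg for half-wavelength ULA
print(f"{range_res * 100:.1f} cm, {freq_res:.1f} Hz, {angle_res:.1f} deg")
```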

In case 1, we assessed four drones with different shapes, sizes, numbers of rotors, and propeller geometries, as shown in Table 2, and investigated whether the propeller rotation frequency of each drone could be estimated correctly. Each target was placed on a low-density styrofoam cylinder with its propellers rotating, as shown in Fig. 8(a). The antenna height was set to the height of the drone body. The distance between the radar and the target was adjusted for each drone so that the entire drone, including its propellers, was covered by the antenna beam. Each drone was positioned so that one propeller was closest to the radar in order to observe the echoes from the propeller with the maximum signal-to-noise ratio.

Table 2  Tested drones.

Fig. 8  Measurement environments.

In case 2, we tested the Phantom 3 with several payload weights \(W\) (= 0, 250, 500, 750, 1000 g) using the spring scale (Sect. 2.3.1) to investigate the effectiveness of the proposed payload weight estimation approach. The hovering target was suspended in the air using guide ropes to prevent it from flying outside the antenna beam, as shown in Fig. 8(b). The target was positioned so that its camera faced the front, as shown in Fig. 3. Since drones are expected to enter the radar coverage area at various flight altitudes, we evaluated the accuracy of the payload weight estimation method at different antenna elevation angles \(\theta\) (= 0\(^\circ\), 10\(^\circ\), 20\(^\circ\), and 30\(^\circ\)).

3.2  Experimental Results and Discussion
3.2.1  Case 1

Figure 9 shows the Phantom 3 measurement results. Several strong echoes are seen in Fig. 9(a): the strong peak at (0, 1.2 m) is an echo from the body, and the peaks at (0, 1.0 m), (\(-0.30\), 1.1 m), and (0.25, 1.2 m) are echoes from the propellers. Since the rear propeller was obscured by the body, no echo from it is observed. Figure 9(b) shows the waveform of the signal intensity fluctuation due to a propeller. This waveform was generated through time-series sampling of the signal intensity of the (0, 1.0 m) pixel, which corresponds to a propeller in the 2D radar image; the DC component of the waveform was removed. The waveform amplitude fluctuates due to changes in the radar cross section during propeller rotation. The fluctuation period is related to the propeller rotation speed, and similar periodic fluctuations were observed for the other tested drones.

Fig. 9  Phantom 3 measurement results.

Figure 10 shows the frequency spectrum of the time waveform of each drone. The frequency corresponding to the maximum value in the spectrum, denoted by \(\blacktriangledown\) in the figure, is the estimated propeller rotation frequency. The true value of the propeller rotation frequency was measured using a digital tachometer. Figures 10(a), (c), and (d) show strong peaks in the low-frequency region (under 50 Hz). These peaks may have been caused by the vibration of the drone arms due to propeller rotation; each drone was placed on the styrofoam cylinder, so the drone body itself could not have caused the vibration. Arm vibration is a characteristic unique to drones that have separate bodies and arms, such as the Matrice 600, Mavic Pro, and Mavic Mini. These peaks can be removed through filter processing using a high-pass filter or by setting a frequency gate. Figure 11 shows the estimated and measured rotation frequencies for each drone. Measurements were obtained for 556 slow-time samples, and a total of 300 estimates of the propeller rotation frequency were then obtained by performing the FFT while sliding a 256-sample window over the measured data one sample at a time. As Fig. 11 shows, in Case 1, where there are almost no fluctuations other than those caused by the propeller, the propeller rotation frequency can be estimated with an error of less than a few hertz for all tested drones. The main factor that causes the estimated value to vary more than the true value is the estimation error caused by the FFT.

Fig. 10  Examples of propeller rotation frequency spectra.

Fig. 11  Estimation results of the propeller rotation frequency for each drone.

3.2.2  Case 2

Measurements were obtained for 556 slow-time samples, and a total of 300 propeller rotation frequency estimates were obtained by performing the FFT while sliding a 256-sample window over the measured data one sample at a time. Figure 12 shows an example of the signal intensity waveform in Case 2, which exhibits more irregular fluctuation components than the Case 1 waveform shown in Fig. 9(b). This irregularity is attributed to the shaking and vibration of the drone's body during hovering. The frequency spectrum of the waveform in Fig. 12 is shown in Fig. 13, along with the corresponding estimated propeller rotation frequency (\(\blacktriangledown\)). In addition to the peak representing the propeller rotation frequency (cf. Fig. 10(b)), the spectrum has a large peak in the low-frequency region, attributed to the shaking and vibration of the drone body. However, the propeller rotation frequency can still be estimated by performing a peak search after passing the frequency spectrum through a frequency gate, as in Case 1. Figure 14 shows the provisional estimates of the propeller rotation frequency at each payload weight. The blue circles (\(\circ\)) and red crosses (\(\times\)) in the graphs denote estimates that are correct and incorrect, respectively, with respect to the reference dataset. Figure 14 indicates that the estimates increase with the payload weight, consistent with Fig. 4. In addition, the correct estimates (blue circles) at each payload weight vary due to temporal changes in the propeller rotation frequency caused by drone attitude control. The incorrect estimates (red crosses) correspond to frequencies that are too low or too high to maintain the drone's hovering state. These misestimates may have been caused by random disturbances, such as body sway due to attitude control or body vibration due to propeller rotation.

Fig. 12  An example of the signal intensity waveform in Case 2.

Fig. 13  An example for the estimation result of the propeller rotation frequency in Case 2.

Fig. 14  Provisional estimates of propeller rotation frequency vs. payload weight.

The histogram of provisional estimates was evaluated to determine the final estimate of the propeller rotation frequency, thus avoiding the abovementioned misestimates. When the propeller rotates at a frequency close to a frequency boundary in the reference dataset, the estimation accuracy of the propeller rotation frequency is affected by the bin size of the histogram. In this study, the bin size was set to 1 Hz to align with the measurement resolution of the digital tachometer. As an example, Fig. 15 shows the histogram of the provisional estimates at an elevation angle \(\theta = 20^\circ\) and a payload weight \(W\) = 250 g. Most of the provisional estimates are at approximately 197 Hz, which is within the frequency range of the reference dataset at \(W\) = 250 g. However, approximately 30% of the estimates fall outside this range and would lead to payload weight misestimation if used individually. Therefore, 197 Hz, the value with the highest occurrence probability, is taken as the propeller rotation frequency in our experiment. The final estimates presented in Fig. 14 correspond to the propeller rotation frequencies determined using the modes of the corresponding histograms.

Fig. 15  Histogram of provisional estimates (\(\theta = 20^\circ\), \(W\) = 250 g).
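A sketch of this finalization step, assuming the 1 Hz bin width stated above and a list of per-window provisional estimates, might look as follows; the mode-of-histogram logic and variable names are our own illustration.

```python
import numpy as np

def final_estimate(provisional_hz, bin_width=1.0):
    """Mode of the provisional estimates, histogrammed with 1 Hz bins."""
    est = np.asarray(provisional_hz)
    edges = np.arange(est.min(), est.max() + 2 * bin_width, bin_width)
    counts, edges = np.histogram(est, bins=edges)
    i = np.argmax(counts)
    return 0.5 * (edges[i] + edges[i + 1])   # center of the most populated bin

# Example: 300 provisional estimates clustered near 197 Hz, ~30% outliers.
rng = np.random.default_rng(0)
est = np.concatenate([197 + rng.normal(0, 0.5, 210), rng.uniform(150, 260, 90)])
print(f"{final_estimate(est):.0f} Hz")   # -> ~197 Hz
```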

Each payload weight is classified by comparing the propeller rotation frequency determined from the histogram with the corresponding value in the reference dataset in Fig. 4. Table 3 shows the payload weight classification results at each antenna elevation angle. Each column (row) in the table represents the instances of the estimated (actual) payload weights. “Other” means that the payload weight could not be estimated because the estimated propeller rotation frequency was outside the range of the reference dataset. To evaluate the accuracy of the payload classification, we performed 100 classification runs by taking a 54-second measurement (55600 slow-time samples) and dividing the measured data into 100 segments (556 slow-time samples per segment). Each cell in the table represents the probability over the 100 classification runs for each actual payload weight \(W\). The blue cells represent the probability of correct classification, which is defined as the classification accuracy. The average classification accuracy over all blue cells is more than 94.4% at each elevation angle. The results show that the proposed method can accurately classify most of the payload weights, and there is almost no difference in the average classification accuracy between elevation angles.

Table 3  Payload weight classification results.

In Table 3(b), 22% of the runs are classified as “Other” at the actual payload weight \(W\) = 250 g. This is probably because the \(W\) = 250 g case caused more body shaking and vibration than the other cases, thereby affecting the original signal intensity fluctuations of the propeller. Tables 3(a), (c), and (d) show misclassifications in which the payload is classified as lighter or heavier than its actual weight. These misclassifications occurred irregularly for all weights and elevation angles, indicating the absence of a consistent error trend that depends on the elevation angle. Their main causes are sudden random body shaking and frequency estimation errors caused by the FFT. Below, we discuss the FFT estimation error in detail.

We investigated the effect of the FFT window length on the estimation accuracy. Figure 16 shows the classification accuracy for different FFT window lengths (\(L_{FT}\)), where \(\circ\), \(\times\), and \(+\) indicate the classification accuracy for \(\theta = 0^\circ\) and \(W\) = 1000 g, for \(\theta = 20^\circ\) and \(W\) = 0 g, and for \(\theta = 30^\circ\) and \(W\) = 500 g, respectively. Figure 16 confirms that the classification accuracy improves with longer window lengths because the frequency resolution increases with the window length. Figure 17 shows the average classification accuracy over all elevation angles and payload weights. The figure indicates that the average classification accuracy degrades for \(L_{FT} = 512\) despite the improved frequency resolution compared with \(L_{FT} = 256\). With longer window lengths, the effects of drone body shaking and vibration are more likely to appear in the signal intensity waveform used to estimate the propeller rotation frequency. Since the frequency components due to these disturbances became the mode of the histogram, the average classification accuracy declined. Therefore, a trade-off exists between the effect of disturbances and the frequency resolution, and setting a window length that accounts for the effect of disturbances is important for the proposed method.

Fig. 16  Comparison of classification accuracy for different FFT window lengths.

Fig. 17  Average classification accuracy for different FFT window lengths.

4.  Conclusions

In this paper, we proposed a method for classifying the payload weight of a drone by estimating the propeller rotation frequency from radar images obtained using an mmW FCM MIMO radar. The proposed method requires a pre-prepared reference dataset that relates the payload weight to the propeller rotation frequency for each drone model. Two experimental measurement cases were conducted to investigate the effectiveness of our proposal. In case 1, we tested four drones to determine whether the propeller rotation frequency of each drone could be correctly estimated. The experimental results showed that the propeller rotation frequencies of all drones could be estimated. In case 2, measurement experiments were conducted on a hovering drone with five different payloads in a stable position to evaluate the accuracy of the payload weight classification. The results revealed that the proposed method could classify the payloads in 250 g steps with an average accuracy of more than 94.4%. However, when the FFT window length for estimating the propeller rotation frequency was increased too far, the classification accuracy decreased due to the increased influence of disturbances. Therefore, an appropriate window length should be set for accurate classification.

In the future, we plan to investigate the possibility of classifying payloads on moving drones at longer ranges. Moreover, we aim to implement algorithms that are robust to disturbances, such as body shaking and vibration.

Acknowledgments

This work was supported by JSPS KAKENHI Grant Number 21K04102.

References

[1] Research Briefs, “38 Ways Drones Will Impact Society: From Fighting War To Forecasting Weather, UAVs Change Everything,” Dec. 17, 2019, https://www.cbinsights.com/research/drone-impact-society-uav/, accessed Feb. 15, 2023.

[2] J.P. Yaacoub, H. Noura, O. Salman, and A. Chehab, “Security analysis of drones systems: Attacks, limitations, and recommendations,” Internet of Things, vol.11, p.100218, Sept. 2020. DOI: 10.1016/j.iot.2020.100218

[3] E. Vattapparamban, I. Guvenc, A.I. Yurekli, K. Akkaya, and S. Uluagac, “Drones for smart cities: Issues in cybersecurity, privacy, and public safety,” Proc. 2016 International Wireless Communications and Mobile Computing Conference, Paphos, Cyprus, pp.216-221, Sept. 2016. DOI: 10.1109/IWCMC.2016.7577060

[4] A. Coluccia, G. Parisi, and A. Fascista, “Detection and classification of multirotor drones in radar sensor networks: A review,” Sensors, vol.20, no.15, p.4172, July 2020. DOI: 10.3390/s20154172

[5] J.J.M. de Wit, R.I.A. Harmanny, and G. Premel-Cabic, “Micro-Doppler analysis of small UAVs,” Proc. 9th European Radar Conference (EuRAD), Amsterdam, Netherlands, pp.210-213, Oct. 2012.

[6] P. Molchanov, K. Egiazarian, J. Astola, R.I. Harmanny, and J.J.M. de Wit, “Classification of small UAVs and birds by micro-Doppler signatures,” Proc. 10th European Radar Conference (EuRAD), Nuremberg, Germany, vol.6, no.3-4, pp.172-175, Oct. 2013. DOI: 10.1017/S1759078714000282

[7] S. Rahman and D.A. Robertson, “Millimeter-wave micro-Doppler measurements of small UAVs,” Proc. SPIE Defense + Security, Radar Sensor Technology XXI, Anaheim, CA, USA, vol.10188, pp.307-315, May 2017. DOI: 10.1117/12.2261942

[8] S. Rahman and D.A. Robertson, “Multiple drone classification using millimeter-wave CW radar micro-Doppler data,” Proc. SPIE Defense + Commercial Sensing, Radar Sensor Technology XXIV, vol.11408, pp.50-57, April 2020. DOI: 10.1117/12.2558435

[9] M. Kurosaki, K. Ogawa, R. Nakamura, and H. Hadama, “Experimental study on multiple drone detection using a millimeter-wave fast chirp MIMO radar,” Proc. 2023 IEEE Topical Conference on Wireless Sensors and Sensor Networks (WiSNet), Las Vegas, NV, USA, pp.16-19, Jan. 2023. DOI: 10.1109/WiSNeT56959.2023.10046220

[10] D. Dhulashia, N. Peters, C. Horne, P. Beasley, and M. Ritchie, “Multi-frequency radar micro-Doppler based classification of micro-drone payload weight,” Front. Signal Process., vol.1, p.781777, Dec. 2021. DOI: 10.3389/frsip.2021.781777

[11] M. Ritchie, F. Fioranelli, H. Borrion, and H. Griffiths, “Multistatic micro-Doppler radar feature extraction for classification of unloaded/loaded micro-drones,” IET Radar, Sonar & Navigation, vol.11, no.1, pp.116-124, Jan. 2017. DOI: 10.1049/iet-rsn.2016.0063

[12] J.S. Patel, C. Al-Ameri, F. Fioranelli, and D. Anderson, “Multi-time frequency analysis and classification of a micro-drone carrying payloads using multistatic radar,” The Journal of Engineering, vol.2019, no.20, pp.7047-7051, Oct. 2019. DOI: 10.1049/joe.2019.0551

[13] S. Rahman, D.A. Robertson, and M.A. Govoni, “Radar signatures of drones equipped with heavy payloads and dynamic payloads generating inertial forces,” IEEE Access, vol.8, pp.220542-220556, Dec. 2020. DOI: 10.1109/ACCESS.2020.3042798

[14] O.A. Ibrahim, S. Sciancalepore, and R.D. Pietro, “Noise2Weight: On detecting payload weight from drones acoustic emissions,” Future Generation Computer Systems, vol.134, pp.319-333, Sept. 2022. DOI: 10.1016/j.future.2022.03.041

[15] K. Ogawa, M. Kurosaki, R. Nakamura, and H. Hadama, “2D imaging of a drone using a millimeter-wave fast chirp MIMO radar based on Khatri-Rao product virtual array processing,” Proc. 2023 IEEE Topical Conference on Wireless Sensors and Sensor Networks (WiSNet), Las Vegas, NV, USA, pp.1-4, Jan. 2023. DOI: 10.1109/WiSNeT56959.2023.10046224

[16] J. Li and P. Stoica, MIMO Radar Signal Processing, Wiley-IEEE Press, 2008.

[17] W.K. Ma, T.H. Hsieh, and C.Y. Chi, “DOA estimation of quasi-stationary signals via Khatri-Rao subspace,” Proc. 2009 IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, Taiwan, pp.2165-2168, April 2009. DOI: 10.1109/ICASSP.2009.4960046

[18] W.K. Ma, T.H. Hsieh, and C.Y. Chi, “DOA estimation of quasi-stationary signals with less sensors than sources and unknown spatial noise covariance: A Khatri-Rao subspace approach,” IEEE Trans. Signal Process., vol.58, no.4, pp.2168-2180, April 2010. DOI: 10.1109/TSP.2009.2034935

[19] H. Yamada, N. Ozawa, Y. Yamaguchi, K. Hirano, and H. Ito, “Angular resolution improvement of ocean surface current radar based on the Khatri-Rao product array processing,” IEICE Trans. Commun., vol.E96-B, no.10, pp.2469-2474, Oct. 2013. DOI: 10.1587/transcom.E96.B.2469

[20] S.U. Pillai and B.H. Kwon, “Forward/backward spatial smoothing techniques for coherent signal identification,” IEEE Trans. Acoust., Speech, Signal Process., vol.37, no.1, pp.8-15, Jan. 1989. DOI: 10.1109/29.17496

[21] S. Shirai, H. Yamada, and Y. Yamaguchi, “A novel DOA estimation error reduction preprocessing scheme of correlated waves for Khatri-Rao product extended-array,” IEICE Trans. Commun., vol.E96-B, no.10, pp.2475-2482, Oct. 2013. DOI: 10.1587/transcom.E96.B.2475

Authors

Kenshi OGAWA
  National Defense Academy of Japan

received the B.E. and M.E. degrees in Information Engineering from The University of Kitakyushu, Japan, in 2016 and 2018, respectively. Since 2022, he has been a student at the Graduate School of Science and Engineering, National Defense Academy of Japan, where he is working toward his D.E. degree, focusing on microwave/millimeter-wave radio propagation and radar systems. He is a student member of the IEICE and IEEE.

Masashi KUROSAKI
  National Defense Academy of Japan

received the B.E. and M.E. degrees from the Graduate School of Science and Engineering, National Defense Academy of Japan, in 2017 and 2023, respectively. Until 2023, he worked on microwave/millimeter-wave radio propagation and radar systems.

Ryohei NAKAMURA
  National Defense Academy of Japan

received the B.E., M.E., and D.E. degrees in information engineering from The University of Kitakyushu, Japan, in 2009, 2011, and 2014, respectively. In 2014, he became a research associate in the Department of Communication Engineering, National Defense Academy of Japan, where he has been an Associate Professor since 2020. His major research interests include wireless communications, microwave/millimeter-wave radio propagation, radar sensor systems, and network systems. He is a member of the IEICE and IEEE.
