Machine Learning-Based Tilt Correction for Millimeter-Wave Radar Measurements
Yu Wang, Yang Wang, JunFu Zhang*, Qing Tu
Email: xhzhangjunfu@163.com
Abstract
To address inclination errors in radar water level measurement, this study investigates the fitting accuracy of three model algorithms—Random Forest, GBDT, and XGBoost—using an 80 GHz millimeter-wave radar. Based on machine-learning nonlinear regression and error-correction principles, the results show that machine learning significantly enhances the stability and reliability of data acquisition in radar water level measurement. XGBoost performed best, achieving a relative improvement of up to 61.18% over the other two algorithms under identical conditions. These findings provide guidance for enhancing the reliability of radar water level measurements in complex field environments.
Keywords:
Millimeter-wave radar
Water surface measurement
GBDT
Random forest
XGBoost
Inclination correction
1. Introduction
In the field of water flow measurement research, upstream water level changes serve as a critical hydrological factor, playing an essential role in predicting flow increases in downstream rivers and channels [1, 2]. Downstream flood prevention measures or ecological operations must be completed within hours to avoid losses caused by upstream water level changes [3], making precise and rapid measurement of upstream water level and flow variations critically important [4, 5]. Currently, ADCP, ultrasonic flow measurement, and radar flow measurement are the most widely used flow measurement methods [6–8]. Due to complex terrain and uncontrollable environmental factors—such as rugged topography, intricate flow conditions, extreme weather, and limited installation space—these measurement devices face stringent requirements [9–12]. Radar flow meters, with their non-contact operation, high accuracy, and long service life, represent the optimal choice for field measurements of water level and flow [13].
Rapidly sloping channels and surge waves caused by changes in water kinetic energy can shift the reflection path of radar beams when measuring water surfaces [14–16]. This leads to insufficient spatio-temporal resolution and accuracy in monitoring upstream water level changes, resulting in delayed forecasts that adversely affect the lives of downstream residents. To address this, Anandan et al. [17] investigated the correlation with GPS data at different tilt angles, ultimately determining the optimal measurement tilt angle through statistical analysis. Li, Slobbe, and Lhermitte [18] proposed the LEPTA method to correct systematic errors caused by the slope effect. Sun et al. [19–23] designed novel correction models for this purpose. Qian et al. [24, 25] presented a "detection-quantification-compensation" full-link FPGA engineering solution, addressing the challenge of real-time nonlinear correction for LFMCW signals. Zhu [26] proposed optimizing reflectivity observations. While these methods mitigate inclination measurement errors to some extent, detection results still exhibit nonlinearity and randomness, demanding stringent data quality control (QC) [27]. Machine learning, through model training, enables predictive fitting of nonlinear data to rapidly and accurately achieve desired outcomes [28–33]. For instance, Sun, Opu, and Lee et al. [34–36] integrated machine learning models into their respective domains, achieving accuracies of 93.6%, 96.71%, and 99.2%, respectively.
In the current literature, few studies have incorporated machine learning into the inclination correction of water level measurements. On this basis, this paper introduces machine learning to address nonlinear inclination errors in water level measurements. Among candidate algorithms, Random Forest, Gradient Boosting Decision Trees (GBDT), and XGBoost have demonstrated outstanding predictive performance [37–39]. Therefore, these three algorithms are selected to fit and correct the measurement data. The study explores how each algorithm enhances radar inclination measurements, thereby improving the reliability and stability of data when surveying steep-sloped water channels.
2 Experimental Section
2.1 Measurement Equipment and Data Acquisition
The measurements employ an 80 GHz millimeter-wave radar. A water tank simulates natural stream-flow conditions, and an offset-angle measurement device rotates the radar to mimic measurement scenarios in steep-sloped field channels. The radar water level gauge acquisition system converts the radar signals collected by the probe into decimal values and stores them, as shown in Fig. 1 below.
Fig. 1
Equipment Configuration
During data acquisition, the radar detector collects data at different tilt angles at the same height using an angle offset measurement device. After completing data collection at a given height, the lifting device adjusts the detector's elevation. The angle offset device is then reapplied to collect tilt data at the new height. This process repeats until the detector completes data acquisition.
The radar data training set spans heights from 1200 mm to 1800 mm, grouped in 50 mm increments for a total of 13 height sets. Each height set measures 10 tilt-angle groups ranging from 1° to 10°. Within each tilt-angle group, measurements are taken every 5 seconds, totaling 100 measurements per group, as shown in Table 1 below.
Table 1
Training Set Collection Details
Height Range (mm) | Height Interval (mm) | Tilt Angle Range at Same Height (°) | Measurement Count at Same Tilt Angle | Measurement Interval (s)
1200–1800 | 50 | 1–10 | 100 | 5
The radar data test set was collected at heights ranging from 1850 mm to 2000 mm, grouped in 50-mm increments, totaling four height groups. Each height group measured ten tilt angle sets ranging from 1° to 10°, with each tilt angle set measured once every 5 seconds, totaling 100 measurements per set, as shown in Table 2.
Table 2
Test Set Collection Details
Height Range (mm) | Height Interval (mm) | Tilt Angle Range at Same Height (°) | Measurement Count at Same Tilt Angle | Measurement Interval (s)
1850–2000 | 50 | 1–10 | 100 | 5
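The two acquisition grids above can be sketched in a few lines of Python (a hypothetical helper for illustration; the naming is ours, not the acquisition system's):

```python
# Hypothetical helper enumerating the acquisition grids of Tables 1 and 2.

def make_grid(h_start_mm, h_stop_mm, h_step_mm, angles_deg):
    """Enumerate (height_mm, tilt_deg) measurement groups."""
    heights = range(h_start_mm, h_stop_mm + 1, h_step_mm)
    return [(h, a) for h in heights for a in angles_deg]

# Training set: 13 heights x 10 tilt angles = 130 groups of 100 readings each.
train_groups = make_grid(1200, 1800, 50, range(1, 11))
# Test set: 4 heights x 10 tilt angles = 40 groups.
test_groups = make_grid(1850, 2000, 50, range(1, 11))

print(len(train_groups), len(test_groups))  # → 130 40
```

Each of the resulting groups then holds 100 readings taken at 5 s intervals.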
2.2 Analysis of Collected Data
To gain a more intuitive understanding of the trends in measurement data, the average values collected at different elevation levels and tilt angles within the training set were plotted as a 3D scatter plot, as shown below.
Fig. 2
Data Acquisition
In Fig. 2, the X-axis represents the theoretical (true) values of the collected data, the Z-axis represents the measured values, and the Y-axis represents the difference between the theoretical and measured values. The figure reveals that residuals are relatively concentrated at lower heights, such as between 1200 and 1500 mm. As the radar probe rises to 1600 mm or more above the water surface, residual values become increasingly dispersed. This indicates that the greater the distance between the radar probe and the water surface, the more significant the influence of the tilt angle on the measurement data, and the larger the measurement error. At the same height, when the radar sensor's tilt angle ranges from 1° to 6°, residual values cluster closely, with variations not exceeding ±5 mm, indicating minimal acquisition error. As the tilt angle increases further, residual values gradually rise and fluctuate more strongly, with residuals reaching as high as ±20 mm. At this point, the acquired data exhibit significant error.
Thus, under normal conditions, the measurement error grows non-linearly with the tilt angle between the radar sensor and the water surface, with larger tilt angles producing greater error increments.
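As a rough intuition for why the error accelerates with tilt, consider a simple slant-range model (our assumption for illustration, not the paper's error model): a beam tilted by θ over a flat surface travels h/cos θ instead of h, so the over-read is h(1/cos θ − 1), which grows non-linearly in θ:

```python
import math

# Simple slant-range model (an illustrative assumption, not the paper's
# error model): a beam tilted by theta over a flat surface travels
# h / cos(theta), so the over-read is h * (1/cos(theta) - 1).

def slant_error_mm(height_mm, tilt_deg):
    theta = math.radians(tilt_deg)
    return height_mm * (1.0 / math.cos(theta) - 1.0)

# At h = 1800 mm the over-read accelerates as the tilt grows.
for tilt in (1, 3, 6, 10):
    print(tilt, round(slant_error_mm(1800, tilt), 2))
```

Under this toy model the over-read at 1800 mm stays near 10 mm up to 6° but roughly triples by 10°, qualitatively matching the residual growth seen in Fig. 2.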
2.3 Data Preprocessing
Due to environmental factors, detection data may contain significant noise. To mitigate the impact of noise on the modeling process, data denoising is required prior to modeling. Denoising enhances algorithm convergence speed, improves algorithm performance, strengthens model generalization capabilities, and increases data comparability.
Normalization also eliminates differences in data dimensions and magnitudes, improves processing efficiency within the model, and yields a more uniform data distribution without prominent outliers. Therefore, min-max normalization is employed for preprocessing, with the calculation formula shown in Eq. (1):

x' = (x − x_min) / (x_max − x_min)  (1)

Here, x_min represents the minimum value in the dataset, x_max denotes the maximum value in the dataset, x is the raw measurement, and x' indicates the normalized value.
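Eq. (1) is a one-liner in practice; a minimal sketch (with a guard for a constant series, which Eq. (1) leaves undefined):

```python
# Min-max normalization from Eq. (1): x' = (x - x_min) / (x_max - x_min).

def min_max_normalize(values):
    lo, hi = min(values), max(values)
    span = hi - lo
    if span == 0:                       # guard: constant series, Eq. (1) undefined
        return [0.0 for _ in values]
    return [(v - lo) / span for v in values]

print(min_max_normalize([1200.0, 1500.0, 1800.0]))  # → [0.0, 0.5, 1.0]
```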
3 Methods and Results
3.1 Method
3.1.1 Random Forest
Random Forest is an ensemble learning method whose core principle is "collective decision-making." It randomly generates a large number of slightly different decision trees and aggregates their outputs: majority vote for classification, averaging for regression [40]. By leveraging the principle that "many hands make light work," this approach retains the advantages of decision trees—rapid training speed and high interpretability—while significantly mitigating their drawbacks: susceptibility to overfitting and unstable results. For regression, the model outputs the average of the individual trees:

h̄(x) = (1/K) Σ_{k=1}^{K} h(x, Θ_k)  (2)

In Eq. (2), h̄(x) denotes the model prediction result, h(x, Θ_k) represents the output of the k-th regression tree for the input x, Θ_k is an independent and identically distributed random vector governing that tree's construction, and K represents the number of regression decision trees [41].
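A minimal pure-Python sketch of Eq. (2) (illustrative only; the study's models were built with standard Python packages). Depth-1 "stumps" stand in for full regression trees, and each bootstrap resample plays the role of Θ_k:

```python
import random

# Bagging sketch of Eq. (2): average K stumps, each trained on a bootstrap
# resample (the random vector Theta_k). Illustrative; a real run would use
# a full random-forest implementation.

def fit_stump(xs, ys):
    """Fit a depth-1 regression tree: one threshold, two leaf means."""
    mean = sum(ys) / len(ys)
    best_t, best_sse, best_leaves = None, float("inf"), (mean, mean)
    for t in sorted(set(xs))[:-1]:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if sse < best_sse:
            best_t, best_sse, best_leaves = t, sse, (lm, rm)
    return best_t, best_leaves

def stump_predict(stump, x):
    t, (lm, rm) = stump
    return lm if (t is None or x <= t) else rm

def random_forest(xs, ys, k=50, seed=0):
    """Return a predictor averaging K bootstrap-trained stumps, as in Eq. (2)."""
    rng = random.Random(seed)
    n = len(xs)
    stumps = []
    for _ in range(k):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap resample
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(stump_predict(s, x) for s in stumps) / k

# Toy data: tilt angle (deg) vs. a synthetic, nonlinearly growing error (mm).
angles = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
errors = [0.3, 1.1, 2.5, 4.4, 6.9, 9.9, 13.5, 17.6, 22.3, 27.8]
forest = random_forest(angles, errors)
```

Because the forest averages leaf means, every prediction stays within the range of the training targets, which is one source of the stability Eq. (2) buys.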
3.1.2 GBDT
GBDT is an ensemble algorithm that combines multiple weak learners in a serial manner to iteratively form a strong learner. Its core principle is to repeatedly fit the residuals (negative gradients) of the loss function with decision trees, aggregating the outputs of all trees to obtain the final prediction [42, 43]. The decision trees (DT) used in gradient boosting are all classification and regression trees (CART) [44]. The GBDT model is constructed as follows:

F_M(x) = Σ_{m=1}^{M} ν f_m(x; θ_m)  (3)

In Eq. (3), F_M(x) is the final prediction function, ν is the learning rate (step size), and f_m(x; θ_m) is the m-th decision tree, whose parameters θ_m are trained by fitting the negative gradient of the loss function at the current model's predictions.
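Eq. (3) can be sketched in pure Python (again illustrative, not the study's implementation; depth-1 stumps serve as the weak learners, and with squared loss the negative gradient is simply the residual):

```python
# Gradient-boosting sketch of Eq. (3). With squared loss the negative
# gradient is the residual, so each stump is fitted to the current
# residuals and added with learning rate lr (the nu of Eq. (3)).

def fit_stump(xs, rs):
    """Depth-1 regression tree minimizing squared error on residuals rs."""
    mean = sum(rs) / len(rs)
    best_t, best_leaves = None, (mean, mean)
    best_sse = sum((r - mean) ** 2 for r in rs)
    for t in sorted(set(xs))[:-1]:
        left = [r for x, r in zip(xs, rs) if x <= t]
        right = [r for x, r in zip(xs, rs) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if sse < best_sse:
            best_t, best_sse, best_leaves = t, sse, (lm, rm)
    return best_t, best_leaves

def gbdt_fit(xs, ys, m=300, lr=0.1):
    base = sum(ys) / len(ys)                       # F_0: constant prediction
    stumps, preds = [], [base] * len(xs)
    for _ in range(m):
        residuals = [y - p for y, p in zip(ys, preds)]   # negative gradient
        t, (lm, rm) = fit_stump(xs, residuals)
        stumps.append((t, lm, rm))
        preds = [p + lr * (lm if (t is None or x <= t) else rm)
                 for x, p in zip(xs, preds)]
    def predict(x):                                # F_M(x) = sum of lr * f_m(x)
        out = base
        for t, lm, rm in stumps:
            out += lr * (lm if (t is None or x <= t) else rm)
        return out
    return predict

angles = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
errors = [0.3, 1.1, 2.5, 4.4, 6.9, 9.9, 13.5, 17.6, 22.3, 27.8]
predict = gbdt_fit(angles, errors)
```

After a few hundred boosting rounds the training residuals shrink toward zero, which is exactly the serial residual-fitting behavior Eq. (3) describes.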
3.1.3 XGBoost
Extreme Gradient Boosting (XGBoost) is an optimized distributed gradient boosting library designed for efficiency, flexibility, and portability; it is a scalable tree boosting system [45]. Conceptually, it builds upon GBDT with optimizations and extensions. Beyond fitting residuals, it incorporates second-order derivative information to more accurately approximate the objective function, thereby enhancing model fitting precision. The XGBoost objective is constructed as follows:

Obj = Σ_{i=1}^{n} l(y_i, ŷ_i) + Σ_{k=1}^{K} Ω(f_k),  Ω(f) = γT + (1/2)λ‖w‖²  (4)

In Eq. (4), the first term is the differentiable loss l summed over all n samples, and the second is the structural regularization term Ω over all K regression trees. T denotes the number of leaf nodes in a tree, γ is a complexity hyperparameter, w is the vector composed of all leaf weights in the tree, and λ is a user-defined non-negative regularization hyperparameter.
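The second-order machinery behind Eq. (4) reduces, per leaf, to a closed form: the optimal leaf weight is w* = −G/(H + λ), and a split's objective gain is ½[G_L²/(H_L+λ) + G_R²/(H_R+λ) − (G_L+G_R)²/(H_L+H_R+λ)] − γ, where G and H sum the per-sample first and second derivatives of the loss. A miniature sketch (ours, not the library itself):

```python
# Miniature of XGBoost's second-order leaf math under Eq. (4)'s notation.
# For squared loss at prediction p, g_i = p - y_i and h_i = 1.

def leaf_weight(grads, hess, lam=1.0):
    """Optimal leaf weight w* = -G / (H + lambda)."""
    G, H = sum(grads), sum(hess)
    return -G / (H + lam)

def split_gain(gl, hl, gr, hr, lam=1.0, gamma=0.0):
    """Objective reduction from splitting one leaf into left/right."""
    def score(g, h):
        G, H = sum(g), sum(h)
        return G * G / (H + lam)
    return 0.5 * (score(gl, hl) + score(gr, hr)
                  - score(gl + gr, hl + hr)) - gamma

# Squared loss at prediction 0: g_i = -y_i, h_i = 1.
y = [1.0, 1.0, 5.0, 5.0]
grads = [-v for v in y]
hess = [1.0] * len(y)
print(leaf_weight(grads, hess))                              # single-leaf weight
print(split_gain(grads[:2], hess[:2], grads[2:], hess[2:]))  # positive gain
```

Note how λ shrinks every leaf weight toward zero and γ taxes each extra leaf: this is the regularization the conclusion credits for XGBoost's resistance to overfitting.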
3.2 Result
3.2.1 Multi-Model Training Performance Comparison
The Random Forest, GBDT, and XGBoost models were implemented using Python packages. During modeling, several key parameters required definition, with their optimal values determined via Bayesian optimization [46]; the optimal values are shown in Table 4. The training error results for the three models are depicted in Fig. 3, and an evaluation of the training errors is presented in Table 5.
Table 4
Optimal Values of Algorithm Model Parameters
Algorithm Model
Primary Parameter Name
Optimal parameter values
Random Forest
Number of decision trees in the random forest
500
Minimum number of samples required for node splitting
10
Maximum depth per decision tree
5
Minimum number of samples
2
GBDT
Number of trees
1000
Learning rate
0.01
Maximum tree depth
5
XGBoost
Number of decision trees
2800
Learning rate
0.0029
L1 regularization weight
0.2
L2 regularization weight
1.0
Fig. 3
Multi-Model Training Data Error
Table 5
Multi-Model Training Error Evaluation Metrics
Algorithm Model
MAE
MSE
RMSE
Random Forest
1.833
6.031
2.456
GBDT
0.905
2.230
1.493
XGBoost
0.642
1.037
1.018
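The metrics in Table 5 follow their standard definitions; a minimal sketch with made-up readings purely for illustration:

```python
import math

# Standard definitions of the Table 5 metrics:
# MAE = mean |error|, MSE = mean error^2, RMSE = sqrt(MSE).

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))

# Made-up water-level readings (mm), not the study's data.
y_true = [1500.0, 1550.0, 1600.0]
y_pred = [1501.0, 1548.0, 1603.0]
print(mae(y_true, y_pred))   # → 2.0
print(rmse(y_true, y_pred))
```

MAE weights all residuals equally, while MSE and RMSE penalize large residuals quadratically, which is why Table 5's MSE gap between XGBoost and Random Forest is wider than the MAE gap.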
As shown in Fig. 3, Random Forest exhibits the highest residual volatility, with a maximum positive residual of 7.32 and relatively large negative residuals. GBDT demonstrates a slightly narrower residual range, featuring a maximum positive residual of 5.94 and a negative residual of 2.99. XGBoost exhibits the smallest residual range, with a maximum positive residual of only 3.12 and a negative residual of 3.37, demonstrating the lowest overall fluctuation. Combined with Table 5, XGBoost achieves overall superiority across all metrics, reducing MAE and RMSE by 29% and 32% respectively compared to GBDT, and reducing MAE and RMSE by 64.97% and 58.55% respectively compared to Random Forest. This indicates that the XGBoost algorithm model contains fewer outliers in its fitted values compared to the other two algorithms and demonstrates stronger control over large errors. Combined with the MSE metric, where XGBoost achieves relative reductions of 82.81% and 53.5% compared to Random Forest and GBDT respectively, it confirms that XGBoost delivers the best fitting performance among the three algorithms.
3.2.2 Multi-Model Field Test Comparison
The trained Random Forest, GBDT, and XGBoost models were evaluated on the test set. The mean values obtained from 1° to 10° at different heights are plotted in the figure below:
Fig. 4
Multi-Model Angular Error at Different Angles
In Fig. 4, the random forest model exhibits significant prediction instability and high sensitivity to outliers for extreme samples (residual peaks reaching − 2.1 and + 2.1). Its residual sequence shows wide-amplitude oscillations, with over 50% of residual absolute values exceeding 1.2. The GBDT model exhibits systematic positive bias: 80% of residuals are positive, forming multiple outlier peaks at positions like 2.4, 2.1, and 2.2, suggesting overfitting to noise during training. In contrast, XGBoost exhibited a more compact residual distribution, with 8 out of 10 test sets showing absolute residual values ≤ 1 and overall convergence in fluctuations. Its measured MAE ≈ 0.59, significantly lower than Random Forest (1.36) and GBDT (1.52). Notably, GBDT's observed MAE significantly exceeds its training-stage metric, further validating its overfitting tendency. Conversely, both Random Forest and XGBoost exhibit observed MAE values below their training errors, indicating superior generalization capabilities. Particularly noteworthy is XGBoost's residual distribution, which forms a narrow peak with near-zero mean and exhibits virtually no systematic bias, demonstrating excellent stability and robustness.
4 Conclusion
This study addresses the challenge of correcting measurement errors in radar water level gauges under complex field conditions by introducing machine learning algorithms to explore novel methods for enhancing radar water level measurement accuracy. The research demonstrates that Random Forest, GBDT, and XGBoost algorithms all exhibit certain effectiveness in correcting tilt in radar water level measurements. Among these, the XGBoost algorithm demonstrated optimal performance in data fitting, residual distribution symmetry, and normality due to its superior residual fitting capability and regularization techniques. It effectively corrected measurement errors, achieving up to 61.18% greater accuracy than the other two algorithms in field tests. Additionally, the XGBoost algorithm avoids overfitting issues, demonstrating superior generalization capabilities and stability. These findings offer new approaches for calibrating field-deployed radar water level gauges, potentially reducing reliance on specialized hydrologists, simplifying data processing workflows, and enhancing the accuracy and reliability of hydrological monitoring data.
In summary, machine learning-based inclination correction methods for millimeter-wave radar measurements offer significant advantages in enhancing water level measurement accuracy under complex field conditions. Among these, the XGBoost algorithm demonstrates superior performance with promising application potential and development prospects. Future research should focus on further optimizing algorithm parameters and expanding its application studies across different radar device types and more complex environments.
Author Contribution
Yu Wang: conceptualization, methodology.
Yang Wang: data curation, investigation, original draft preparation.
JunFu Zhang: supervision, validation, writing—review & editing.
Qing Tu: resources, formal analysis, writing—review & editing.
Funding
This work was supported by the Sichuan Provincial Engineering Technology Research Center for Modern Agricultural Equipment (Grant No. XDNY2021-005).
Data Availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
References
1.
Liu, Y., Wang, H., Feng, W. W. & Huang, H. C. Short Term Real-Time Rolling Forecast of Urban River Water Levels Based on LSTM: A Case Study in Fuzhou City, China. Int. J. Environ. Res. Public Health. 18 https://doi.org/10.3390/ijerph18179287 (2021).
2.
Rai, R. K., van den Homberg, M. J. C., Ghimire, G. P. & McQuistan, C. Cost-benefit analysis of flood early warning system in the Karnali River Basin of Nepal. Int. J. DISASTER RISK Reduct. 47 https://doi.org/10.1016/j.ijdrr.2020.101534 (2020).
3.
Saleh, F. et al. A multi-scale ensemble-based framework for forecasting compound coastal-riverine flooding: The Hackensack-Passaic watershed and Newark Bay. Adv. Water Resour. 110, 371–386. https://doi.org/10.1016/j.advwatres.2017.10.026 (2017).
4.
Hansen, L. S., Borup, M., Moller, A. & Mikkelsen, P. S. Flow Forecasting using Deterministic Updating of Water Levels in Distributed Hydrodynamic Urban Drainage Models. WATER 6, 2195–2211 (2014). https://doi.org/10.3390/w6082195
5.
Flack, D. L. A. et al. Recommendations for Improving Integration in National End-to-End Flood Forecasting Systems: An Overview of the FFIR (Flooding From Intense Rainfall) Programme. WATER 11 (2019). https://doi.org/10.3390/w11040725
6.
Jiang, Y., Liu, H., Li, W. H. & Tong, X. X. Application of Radar Flow Meters in Flood and Waterlogging Disaster Early Warning Monitoring in the Yongding River Basin [J]. Instrumentation and Analytical Monitoring. 1–5 (2025).
7.
Liu, X. X., Kang, Y. D., Li, J. N. & Jiao, Z. X. Analysis and numerical simulation of different flowmeters based on an open channel in an irrigation area. AQUA-WATER INFRASTRUCTURE Ecosyst. Soc. 73, 2079–2092. https://doi.org/10.2166/aqua.2024.206 (2024).
8.
Huang, Y. et al. Radar Technology for River Flow Monitoring: Assessment of the Current Status and Future Challenges. WATER 15 (2023). https://doi.org/10.3390/w15101904
9.
Fang, K. et al. Development of an easy-assembly and low-cost multismartphone photogrammetric monitoring system for rock slope hazards. Int. J. Rock Mech. Min. Sci. 174 https://doi.org/10.1016/j.ijrmms.2024.105655 (2024).
10.
Kang, S. & Sotiropoulos, F. Large-Eddy Simulation of Three-Dimensional Turbulent Free Surface Flow Past a Complex Stream Restoration Structure. J. Hydraul. Eng. 141 https://doi.org/10.1061/(ASCE)HY.1943-7900.0001034 (2015).
11.
Mai, Y. B., Yang, J. H. & Gao, W. Deformation behavior of hard-magnetic soft material beams under combined magnetic and mechanical forces. Arch. Appl. Mech. 95 https://doi.org/10.1007/s00419-025-02777-9 (2025).
12.
Qi, H. X., Jiang, J. Z. & Wang, C. H. Path planning for agricultural robots in wild livestock farm environments. Int. J. AGRICULTURAL Biol. Eng. 17, 207–216. https://doi.org/10.25165/j.ijabe.20241704.8632 (2024).
13.
Zhou, Y. et al. Towards Deep Radar Perception for Autonomous Driving: Datasets, Methods, and Challenges. SENSORS 22 (2022). https://doi.org/10.3390/s22114208
14.
Musch, T. A high precision 24-GHz FMCW radar based on a fractional-N ramp-PLL. IEEE Trans. Instrum. Meas. 52, 324–327. https://doi.org/10.1109/TIM.2003.810046 (2003).
15.
Carlà, T., Farina, P., Intrieri, E., Ketizmen, H. & Casagli, N. Integration of ground-based radar and satellite InSAR data for the analysis of an unexpected slope failure in an open-pit mine. Eng. Geol. 235, 39–52. https://doi.org/10.1016/j.enggeo.2018.01.021 (2018).
16.
Guarneri, H. et al. The impact of nonlinear tide-surge interaction on satellite radar altimeter-derived tides. Mar. Geodesy. 46, 251–270. https://doi.org/10.1080/01490419.2023.2175084 (2023).
17.
Anandan, V. K., Rao, I. S. & Reddy, P. N. A study on optimum tilt angle for wind estimation using Indian MST radar. J. Atmos. Ocean. Technol. 25, 1579–1589. https://doi.org/10.1175/2008JTECHA1030.1 (2008).
18.
Li, W. R., Slobbe, C. & Lhermitte, S. A leading-edge-based method for correction of slope-induced errors in ice-sheet heights derived from radar altimetry. CRYOSPHERE 16, 2225–2243. https://doi.org/10.5194/tc-16-2225-2022 (2022).
19.
Sun, Y. Research on Calibration of Level Gauges Based on Devices with Large Length Ratios [J]. Metrol. Sci. Technol. 69, 3–8 (2025).
20.
Zhao, S. J. & Shan, Y. L. Research on attitude correction algorithm for mobile wind lidars. Meas. Sci. Technol. 35 https://doi.org/10.1088/1361-6501/ad2150 (2024).
21.
Wang, M. R. et al. Analysis of Ocean Tide Loading Displacement Correction for Long-Strip InSAR Measurements in Coastal Areas of France. IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING 17, 6818–6824 (2024). https://doi.org/10.1109/JSTARS.2024.3374520
22.
Wang, Y. D., Zhang, J., Ryzhkov, A. V. & Tang, L. C-Band Polarimetric Radar QPE Based on Specific Differential Propagation Phase for Extreme Typhoon Rainfall. J. Atmos. Ocean. Technol. 30, 1354–1370. https://doi.org/10.1175/JTECH-D-12-00083.1 (2013).
23.
Lin, L. M., Bao, X. W., Zhang, S., Zhao, B. K. & Xia, W. Z. Correction to raindrop size distributions measured by PARSIVEL disdrometers in strong winds. Atmos. Res. 260 https://doi.org/10.1016/j.atmosres.2021.105728 (2021).
24.
Qian, R., Jiang, D. F. & Fu, W. FPGA implementation of closed-loop compensation for LFMCW signal non-linear distortions. IET Signal Proc. 13, 192–198. https://doi.org/10.1049/iet-spr.2018.5298 (2019).
25.
Choi, H. & Park, S. A. Survey of Machine Learning-Based System Performance Optimization Techniques. Appl. SCIENCES-BASEL. 11 https://doi.org/10.3390/app11073235 (2021).
26.
Zhu, Z. W. & Qi, Y. C. A Real-Time Bright Band Vertical Profile of Reflectivity Correction Using Multitilts of Reflectivity Data. IEEE Trans. Geosci. Remote Sens. 61 https://doi.org/10.1109/TGRS.2023.3319836 (2023).
27.
Cai, H. Q. et al. A Generalized Navigation Correction Method for Airborne Doppler Radar Data. J. Atmos. Ocean. Technol. 35, 1999–2017. https://doi.org/10.1175/JTECH-D-18-0028.1 (2018).
28.
Zhao, Y. H. et al. A Review of Macroscopic Carbon Emission Prediction Model Based on Machine Learning. SUSTAINABILITY 15 https://doi.org/10.3390/su15086876 (2023).
29.
An, Q., Rahman, S., Zhou, J. W. & Kang, J. J. A Comprehensive Review on Machine Learning in Healthcare Industry: Classification, Restrictions, Opportunities and Challenges. SENSORS 23 (2023). https://doi.org/10.3390/s23094178
30.
Eyring, V. et al. Pushing the frontiers in climate modelling and analysis with machine learning. Nat. Clim. CHANGE. https://doi.org/10.1038/s41558-024-02095-y (2024).
31.
Daidone, M., Ferrantelli, S. & Tuttolomondo, A. Machine learning applications in stroke medicine: advancements, challenges, and future prospectives. NEURAL REGENERATION Res. 19, 769–773. https://doi.org/10.4103/1673-5374.382228 (2024).
32.
Liu, S. L., Wang, L. Q., Zhang, W. A., He, Y. W. & Pijush, S. A comprehensive review of machine learning-based methods in landslide susceptibility mapping. Geol. J. 58, 2283–2301. https://doi.org/10.1002/gj.4666 (2023).
33.
Ponce-Bobadilla, A. V., Schmitt, V., Maier, C. S., Mensing, S. & Stodtmann, S. Practical guide to SHAP analysis: Explaining supervised machine learning model predictions in drug development. CTS-CLINICAL TRANSLATIONAL Sci. 17 https://doi.org/10.1111/cts.70056 (2024).
34.
Sun, W. et al. Machine learning-assisted rapid determination for traditional Chinese Medicine Constitution. Chin. Med. 19 https://doi.org/10.1186/s13020-024-00992-0 (2024).
35.
Opu, M. N. I., Islam, M. R., Kabir, M. A., Hossain, M. S. & Islam, M. M. Learn2Write: Augmented Reality and Machine Learning-Based Mobile App to Learn Writing. COMPUTERS 11 (2022). https://doi.org/10.3390/computers11010004
36.
Lee, J., Jang, H., Ha, S. & Yoon, Y. Android Malware Detection Using Machine Learning with Feature Selection Based on the Genetic Algorithm. MATHEMATICS 9 (2021). https://doi.org/10.3390/math9212813
37.
Rhodes, J. S., Cutler, A. & Moon, K. R. Geometry- and Accuracy-Preserving Random Forest Proximities. IEEE Trans. Pattern Anal. Mach. Intell. 45, 10947–10959. https://doi.org/10.1109/TPAMI.2023.3263774 (2023).
38.
Dong, M. Q. et al. Gradient Boosted Neural Decision Forest. IEEE Trans. Serv. Comput. 16, 330–342. https://doi.org/10.1109/TSC.2021.3133673 (2023).
39.
Zhu, H. L., Liu, H. Z., Zhou, Q. M. & Cui, A. H. A XGBoost-Based Downscaling-Calibration Scheme for Extreme Precipitation Events. IEEE Trans. Geosci. Remote Sens. 61 https://doi.org/10.1109/TGRS.2023.3294266 (2023).
40.
Zhang, Y. & Cao, J. Decision Tree Algorithms for Big Data Analysis [J] Computer Science. 43, 374–379 + 383 (2016).
41.
Guan, C. et al. Estimating Chlorophyll Content in Spartina alterniflora Leaves Using a Combination of Continuous Wavelet Analysis and Random Forest Algorithm [J] Spectroscopy and Spectral Analysis. 44, 2993–3000 (2024).
42.
Zhang, Z. D. & Jung, C. GBDT-MO: Gradient-Boosted Decision Trees for Multiple Outputs. IEEE Trans. Neural Networks Learn. Syst. 32, 3156–3167. https://doi.org/10.1109/TNNLS.2020.3009776 (2021).
43.
Wang, J., Ma, M. & He, Y. Research on Settlement Classification Methods Based on the Gradient Boosting Decision Tree Algorithm: A Case Study of Jinniu Town, Lujiang County [J]. Urban Architecture. 22, 81–84 (2025). https://doi.org/10.19892/j.cnki.csjz.2025.08.21
44.
Wang, H. et al. A Heart Disease Prediction Method Based on Gradient Boosting Trees Optimized with Bayesian Hyperparameters [J]. J. Jilin Univ. (Science Edition). 63, 472–478. https://doi.org/10.13413/j.cnki.jdxblxb.2024223 (2025).
45.
Chen, T. & Guestrin, C. XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 785–794 (2016). https://doi.org/10.1145/2939672.2939785
46.
Gisperg, F. et al. Bayesian Optimization in Bioprocess Engineering-Where Do We Stand Today? Biotechnol. Bioeng. 122, 1313–1325. https://doi.org/10.1002/bit.28960 (2025).
Acknowledgement: This study was supported by the Sichuan Provincial Engineering Technology Research Center for Modern Agricultural Equipment (Grant No. XDNY2021-005).