Predictive Analytics of Potato Varieties
Abstract: We explore the application of machine learning to the selection of Russet potato clones in breeding trials by predicting their suitability for advancement. The study addresses the challenge of efficiently identifying high-yield, disease-resistant, and climate-resilient potato varieties that meet processing-industry standards. Using manually collected trial data from Oregon, we evaluate a wide range of state-of-the-art binary classification models. The dataset comprises 1086 clones, each described by 38 attributes covering yield, size, appearance, and frying characteristics, with several control varieties planted consistently across four Oregon regions from 2013 to 2021. We conduct a comprehensive analysis of the dataset, including preprocessing, feature engineering, and imputation to address missing values, and evaluate models on several key metrics: accuracy, F1-score, and the Matthews correlation coefficient (MCC). The top-performing models, namely a neural network classifier (Neural Net), a histogram-based gradient boosting classifier (HGBC), and a support vector machine classifier (SVM), demonstrate consistent and significant results. To further validate our findings, we conduct a simulation study: by simulating different data-generating scenarios, we assess model robustness and performance through the distributions of true positives, true negatives, false positives, and false negatives, the area under the receiver operating characteristic curve (AUC-ROC), and MCC. The simulation results show that non-linear models such as SVM and HGBC consistently achieve higher AUC-ROC and MCC than logistic regression (LR), outperforming the traditional linear model across the scenarios considered and underscoring the importance of model selection and tuning in agricultural trials.
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. 
Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. 
[2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. 
In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004)
- Breiman, L.: Random forests. Machine learning 45, 5–32 (2001) Colantonio et al. [2022] Colantonio, V., Ferrão, L.F.V., Tieman, D.M., Bliznyuk, N., Sims, C., Klee, H.J., Munoz, P., Resende, M.F.R.: Metabolomic selection for enhanced fruit flavor. Proceedings of the National Academy of Sciences 119(7), 2115865119 (2022) https://doi.org/10.1073/pnas.2115865119 Chicco [2017] Chicco, D.: Ten quick tips for machine learning in Computational Biology. BioData mining 10(1), 35 (2017) Camargo and Smith [2009] Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. 
[2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. 
CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Colantonio, V., Ferrão, L.F.V., Tieman, D.M., Bliznyuk, N., Sims, C., Klee, H.J., Munoz, P., Resende, M.F.R.: Metabolomic selection for enhanced fruit flavor. Proceedings of the National Academy of Sciences 119(7), 2115865119 (2022) https://doi.org/10.1073/pnas.2115865119 Chicco [2017] Chicco, D.: Ten quick tips for machine learning in Computational Biology. BioData mining 10(1), 35 (2017) Camargo and Smith [2009] Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. 
[2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Chicco, D.: Ten quick tips for machine learning in Computational Biology. BioData mining 10(1), 35 (2017) Camargo and Smith [2009] Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. 
International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT Press, Cambridge, MA (2001)
Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: kNN: k-nearest neighbors. In: The Top Ten Algorithms in Data Mining, pp. 151–162 (2009)
Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1
Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature selection for classification: A review. In: Data Classification: Algorithms and Applications, vol. 56, pp. 37–64. CRC Press, Boca Raton, FL (2014)
Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in R. Journal of Statistical Software 45, 1–67 (2011)
Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning, vol. 2. MIT Press, Cambridge, MA (2006)
Zhang [2004] Zhang, H.: The optimality of naive Bayes. Aa 1(2), 3 (2004)
- Colantonio, V., Ferrão, L.F.V., Tieman, D.M., Bliznyuk, N., Sims, C., Klee, H.J., Munoz, P., Resende, M.F.R.: Metabolomic selection for enhanced fruit flavor. Proceedings of the National Academy of Sciences 119(7), 2115865119 (2022) https://doi.org/10.1073/pnas.2115865119 Chicco [2017] Chicco, D.: Ten quick tips for machine learning in Computational Biology. BioData mining 10(1), 35 (2017) Camargo and Smith [2009] Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. 
[2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. 
CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Chicco, D.: Ten quick tips for machine learning in Computational Biology. BioData mining 10(1), 35 (2017) Camargo and Smith [2009] Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 
- Chicco, D.: Ten quick tips for machine learning in Computational Biology. BioData mining 10(1), 35 (2017) Camargo and Smith [2009] Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. 
Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. 
MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Camargo, A., Smith, J.: An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosystems engineering 102(1), 9–21 (2009) Dwivedi et al. [2021] Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. 
Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. 
MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, H., Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghavan, V., Raman, R., Rana, N.P., Samothrakis, S., Spencer, J., Tamilmani, K., Tubadji, A., Walton, P., Williams, M.D.: Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management 57, 101994 (2021) https://doi.org/10.1016/j.ijinfomgt.2019.08.002 Hinton [1990] Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. 
Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. 
[2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. 
[2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. 
CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. 
Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. 
[2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS'17, vol. 30, pp. 3149–3157 (2017)
Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (Solanum tuberosum L.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023)
McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and Forest Meteorology 87(4), 291–300 (1997)
Ozguven et al. Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations in Agriculture and Current Research 6, 770–776
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning, vol. 1. Springer, New York (2006)
Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, vol. 4. MIT Press, Cambridge, MA (2001)
Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: kNN: k-nearest neighbors. The Top Ten Algorithms in Data Mining, 151–162 (2009)
Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1
Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications, vol. 56, pp. 37–64. CRC Press, Boca Raton, FL (2014)
Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in R. Journal of Statistical Software 45, 1–67 (2011)
Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning, vol. 2. MIT Press, Cambridge, MA (2006)
Zhang [2004] Zhang, H.: The optimality of Naive Bayes. AA 1(2), 3 (2004)
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. 
[2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. 
[2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. 
CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. 
Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. 
MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. 
[2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. 
In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004)
- Hinton, G.I.: Connectionist learning procedures, pp. 555–610. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (1990) Hastie et al. [2009] Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Hastie, T., Tibshirani, R., Friedman, J.H.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd edn. Springer, New York (2009) Ke et al. [2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. NIPS’17, vol. 30, pp. 3149–3157 (2017) Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (solanum tuberosum l.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023) McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. 
[2017] Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: A highly efficient gradient boosting decision tree. In: NIPS'17, vol. 30, pp. 3149–3157 (2017)
Kurek et al. [2023] Kurek, J., Niedbała, G., Wojciechowski, T., Świderski, B., Antoniuk, I., Piekutowska, M., Kruk, M., Bobran, K.: Prediction of potato (Solanum tuberosum L.) yield based on machine learning methods. Agriculture 13(12), 2259 (2023)
McMaster and Wilhelm [1997] McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and Forest Meteorology 87(4), 291–300 (1997)
[13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations in Agriculture and Current Research 6, 770–776
Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011)
Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006)
Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT Press, Cambridge, MA (2001)
Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: kNN: k-nearest neighbors. In: The Top Ten Algorithms in Data Mining, pp. 151–162 (2009)
Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1
Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC Press, Boca Raton, FL (2014)
Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in R. Journal of Statistical Software 45, 1–67 (2011)
Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT Press, Cambridge, MA (2006)
Zhang [2004] Zhang, H.: The optimality of naive Bayes. Aa 1(2), 3 (2004)
- McMaster, G.S., Wilhelm, W.: Growing degree-days: one equation, two interpretations. Agricultural and forest meteorology 87(4), 291–300 (1997) [13] Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. 
Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. 
CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. 
Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. 
[2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. 
MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. 
Aa 1(2), 3 (2004) Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004)
- Ozguven, M.M., Yilmaz, G., Adem, K., Kozkurt, C.: Use of Support Vector Machines and Artificial Neural Network Methods in Variety Improvement Studies: Potato Example. Current Investigations In Agriculture and Current Research 6, 770–776 Pedregosa et al. [2011] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. 
Aa 1(2), 3 (2004) Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. 
Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. 
[2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). 
Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004)
- Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 12, 2825–2830 (2011) Rasmussen et al. [2006] Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Rasmussen, C.E., Williams, C.K., et al.: Gaussian Processes for Machine Learning vol. 1. 
Springer, New York (2006) Schölkopf and Smola [2001] Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT, Cambridge, MA (2001) Steinbach and Tan [2009] Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. 
[2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Steinbach, M., Tan, P.-N.: knn: k-nearest neighbors. The top ten algorithms in data mining, 151–162 (2009) Selvaraj et al. [2020] Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). 
Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1 Tang et al. [2014] Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC press, Boca Raton, FL (2014) Van Buuren and Groothuis-Oudshoorn [2011] Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in r. Journal of statistical software 45, 1–67 (2011) Williams and Rasmussen [2006] Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT press, Cambridge, MA (2006) Zhang [2004] Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004) Zhang, H.: The optimality of naive bayes. Aa 1(2), 3 (2004)
- Rasmussen, C.E., Williams, C.K.: Gaussian Processes for Machine Learning vol. 1. Springer, New York (2006)
- Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond vol. 4. MIT Press, Cambridge, MA (2001)
- Steinbach, M., Tan, P.-N.: kNN: k-nearest neighbors. The Top Ten Algorithms in Data Mining, 151–162 (2009)
- Selvaraj, M.G., Valderrama, M., Guzman, D., Valencia, M., Ruiz, H., Acharjee, A.: Machine learning for high-throughput field phenotyping and image processing provides insight into the association of above and below-ground traits in cassava (Manihot esculenta Crantz). Plant Methods 16(87), 19 (2020) https://doi.org/10.1186/s13007-020-00625-1
- Tang, J., Alelyani, S., Liu, H.: Feature Selection for Classification: A Review. In: Data Classification: Algorithms and Applications vol. 56, pp. 37–64. CRC Press, Boca Raton, FL (2014)
- Van Buuren, S., Groothuis-Oudshoorn, K.: mice: Multivariate imputation by chained equations in R. Journal of Statistical Software 45, 1–67 (2011)
- Williams, C.K., Rasmussen, C.E.: Gaussian Processes for Machine Learning vol. 2. MIT Press, Cambridge, MA (2006)
- Zhang, H.: The optimality of Naive Bayes. Aa 1(2), 3 (2004)