At Michurinsk State Agrarian University, under the leadership of Rector Vadim Babushkin, the successful collaboration begun in 2017 continues within the framework of the National Technology Initiative (NTI) Competence Center in the field of Wireless Communications and the Internet of Things.
Michurinsk scientists, together with representatives of universities, research centers, and industrial partners, are part of a consortium organized within the NTI Competence Center for Technologies of Wireless Communication and the Internet of Things. The research team of Michurinsk State Agrarian University participates in the creation of breakthrough technologies for the production, harvest assessment, collection, and storage of agricultural products. This work includes the development of an algorithm for assessing the quality of crown formation in fruit trees, research on using the NDVI index to predict yield, and the development and practical implementation of an integrated system of sensors, IT, and precision farming technologies at the Smart Garden test site.
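The NDVI mentioned above is a standard vegetation index computed from the red and near-infrared bands of a multispectral image. A minimal sketch of the calculation (the reflectance values below are illustrative, not field data from the Smart Garden site):

```python
# NDVI = (NIR - Red) / (NIR + Red), where NIR and Red are reflectance
# values from the near-infrared and red bands of a multispectral image.
# Sample inputs are illustrative only.

def ndvi(nir: float, red: float) -> float:
    """Return NDVI in [-1, 1]; higher values indicate denser, healthier vegetation."""
    return (nir - red) / (nir + red)

# A healthy canopy reflects strongly in NIR and absorbs red light:
print(round(ndvi(nir=0.50, red=0.08), 2))  # prints 0.72

# Bare soil reflects both bands similarly, so NDVI stays near zero:
print(round(ndvi(nir=0.25, red=0.20), 2))  # prints 0.11
```

In yield-prediction work, such per-pixel values are typically aggregated over an orchard block and correlated with historical harvest data.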
As part of one of the projects, on IT diagnostics of the condition of fruit crops, a photo database of the main diseases of fruit crops has been compiled. Various diseases of fruit trees were monitored over an extended period using photographic recording, and more than 3,000 images were collected. The resulting database will help agricultural producers use mobile devices to diagnose the current condition of fruit trees and receive IT-based recommendations for effective gardening: an assessment of the tree's condition, identification of diseases, and a scheme for their treatment.
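The diagnosis workflow described above, classifying a photo and returning a treatment scheme, might be sketched as follows. The disease names, treatment texts, and the `classify` function are hypothetical placeholders, not the university's actual database or model:

```python
# Hypothetical sketch of a gadget-based diagnosis workflow.
# Labels and treatment schemes below are illustrative placeholders.

TREATMENT_SCHEMES = {
    "apple_scab": "Remove fallen leaves; apply an approved fungicide per label.",
    "powdery_mildew": "Prune affected shoots; treat with a sulfur-based preparation.",
}

def diagnose(image, classify):
    """Classify a tree photo and look up a treatment scheme for the predicted disease."""
    label = classify(image)  # e.g. a model trained on a large image database
    scheme = TREATMENT_SCHEMES.get(
        label, "No treatment scheme on record; consult an agronomist."
    )
    return label, scheme

# Usage with a stub classifier standing in for a trained model:
stub = lambda img: "apple_scab"
label, advice = diagnose("tree_photo.jpg", stub)
print(label, "->", advice)
```

The design separates the classifier from the treatment lookup, so the recommendation database can be updated without retraining the model.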
“Agriculture is subject to risks that are often unpredictable and difficult to manage. To some extent, this is a limiting factor in the development of agriculture. The Internet of Things (IoT) is designed to reverse the situation, making the full cycle of crop production or livestock breeding controllable by means of smart devices,” comments Ivan Krivolapov, Ph.D.