Merge pull request #27 from PUTvision/corn_crop_damages_paper
News added about corn crop damages
Showing 3 changed files with 22 additions and 0 deletions.
_posts/2024/09/2024-09-22-personal-temperature-measurement-publication.md
---
layout: single
title: "Estimation of corn crop damage caused by wildlife in UAV images"
author: [aszkowski-przemyslaw, kraft-marek, pieczynski-dominik]
modified: 2024-09-22
tags: [computer vision, agriculture, segmentation, UAV]
category: [publication]
teaser: "/assets/images/posts/2024/09/corn_field_thumb.webp"
---
<BR>

<p align="center">
    <img src="/assets/images/posts/2024/09/corn_fields.webp" height="300px" />
</p>

We are happy to announce our recent publication on estimating corn crop damage from UAV images!

## Abstract:

This paper proposes a low-cost, low-effort solution for determining the area of corn crops damaged by wildlife, utilising field images collected by an unmanned aerial vehicle (UAV). The proposed solution determines the percentage of damaged crops and their location. The method uses models based on deep convolutional neural networks (e.g., the UNet family) and transformers (SegFormer), trained on over 300 hectares of diverse corn fields in western Poland. A range of neural network architectures was tested to select the most accurate final solution. The tests show that despite using only easily accessible RGB data from inexpensive, consumer-grade UAVs, the method achieves sufficient accuracy for practical agriculture-related tasks, as the IoU (Intersection over Union) metric for segmentation of healthy and damaged crops reaches 0.88.

You can find more about this research in our paper in the Precision Agriculture journal: [http://dx.doi.org/10.1007/s11119-024-10180-7](http://dx.doi.org/10.1007/s11119-024-10180-7)
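
For readers unfamiliar with the IoU metric quoted in the abstract, the sketch below shows how it is computed for a pair of binary segmentation masks. This is a generic illustration with toy data, not the authors' implementation or evaluation code:

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union for a pair of binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    # By convention, two empty masks are a perfect match.
    return float(intersection / union) if union else 1.0

# Toy 4x4 masks: predicted damaged-crop region vs. ground truth.
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
gt = np.array([[1, 1, 1, 0],
               [1, 1, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])

print(iou(pred, gt))  # intersection = 4, union = 5 -> 0.8
```

In a multi-class setting like the paper's (healthy vs. damaged crop), IoU is typically computed per class in the same way and then averaged.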