WHDL - 00013551
Submitted to the Department of Mathematics and Computer Science in partial fulfillment of the requirements for the degree of Bachelor of Arts
The use of small Unmanned Aircraft Systems (sUAS) for obtaining wildland imagery has enabled the production of more accurate data regarding the effects of fire on forested land. This increase in precision enables accurate detection of trees from hyperspatial imagery, and thus the calculation of canopy cover. When pre-fire data is compared with post-fire data from existing canopy cover products such as the LANDFIRE project, a measure of tree mortality, and thus of burn severity, can be calculated from the difference between the two. A mask region-based convolutional neural network (MR-CNN) was trained to classify trees as groups of pixels from a hyperspatial orthomosaic. The tree classification is summarized at 30 m resolution in a canopy cover raster. Canopy reduction allows the mapping of burn severity while also identifying where surface, passive, and active crown fire occurred within the observed region. The calculated canopy cover derived from hyperspatial imagery was found to be significantly more accurate than LANDFIRE canopy cover in all observed regions. The hyperspatially derived canopy cover reflected visual observations from the respective sUAS imagery and ground truthing.
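The abstract describes two computational steps: aggregating the per-pixel MR-CNN tree classification into a 30 m canopy cover raster, and differencing pre- and post-fire canopy cover to map burn severity. The sketch below illustrates both steps with NumPy; the function names, the 0.05 m ground sample distance, and the canopy-reduction thresholds separating surface, passive crown, and active crown fire are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np

# Illustrative parameters only; the actual ground sample distance and
# severity thresholds depend on the sUAS imagery and the study's methodology.
TREE_MASK_RES_M = 0.05   # assumed hyperspatial pixel size (m)
CANOPY_RES_M = 30.0      # target cell size matching LANDFIRE canopy cover
BLOCK = int(CANOPY_RES_M / TREE_MASK_RES_M)  # mask pixels per 30 m cell side


def canopy_cover_30m(tree_mask: np.ndarray) -> np.ndarray:
    """Aggregate a binary tree mask (1 = tree pixel from the MR-CNN output)
    into a 30 m canopy cover raster expressed as a fraction in [0, 1]."""
    rows, cols = tree_mask.shape
    # Trim to a whole number of 30 m blocks, then average each block.
    rows_t, cols_t = rows - rows % BLOCK, cols - cols % BLOCK
    trimmed = tree_mask[:rows_t, :cols_t].astype(float)
    blocks = trimmed.reshape(rows_t // BLOCK, BLOCK, cols_t // BLOCK, BLOCK)
    return blocks.mean(axis=(1, 3))


def burn_severity(pre_cover: np.ndarray, post_cover: np.ndarray) -> np.ndarray:
    """Classify fire behaviour per 30 m cell from canopy reduction.

    Classes (hypothetical thresholds): 0 = unburned or surface fire,
    1 = passive crown fire, 2 = active crown fire."""
    reduction = np.clip(pre_cover - post_cover, 0.0, 1.0)
    severity = np.zeros_like(reduction, dtype=np.uint8)
    severity[reduction >= 0.25] = 1   # partial canopy loss
    severity[reduction >= 0.75] = 2   # near-total canopy loss
    return severity


# Toy example: 1200 x 1200 mask pixels at 0.05 m covers a 60 m x 60 m area,
# yielding a 2 x 2 grid of 30 m canopy cover cells.
pre_mask = (np.random.rand(1200, 1200) > 0.4).astype(np.uint8)
post_mask = (pre_mask & (np.random.rand(1200, 1200) > 0.5)).astype(np.uint8)
severity = burn_severity(canopy_cover_30m(pre_mask), canopy_cover_30m(post_mask))
```

Aggregating the hyperspatial tree mask to 30 m before differencing is what makes the result directly comparable, cell for cell, with the LANDFIRE canopy cover product mentioned in the abstract.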