Article

Experience Gained When Using the Yuneec E10T Thermal Camera in Environmental Research

by Adam Młynarczyk 1,*, Sławomir Królewicz 1, Monika Konatowska 1,2 and Grzegorz Jankowiak 1

1 Faculty of Geographical and Geological Sciences, Adam Mickiewicz University in Poznań, Bogumiła Krygowskiego 10, 61-680 Poznań, Poland
2 Department of Forest Sites and Ecology, Faculty of Forestry and Wood Technology, Poznań University of Life Sciences, Wojska Polskiego 71D, 60-625 Poznań, Poland
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(11), 2633; https://doi.org/10.3390/rs14112633
Submission received: 6 May 2022 / Revised: 27 May 2022 / Accepted: 29 May 2022 / Published: 31 May 2022
(This article belongs to the Special Issue Recent Advances in GIS Techniques for Remote Sensing)

Abstract

Thermal imaging is an important source of information for geographic information systems (GIS) in various aspects of environmental research. This work summarizes experience gained using the Yuneec E10T thermal imaging camera, with a 320 × 256 pixel matrix and a 4.3 mm focal length, dedicated to the Yuneec H520 UAV, in obtaining data on the natural environment. Unfortunately, as a commercial product, the camera is sold without radiometric characteristics. Using the heated bed of the Omni3d Factory 1.0 printer, radiometric calibration was performed in the range of 18–100 °C (the high-sensitivity, high-gain setting of the camera). The stability of the thermal camera operation was assessed using several large sets of photos acquired over three areas in the form of aerial blocks composed of parallel rows with a specific sidelap and longitudinal coverage. For these image sets, statistical parameters of the thermal images, such as the mean, minimum and maximum, were calculated and then analyzed according to the order of registration. Photos taken every 10 m in vertical profiles up to 120 m above ground level (AGL) were also analyzed to show the changes in the image temperature established within a reference surface. Using the established radiometric calibration, it was found that the camera maintains linearity between the observed temperature and the measured brightness temperature expressed as a digital number (DN). It was also found that the camera is sometimes unstable after being turned on, which indicates the need to let the device adjust to external conditions for several minutes or to take photos over an area larger than the region of interest.

1. Introduction

Remote sensing in the thermal range is based on the measurement of electromagnetic radiation emitted from the surface of an object. All objects with a temperature above absolute zero emit radiation, and the amount of energy returned is a function of the emissivity and temperature of the object’s surface [1]. Thermal imaging with a digital camera is a complex task and should be performed by a well-trained operator [2]. The quantitative interpretation of the results, including the estimation of the surface temperature of the object, may be difficult due to the many factors influencing the registration, such as air temperature and humidity, wind speed, distance from objects, recording time and sensor characteristics [3,4,5]. In practice, distance as an image recording parameter varies from a few millimeters or centimeters to hundreds of kilometers (thermal sensors on satellite platforms, such as Sentinel-3, the Landsat series or Terra/ASTER).
The emission of electromagnetic radiation is related to a wide range of wavelengths of 3–35 μm. In practice, the range of 8–14 μm is most often used in sensors mounted on UAVs or in ground measurements [6]. There are also sensors recording radiation in the shorter range of 3–5 µm. Due to the absorption of radiation by stratospheric ozone between 9 and 10 μm, satellite sensors operate in narrower ranges (e.g., Landsat-8 channel 10: 10.60–11.19 μm and channel 11: 11.50–12.51 μm). Thermal sensors have much lower matrix resolutions than cameras operating in the optical range. Typical high-end thermal imaging cameras produce images with a resolution of 640 × 512 pixels and a refresh rate of 9–15 Hz [7,8]. A single thermal sensor model can be fitted with different focal lengths and sold as separate camera models. The resolution of the thermal sensor is strongly related to its price: the larger the matrix, the higher the price of the device.
An important element in expanding the application of thermal sensors is the possibility of suspending them under small UAVs and recording images from a low height above the ground, which complements other ways of obtaining data, such as contact or close-range measurement and registration from the aerial and satellite levels [8,9,10,11]. Combining thermal sensors with UAVs shortens the time between the need for imagery and the moment of taking the pictures. A certain barrier to the common use of thermal sensors is their high price, which, when using a UAV, may even be several times higher than the cost of the flying platform itself. Therefore, several studies have explored low-cost solutions [10,12,13,14].
Thermal cameras, which for a long time were used almost exclusively in military applications, have begun to be widely used in many industrial fields [6]. During the COVID-19 epidemic, one of the most common applications has been the use of thermal cameras to assess the temperature of people entering public facilities [15]. In archaeology, thermography can be used to identify elements hidden below the ground level of old buildings [16], to identify cropmarks [17,18], to visualize the course of old roads [19] or to analyze the state of archaeological structures under the influence of meteorological conditions [20]. One of the more important applications of thermal imaging is its use by emergency services to search for missing persons in difficult-to-reach areas [21,22]. For the protection of strategically important industrial and military facilities, thermal cameras are used to detect UAVs [23]. Thermal imaging is also used in the protection of and research into the natural environment. These issues include monitoring of spontaneous fires in dry areas [24,25,26], identifying places where warmer waters, including sewage, enter rivers and lakes [27], monitoring rainwater runoff in the city [28], estimating heat emissions in the area of geothermal lakes [11], imaging the thermal variability of lake surfaces [29,30], determining the thermal diversity of the upper forest area [31] or monitoring the life of wild animals [32,33]. Thermal imaging is widely used in construction to study the thermal integrity of buildings, which is important from the point of view of climate neutrality [13,34,35,36]. In urban areas, thermal imaging is used to detect and track pedestrians in terms of their behavior [37].
The numerous studies conducted with the use of thermal sensors to support sustainable agricultural production or precision farming methods are particularly noteworthy [9,38,39]. Thermal imaging in agriculture is very often associated with the use of UAVs [40]. One of the most important applications is the mapping and assessment of water resources available to plants [3], which is particularly related to the detection of water stress in plants [41,42,43,44] and the monitoring and optimization of irrigation systems [45,46,47]. An important branch of agriculture in which thermal imaging is used is viticulture [48,49,50]. Thermal imaging is used to address many other issues, including assessing damage to cereal crops [51] and soil salinity in terms of its impact on crop growth [52]. In the processing of agricultural products, thermography is often used to control the quality of food products [53].
For various reasons, thermal images are often taken at night or in the early morning, in the absence of sunlight, which makes RGB photos useless. In order to process thermal images photogrammetrically, they must contain enough detail to connect the images with each other. With a low resolution of the thermal sensor matrix, capturing images with an appropriate level of detail is possible at suitable flight altitudes (an appropriate distance from the object) when using a large field of view. The authors’ experience so far with taking pictures with the E10T camera (matrix 320 × 256, focal length f = 4.3 mm) over various natural areas indicates that the appropriate flight altitude is approximately 100 m above ground level. The sharpness of the thermal image is also influenced by the current thermal contrast of the objects. Too low a contrast makes it impossible to identify details in the photos, which is an obstacle when assembling a uniform orthophotomap. To reduce the impact of wind, which lowers the target temperature, and of humid air, which acts as a shield against infrared radiation, photos should be taken under a cloudless sky and in calm weather conditions [44]. Obtaining good-quality thermal images is also difficult in the presence of fog or atmospheric deposits on the objects (frost and rime). The legibility of thermal images is also influenced by the speed of the aircraft in relation to the image recording rate. The lower the speed, the better, but the selection of the flight speed also depends on other factors, such as the size of the area and the number of batteries (the total flight time that can be achieved with them). During an aerial survey, the internal correction of the thermal sensor cannot be relied upon because of the different emissivities of the photographed objects and the variability of wind conditions during shooting. Figure 1 presents examples of thermal images and the corresponding photos in the visible range (RGB) of natural objects, taken with the E10T camera, vertically and horizontally. The photos show fields, meadows, river backwaters, and forests at night and during the day. The most interesting photo is the river bank, where the thermal image shows the reflection of the shore on the water surface in a way very similar to images in the visible spectrum.
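To make the altitude recommendation concrete, the ground footprint and ground sampling distance (GSD) of the thermal sensor can be estimated from the field of view and matrix size listed in Table 1. The short sketch below is only an illustration of this geometry (it assumes a nadir-pointing camera, flat terrain and that the 34° value is the horizontal field of view); it is not part of the authors’ workflow.

```python
import math

# Rough ground footprint of the E10T thermal sensor, assuming the 34 deg
# value from Table 1 is the horizontal field of view of the 320 x 256 matrix.
HFOV_DEG = 34.0
MATRIX = (320, 256)  # (width, height) in pixels

def footprint(altitude_agl_m: float):
    """Return (GSD in m/pixel, footprint width in m, footprint height in m)."""
    width = 2.0 * altitude_agl_m * math.tan(math.radians(HFOV_DEG / 2.0))
    gsd = width / MATRIX[0]
    return gsd, width, gsd * MATRIX[1]

for h in (70, 100, 120):
    gsd, w, ht = footprint(h)
    print(f"{h:3d} m AGL: GSD ~ {gsd:.2f} m/px, footprint ~ {w:.0f} m x {ht:.0f} m")
```

Under these assumptions, a flight at 100 m AGL gives a GSD of roughly 0.19 m per pixel and a footprint of about 61 × 49 m per frame, which illustrates why both the flight altitude and the image overlap must be planned around the low sensor resolution.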
For a general review of thermal sensors adapted for carriage by UAVs, see [49]. Another review of modern thermal sensor technology in terms of autonomous aerial navigation is presented by Nguyen [54]. Thermal sensors are constructed as standalone cameras or in combination with sensors operating in other spectral ranges. Examples of such devices are the Altum multispectral camera by MicaSense and the E10 and E20 series cameras dedicated to Yuneec H520 (H520E) drones. Using a combination of different image sensors is beneficial because it provides simultaneous data registration in different ranges of electromagnetic radiation and better integration of various data, and it extends the possibilities of classifying objects and surfaces and determining their state [17,18,41,48,55]. Thermal data can complement multispectral information sets, which are classified using various methods [42,46], including object-based classification methods [47] or methods based on machine learning algorithms and artificial intelligence [34].
One of the few thermal sensors available for the professional Yuneec H520 UAV is the E10T camera. The E10T camera was developed in cooperation with FLIR, one of the most recognizable brands on the thermal sensor market. The camera consists of a thermal sensor and an RGB sensor. The camera is available in various versions, differing in the field of view of the thermal sensor and the resolution of the thermal matrix (320 × 256 and 640 × 512). In the cheapest version, the camera is slightly more than twice as expensive as the UAV H520 in the basic version. The thermal camera can record both static images and video. It was created for various purposes, such as the thermal inspection of buildings and searching for people. In 2021, Yuneec introduced new thermal sensors for both the UAV H520 and its modified version, the H520E, i.e., the E10Tv and E20Tv cameras. The E10T camera is still available for sale, and its advantage is a very attractive price in relation to new and competitive solutions. The low-cost argument strongly justifies undertaking and continuing research on the use of this product. The camera is not officially a radiometrically calibrated device [49], i.e., it is not declared suitable for quantitative temperature studies based on the recorded images. The official user manual also does not provide much information about how the camera works. When taking pictures, it is possible to view the temperature for the center of the image and the average value for the frame, and this information is saved in the metadata in the latest version of the camera software (in previous versions this was not possible). The camera software enables a certain correction of the influence of weather conditions on the temperature indicated by the sensor. The recording of a single exposure includes an image from the RGB camera in the JPG format, a raw thermal image in the TIFF format and a thermal image in a selected contrasting color palette (in the JPG format) with the resolution doubled. The TIFF image, however, is 16-bit encoded, which would indicate that it is related to the actual temperature values (encoded relatively within a specific range of image values, so-called digital numbers, DN). Through contact with Yuneec technical support (for Europe), it was found that the file plays an indirect role in creating the files in the JPG format. Nevertheless, the practical use of the E10T camera and the study of its radiometric characteristics [56] convinced the authors of the possibility of using this camera in quantitative environmental studies. Such use, however, requires full knowledge about the capabilities and features of the camera (i.e., how external conditions affect its operation).
A frequent practice in the study of the natural environment is to record thermal images in an orderly manner, i.e., photographing in the form of aerial blocks composed of parallel lines with a specific sidelap and longitudinal coverage, completed with the creation of a thermal orthophotomap, similarly to photos in the visible range. Thermal orthophotos are used as additional sources of information in various spatial information systems, for example for city management or irrigation systems. However, the photogrammetric processing of thermal images is difficult due to the lower resolution and legibility of thermal images (for the E10T sensor, the RGB camera matrix is 1920 × 1080 pixels, and the thermal sensor matrix is 320 × 256). Photogrammetric processing of thermal images may be performed simultaneously in connection with RGB images [57].
The aim of this work was to perform the radiometric calibration of the E10T camera and to analyze the statistics of thermal image sets in order to assess the radiometric stability of the E10T camera operation. The heated bed of a 3D printer, set to specific temperatures, was used for the radiometric calibration. The analysis of the statistical parameters of thermal images included sets recorded over various types of land cover, taken under various weather conditions. Furthermore, thermal images were taken in vertical profiles, analyzing the variability of the reference surface image on thermal images from different heights.

2. Materials and Methods

2.1. Drone and Thermal Sensor

The research used the E10T thermal camera dedicated to the professional UAV Yuneec H520. Detailed technical data on the E10T thermal camera, with its increased-sensitivity RGB camera, used in the tests are presented in Table 1. During the thermal imaging of natural, agricultural and urban areas, the high-gain (high-sensitivity) setting was used, because the real variability of temperatures falls within the sensitivity range of this mode, i.e., −25 to 100 °C. The Beurer FT 65 thermometer, which measures temperatures from 0 to 100 °C, was used to measure the temperature of the reference surfaces [58]. The user manual of the thermal camera [59] recommends not using the camera outside the specified range of air temperatures, not exposing the sensor to strong light sources and not taking pictures in rain or high humidity (the amount of air humidity is not specified, but this presumably refers to fog and water droplets suspended in the air).

2.2. Radiometric Calibration

Radiometric calibration of the E10T thermal camera was performed using the heated bed of the Omni3d Factory 1.0 printer (3D printer). The software controlling the printer made it possible to manually set the temperature to which the heated bed was to be heated. The outside of the heated bed, observed and recorded by the thermal camera, is made of frosted glass. Before calibration, the 3D printer was leveled with a standard spirit level. In practice, heating the print bed is necessary for the stability and precise reproduction of the 3D printed object. The UAV H520 was attached to an aluminum frame placed above the printing platform, as shown in Figure 2. The powered E10T camera was placed 20 cm above the printing bed. It was assumed that such a distance has a negligible effect on the temperature measurement by the E10T thermal sensor. The camera was turned on for approximately 0.5 h before heating started. The verticality of the UAV H520 camera axis was set and was automatically maintained by the camera gimbal. The temperature of the printer bed was varied from 18 to approximately 100 °C in heating increments of approximately 2 °C. Because the heating process coexists with external cooling of the print platform, the surface temperature was measured with a thermometer placed approximately 1 cm above the heated surface immediately before taking the thermal image, in a place corresponding to the center of the thermal image, approximately 2 min after setting a given temperature. Pictures with the thermal camera were taken manually using the H520 ST16s drone controller. Calibration only concerned the high-gain (high-sensitivity) mode of the thermal sensor. The experiment was carried out in a large room with a constant temperature of around 18 °C. The calibration was repeated three times, using only the process of heating the printer bed. In each series, heating followed the schedule established in the first series. After each series of measurements, the heated bed was cooled down to room temperature. On each thermal image associated with a measurement of the surface temperature of the printer bed, the DN values were averaged within a circle of 15-pixel radius at the image center, i.e., the area corresponding to the temperature field measured by the Beurer FT 65 thermometer above the print bed. The averaging was performed using the vector data model to define the extent of the averaging field (a circular polygon). Averaged DN values were calculated in the TNTmips version 2022 software from Landscan (US, San Luis Obispo, CA; local license for Adam Mickiewicz University). Then, the values of the temperature measured with the non-contact thermometer and the average DN values in the center of the thermal image were compared to establish the relationship and determine its functional character. The dependencies were determined using the Excel software (Microsoft, US, Redmond; Office 365 license for Adam Mickiewicz University).
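For readers who want to reproduce the central-area averaging outside TNTmips, a minimal sketch is given below. It assumes the raw thermal frame is the 16-bit TIFF saved by the E10T and simply averages the DN values inside a 15-pixel-radius circle at the image center; the file name is a placeholder.

```python
import numpy as np
import tifffile  # any reader that returns the 16-bit TIFF as an array will do

def mean_dn_in_center(path: str, radius_px: int = 15) -> float:
    """Average DN inside a circle of the given radius at the image center."""
    dn = tifffile.imread(path).astype(np.float64)  # raw 16-bit thermal frame
    rows, cols = dn.shape
    cy, cx = (rows - 1) / 2.0, (cols - 1) / 2.0
    yy, xx = np.mgrid[0:rows, 0:cols]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius_px ** 2
    return float(dn[mask].mean())

# Hypothetical usage for one calibration frame:
# print(mean_dn_in_center("calibration_series1_frame001.tiff"))
```

Pairing the returned mean DN of each frame with the thermometer reading taken at the same moment yields the calibration points analyzed in Section 3.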

2.3. Analysis of Statistical Parameters of Thermal Image Sets

For blocks of thermal images, statistical parameters such as the average, minimum and maximum were calculated on the DN values. Blocks of photos taken over different areas, at different times of the day and in different seasons, using one or more batteries to complete the entire survey, were selected for analysis. These areas include (1) the area of a small stream valley covered with scattered vegetation, marked as location L1, (2) a single-family housing area, marked as location L2, and (3) the area of a field covered with rape cultivation, marked as location L3. The photos were taken according to two scenarios: (1) flight along a route designed for high lateral and longitudinal coverage, usually amounting to 80 and 85%, respectively, with automatic image recording resulting from these settings; (2) flight along the designated route with manual shooting of thermal images (assuming a very high longitudinal coverage and checking whether it was possible to take pictures). The photogrammetric processing of the thermal images in the TIFF format into orthophotomaps was carried out in Agisoft Metashape Professional, version 1.7.6 (Russia, St. Petersburg; local license for Adam Mickiewicz University). Standard processing steps include aligning the photos (aerotriangulation, sensor autocalibration), generating a dense point cloud, calculating the surface model, orthorectifying the photos and assembling them into a continuous orthophotomap [60]. No tonal equalization methods were used when editing the photos and no vignetting effects were removed. For the visualization of the obtained orthophotos, contrast stretching was applied using the curve shape normalization method and global linear stretching, based on the DN temperature range of all orthophotos.
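A simple way to reproduce the per-image statistics described above is sketched below; it assumes the raw 16-bit TIFF frames of one block are stored in a single folder and that the file names sort in the order of registration (both of these are assumptions, not part of the original processing chain).

```python
import glob
import tifffile

def block_statistics(folder: str):
    """Return (file, mean DN, min DN, max DN) for each thermal frame in a block."""
    stats = []
    for path in sorted(glob.glob(f"{folder}/*.tiff")):  # sorted ~ registration order
        dn = tifffile.imread(path)
        stats.append((path, float(dn.mean()), int(dn.min()), int(dn.max())))
    return stats

# Hypothetical usage:
# for name, mean_dn, min_dn, max_dn in block_statistics("blocks/L3_flight1"):
#     print(name, mean_dn, min_dn, max_dn)
```

Plotting the mean DN against the frame index reproduces the kind of curves shown later in Figure 7, where a steadily rising mean at the start of a flight reveals the warm-up instability discussed in Section 3.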

2.4. Taking Photos in Vertical Profiles

The radiometric stability of the camera operation was analyzed by flying the drone over the same reference surface up and down, recording thermal images every ten meters up to a height of 120 m while hovering (which in practice, taking into account the positioning accuracy of the UAV H520 given in the technical documentation, means an absolute horizontal positioning accuracy of up to approximately 2 m; the height above the surface was determined using data from the IMU). The overall flight time, up and down twice, was approximately 10 min. A fragment of a car park with a uniform paving-stone cover was selected as the reference area. On each thermal image, the extent of the reference surface was manually determined and saved as a polygon in the vector format. Then, for the extent of the reference surface in all photos, the average value of the temperature in the form of DN was determined. The calculations were performed in the TNTmips software. The average temperatures were then compared with the height of image registration in graphs, prepared in Microsoft Excel, to show the relationship between altitude and temperature.
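The polygon-based averaging used for the reference surface can be approximated with the short sketch below. It rasterizes a manually digitized polygon (given in pixel coordinates) onto the thermal frame using the Pillow library, averages the DN values inside it and converts the result with a linear calibration of the form T = a × DN + b; the coefficient values and file names are placeholders, not the calibrated ones.

```python
import numpy as np
import tifffile
from PIL import Image, ImageDraw

# Placeholder calibration coefficients (see Section 3 for the fitted slope).
A = 0.0078   # slope obtained in the calibration described in this paper
B = -169.0   # illustrative intercept only, not the calibrated value

def mean_reference_temperature(tiff_path, polygon_xy):
    """Mean temperature (deg C) inside a polygon given as [(x, y), ...] in pixels."""
    dn = tifffile.imread(tiff_path).astype(np.float64)
    mask_img = Image.new("L", (dn.shape[1], dn.shape[0]), 0)
    ImageDraw.Draw(mask_img).polygon(polygon_xy, outline=1, fill=1)
    mask = np.asarray(mask_img, dtype=bool)
    return A * dn[mask].mean() + B

# Hypothetical usage, one polygon per frame of the vertical profile:
# temps = [mean_reference_temperature(path, poly) for path, poly in frames_and_polygons]
```

The resulting list of mean temperatures, ordered by flight altitude, is what is plotted in Figure 8.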

3. Results and Discussion

During the radiometric calibration of the E10T camera with the use of the heated bed of the 3D printer, three series of measurements were carried out. Figure 3, part A, shows 45 thermal images from the first series, taken while heating the bed from 18 to 100 °C. For the entire set of photos, a uniform contrast was adopted using the linear method over the full range of DN variation of the whole set, in order to show the temperature changes throughout the heating cycle. In the central part of each photo, the area of the actual temperature measurement with the thermometer, which is also the calculation area of the average DN value, is marked with a black circle. The thermal images in part A of Figure 3 show how the print bed is heated, in a generally oval pattern, with a temperature drop towards the edges of the bed. In Figure 3, part B, the same set of photos is shown with the contrast method changed to one that normalizes the shape of the histogram (Gaussian curve), separately for each photo, which makes the temperature distribution of the print bed visible within each photo. These photos also show the variability of the heating of the print bed as a function of the distance from the heater. The temperature measurement in the center of the image was suitable for calibration because, in the averaging area (the black circle in the illustration), the temperature was quite stable and unaffected by the uneven heating. Figure 3, part C, shows the statistics of consecutive thermal images, the mean brightness (mean) and standard deviation (SD) of all measurement series, calculated using histograms. By repeating the pictures according to the same heating scheme, it was possible to maintain a relatively linear, uniform temperature increase during the heating process.
Figure 4 shows the results of the radiometric calibration of the E10T camera in the form of the relationship between the temperature measured over the heated bed of the 3D printer (thermometer) and the DN values recorded in the images from the thermal camera (in the TIFF format). The presented results confirm the earlier, simplified radiometric calibration [57]: the relationship between the temperature and the DN values of the thermal image is linear, which is also similar to the results presented by Kelly et al. [61]. Adopting a simple linear equation for the analyzed variables, namely MT = a × DN + b, the functional forms were determined using the Excel trend-line tool. For all series of measurements, the obtained slope of the linear equation was the same, a = 0.0078, with slightly different values of the intersection with the ordinate axis (the coefficients b in the linear equation). The differences between the measured values and those predicted using the obtained relationships, presented in the graphs in the lower part of Figure 4, do not exceed two degrees. According to the authors, such differences are acceptable for thermal images of the natural environment.
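The same least-squares fit of MT = a × DN + b can be reproduced outside Excel with a few lines of NumPy. The arrays below are synthetic, illustrative values for one calibration series (averaged DN at the image center and the corresponding bed temperatures), not the measured data.

```python
import numpy as np

# Illustrative (synthetic) calibration points: averaged DN at the image center
# and the temperature measured just above the heated bed (deg C).
dn = np.array([24000.0, 26500.0, 29000.0, 31500.0, 34000.0])
temp_c = np.array([18.0, 38.0, 57.0, 77.0, 97.0])

a, b = np.polyfit(dn, temp_c, deg=1)   # fit MT = a * DN + b
residuals = temp_c - (a * dn + b)      # Figure 4 shows residuals within ~2 deg C
print(f"a = {a:.4f}, b = {b:.1f}, max |residual| = {np.abs(residuals).max():.2f}")
```

With the real measurement series, the fitted slope should reproduce the value a = 0.0078 reported above, and the residuals provide the same accuracy check as the lower panels of Figure 4.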
The radiometric calibration presented in this paper is more accurate than the previous one made by the authors [56]; it is simple and was carried out with equipment available to the authors, although it is not as accurate as the calibration presented in the work of Leblanc et al. [62]. The differences in the calibration measurements concern the distance of the thermal camera from the calibration surface and the use of two thermal points for calibration. The different distance of the thermal sensor from the calibration surface causes a difference in the coefficient of the linear equation describing the relationship between the temperature of the calibration surface and the DN value in the thermal images: 0.0078 in the current calibration and 0.0099 in the previous calibration [56]. In the previous calibration, the distance of the sensor from the reference surface was 2 m; in the current one, it was 20 cm. The conclusion may be trivial, but the 2 m thickness of the air layer, even in a closed room, affects the values recorded in the images.
Figure 5 presents six thermal orthophotomaps created from images taken with the E10T camera for two locations (L1 and L2), recorded in the autumn–winter period. The photos were taken at different times of the day, which results in visually very different readability of the orthophotos. The orthophotomap presented in part A was prepared using photos taken in the morning over a built-up area (L1) under a cloudless sky. Its legibility can be described as high and is related to the differences in temperature between buildings; the warmest and brightest objects are the local heat sources characteristic of single-family housing. The orthophotomap in part B was prepared using photos taken after sunset, in the period when objects begin to release the heat absorbed during the day, and its legibility, the contrast between the objects, is likewise related to the differences in the temperature of the objects. Parts C and F present orthophotomaps made of photos recorded in the middle of the day under full cloud cover. In the C orthophotomap, the cold watercourse stands out the most (a defect formed during the assembly of the thermal images is also visible). In the orthophotomap F, the stream is thermally similar to the trees. Orthophotomaps D and E were made using photos recorded under direct sunlight, and therefore they show thermal differences between the surfaces illuminated by the sun’s rays and those located in the shade. The readability of orthophotomap E is very similar to that of orthophotomaps composed of photos taken in the visible range. Orthophotomap D, composed of photos taken closer to noon (11:55 AM), shows more clearly the differences in the heating of various surfaces in relation to the topography (southern slopes are warmer and brighter). In addition, trees appear in it as bright spots.
Some of the orthophotomaps presented in Figure 5 show more general tonal differences, not related to the variation in the cover of the photographed area. Namely, in the C, D, E, and F orthophotos, there are clear tonal differences between large fragments of the orthophotos. In the C, D, and E orthophotos, the northern part is clearly brighter (warmer) than the southern part. In the orthophotomap in part F, the northern part is dark. Orthophotomap B is tonally uniform. In orthophotos B and E, there are also visible defects related to the assembly of individual photos (this applies to the southern part of these orthophotos). After analyzing the order of the images, it turned out that the dark fragments of the orthophotomaps were created from photos recorded in the initial phase of the flights. This may indicate radiometric instability of the camera operation.
This is more precisely visible in Figure 6, which presents thermal images in the order of registration, with a uniform contrast setting for the entire set of photos. The illustration shows two flights performed directly in succession over the L3 site, which includes a rapeseed field with other crops. The photos are presented in this form because, despite many attempts, it was not possible to align them (aerotriangulation) and, consequently, no orthophotomaps were created. The reasons the images could not be assembled are the excessive similarity of the thermal image content, too little overlap between the images and too low a flight height above the ground (approximately 70 m).
To analyze the identified problem of the radiometric instability of the thermal camera operation more precisely, statistical parameters of the thermal images were calculated (mean, maximum and minimum). Figure 7 presents graphs of the statistical parameters of the photos, in the order of registration, for the sets of photos presented in Figure 5 (charts A, B, C, D, E, and F) and Figure 6 (charts G and H). The instability of the thermal camera’s operation concerns the initial stage of the camera’s operation and is best visible in the graphs of the average brightness of consecutive photos, where the average constantly increases in a way that is not related to the temperature of the objects (this is best visible in graphs D, E, and F). Graphs G and H present sets of photos taken for location L3 directly one after another; in graph H, recorded after the device had already been operating for a longer period and after a short shutdown to change the battery, the initial instability is much less noticeable. A similar instability of camera operation is reported by Olsson et al. [63] for the multispectral Sequoia camera, and an initial warm-up is suggested.
Figure 8 shows the reference surface temperature graphs obtained from the photos taken in vertical profiles up to a height of 120 m above this surface, in duplicate (profile 1: up/down/up; profile 2: up/down/up/down). Figure 9 shows the locations of the reference plot in a series of photos taken from 10 to 120 m above the surface. The graphs show an increase in the reference surface temperature during the first climb. During the subsequent descent and climb steps, the measured reference surface temperature was relatively stable, within 1 degree. This slight variation in temperature can be attributed to the problem of identifying the extent of the reference surface in images taken from different heights and thus with different spatial resolutions.
With only a single camera available, it is difficult to say whether this behavior is a defect of this particular unit or a property of the product, but the observation may be useful to other users of this type of camera, or to users of thermal cameras in general. The stated instability of the camera operation requires further tests (analyses performed during subsequent flights). A solution may also be to plan the extent of the photo block so that this instability occurs outside the research area. Another idea may be to correct the photos, e.g., with a Wallis filter [64].
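For orientation, a simplified Wallis-type normalization is sketched below: each pixel’s local mean and standard deviation are pulled toward global target values, which counteracts the kind of block-wide brightness drift seen in Figures 5 and 7. The window size, damping parameters and their defaults are illustrative choices, not taken from [64] or from the authors’ processing.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wallis_filter(img, window=31, target_mean=None, target_std=None,
                  brightness=0.6, contrast=0.9):
    """Simplified Wallis filter: adjust local mean/std toward target values."""
    img = img.astype(np.float64)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 1e-6))
    t_mean = img.mean() if target_mean is None else target_mean
    t_std = img.std() if target_std is None else target_std
    gain = contrast * t_std / (contrast * local_std + (1.0 - contrast) * t_std)
    offset = brightness * t_mean + (1.0 - brightness) * local_mean
    return gain * (img - local_mean) + offset
```

Whether such a local normalization is acceptable depends on the goal: it improves the legibility and mosaicking of the images, but it alters the DN values and therefore cannot precede the quantitative temperature conversion described above.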

4. Conclusions

The Yuneec E10T thermal camera, dedicated to the Yuneec H520 drone, can be used in practice for quantitative research on the natural environment. Although it is sold as a product without radiometric calibration and is equipped with low-resolution matrices, its temperature measurement retains linearity. The E10T camera could be the first sensor with which new users gain valuable experience in thermal measurement. After a simple calibration, it is possible to obtain data on the variation in the temperature of natural surfaces. Performing classical photogrammetry with this camera requires appropriate planning of the flight height and the mutual coverage of photos, in relation to the recorded content, in order to ensure that the necessary number of tie points can be generated during aerotriangulation. During the tests of the radiometric properties of the camera, operational instability was found in the initial period after switching the camera on. This instability can be avoided by appropriate flight planning or corrected digitally. The stated sensor instability may be an individual case, but a more general guideline for users of remote sensing sensors is to conduct their own tests, including analyses of image statistics or shooting parameters, especially during the warranty period. Concern for the radiometric and geometric quality of thermal data is very important from the point of view of feeding geographic information systems.

Author Contributions

Conceptualization, A.M., S.K., M.K. and G.J.; methodology, A.M. and S.K.; software, G.J., S.K. and A.M.; validation, M.K., A.M. and S.K.; formal analysis, A.M. and S.K.; investigation, A.M., S.K., M.K. and G.J.; resources, A.M., S.K. and G.J.; data curation, A.M., M.K. and S.K.; writing, S.K. and A.M.; writing—review and editing, A.M., S.K., M.K. and G.J.; visualization, S.K. and A.M.; supervision, S.K. and A.M.; project administration, A.M.; funding acquisition, A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by “UNIWERSYTET JUTRA II—zintegrowany program rozwoju Uniwersytetu im. Adama Mickiewicza w Poznaniu”, no. POWR.03.05.00-00-Z303/17, co-financed by the European Social Fund under the Knowledge Education Development Operational Program (POWER) of Priority Axis III Higher Education for Economy and Development, measures 3.5 Comprehensive programs of universities.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Prakash, A. Thermal remote sensing: Concepts, issues and applications. Int. Arch. Photogramm. Remote Sens. 2000, 33, 239–243. [Google Scholar]
  2. Dlesk, A.; Vach, K.; Holubec, P. Usage of photogrammetric processing of thermal images for civil engineers. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 99–103. [Google Scholar]
  3. Alchanatis, V.; Cohen, Y.; Cohen, S.; Moller, M.; Sprinstin, M.; Meron, M. Evaluation of different approaches for estimating and mapping crop water status in cotton with thermal imaging. Precis. Agric. 2010, 11, 27–41. [Google Scholar] [CrossRef]
  4. Guilioni, L.; Jones, H.G.; Leinonen, I.; Lhomme, J.P. On the relationships between stomatal resistance and leaf temperatures in thermography. Agric. For. Meteorol. 2008, 148, 1908–1912. [Google Scholar] [CrossRef]
  5. Leinonen, I.; Grant, O.M.; Tagliavia, C.P.P.; Chaves, M.M.; Jones, H.G. Estimating stomatal conductance with thermal imagery. Plant Cell Environ. 2006, 29, 1508–1518. [Google Scholar] [CrossRef]
  6. Sanna, A.; Lamberti, F. Editorial Advances in Target Detection and Tracking in Forward-Looking InfraRed (FLIR) Imagery. Sensors 2014, 14, 20297–20303. [Google Scholar] [CrossRef] [Green Version]
  7. Boone, N.; Zhu, C.; Smith, C.; Todd, I.; Willmott, J.R. Thermal near infrared monitoring system for electron beam melting with emissivity tracking. Addit. Manuf. 2018, 22, 601–605. [Google Scholar]
  8. Maes, W.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef] [PubMed]
  9. Messina, G.; Modica, G. Applications of UAV Thermal Imagery in Precision Agriculture: State of the Art and Future Research Outlook Gaetano. Remote Sens. 2020, 12, 1491. [Google Scholar] [CrossRef]
  10. Harvey, M.; Pearson, S.; Alexander, K.B.; Rowland, J.; White, P. Unmanned aerial vehicles (UAV) for cost effective aerial orthophotos and digital surface models (DSMs). In Proceedings of the New Zealand Geothermal Workshop 2014 Proceedings, Auckland, New Zealand, 24–26 November 2014. [Google Scholar]
  11. Harvey, M.; Rowland, J.V.; Luketina, K.M. Drone with thermal infrared camera provides high resolution georeferenced imagery of the Waikite geothermal area, New Zealand. J. Volcanol. Geoth. Res. 2016, 325, 61–69. [Google Scholar] [CrossRef]
  12. Blaya-Ros, P.J.; Blanco, V.; Domingo, R.; Soto-Valles, F.; Torres-Sánchez, R. Feasibility of low-cost thermal imaging for monitoring water stress in young and mature sweet cherry trees. Appl. Sci. 2020, 10, 5461. [Google Scholar] [CrossRef]
  13. Valente, J.; Roldán, J.; Garzón, M.; Barrientos, A. Towards Airborne Thermography via Low-Cost Thermopile Infrared Sensors. Drones 2019, 3, 30. [Google Scholar] [CrossRef] [Green Version]
  14. Osroosh, Y.; Khot, L.R.; Peters, R.T. Economical thermal-RGB imaging system for monitoring agricultural crops. Comput. Electron. Agric. 2018, 147, 34–43. [Google Scholar] [CrossRef]
  15. Martinez-Jimenez, M.; Loza-Gonzalez, V.; Kolosovas-Machuca, S.; Yanes-Lane, M.; Ramirez-GarciaLuna, A.; Ramirez-GarciaLuna, J. Diagnostic accuracy of infrared thermal imaging for detecting COVID-19 infection in minimally symptomatic patients. Eur. J. Clin. Investig. 2021, 51, e13474. [Google Scholar] [CrossRef] [PubMed]
  16. Brooke, C. Thermal Imaging for the Archeological Investigation of Historic Buildings. Remote Sens. 2018, 10, 1401. [Google Scholar] [CrossRef] [Green Version]
  17. Agudo, P.U.; Pajas, J.A.; Pérez-Cabello, F.; Redón, J.V.; Lebrón, B.E. The potential of drones and sensors to enhance detection of archaeological cropmarks: A comparative study between multi-spectral and thermal imagery. Drones 2018, 2, 29. [Google Scholar] [CrossRef] [Green Version]
  18. Šedina, J.; Housarová, E.; Raeva, P. Using RPAS for the detection of archaeological objects using multispectral and thermal imaging. Eur. J. Remote Sens. 2019, 52 (Suppl. 1), 182–191. [Google Scholar] [CrossRef] [Green Version]
  19. Monterroso-Checa, A.; Redondo-Villa, A.; Gasparini, M.; Hornero, A.; Iraci, B.; Martín-Talaverano, R.; Moreno-Escribano, J.C.; Muñoz-Cádiz, J.; Murillo-Fragero, J.I.; Obregón-Romero, R.; et al. A heritage science workflow to preserve and narrate a rural archeological landscape using virtual reality: The cerro del castillo of belmez and its surrounding environment (Cordoba, Spain). Appl. Sci. 2020, 10, 8659. [Google Scholar] [CrossRef]
  20. Nandhithaa, N.M.; Roslinb, S.E.; Chakravarthib, R.; Sangeethab, M.S. Feasibility of Infrared Thermography for Health Monitoring of Archeological Structures. Smart Intell. Comput. Commun. Technol. 2021, 38, 111. [Google Scholar]
  21. Feltynowski, M.; Zawistowski, M. Opportunities Related to the Use of Unmanned Systems in Emergency Services; Scientific and Research Centre for Fire Protection–National Research Institute, Safety & Fire Technique: Poland, Józefów, 2018; Volume 51, pp. 126–133. [Google Scholar]
  22. Ambrosia, V.; Wegener, S.; Sullivan, D.; Buechel, S.; Dunagan, S.; Brass, J.; Stoneburner, J.; Schoenung, S. Demonstrating UAV-Acquired Real-Time Thermal Data over Fires. Photogramm. Eng. Remote Sens. 2003, 69, 391–402. [Google Scholar] [CrossRef]
  23. Andraši, P.; Radišić, T.; Muštra, M.; Ivošević, J. Night-time detection of uavs using thermal infrared camera. Transp. Res. Procedia 2017, 28, 183–190. [Google Scholar] [CrossRef]
  24. Nithyavathy, N.; Kumar, S.A.; Rahul, D.; Kumar, B.S.; Shanthini, E.R.; Naveen, C. Detection of fire prone environment using Thermal Sensing Drone. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1055, 012006. Available online: https://iopscience.iop.org/article/10.1088/1757-899X/1055/1/012006/pdf (accessed on 22 January 2022). [CrossRef]
  25. Allison, R.S.; Johnston, J.M.; Craig, G.; Jennings, S. Airborne optical and thermal remote sensing for wildfire detection and monitoring. Sensors 2016, 16, 1310. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Riggan, P.J.; Hoffman, J.W. FireMapper™: A thermal-imaging radiometer for wildfire research and operations. In Proceedings of the IEEE Aerospace Conference, Big Sky, MT, USA, 8–15 March 2003. [Google Scholar]
  27. Dugdale, S.J.; Kelleher, C.A.; Malcolm, I.A.; Caldwell, S.; Hannah, D.M. Assessing the potential of drone-based thermal infrared imagery for quantifying river temperature heterogeneity. Hydrol. Processes 2019, 33, 1152–1163. [Google Scholar] [CrossRef]
  28. McDonald, W. Drones in urban stormwater management: A review and future perspectives. Urban Water J. 2019, 16, 505–518. [Google Scholar] [CrossRef]
  29. Vélez-Nicolás, M.; García-López, S.; Barbero, L.; Ruiz-Ortiz, V.; Sánchez-Bellón, A. Applications of Unmanned Aerial Systems (UASs) in Hydrology: A Review. Remote Sens. 2021, 13, 1359. [Google Scholar] [CrossRef]
  30. Chung, M.; Detweiler, C.; Hamilton, M.; Higgins, J.; Ore, J.P.; Thompson, S. Obtaining the thermal structure of lakes from the air. Water 2015, 7, 6467–6482. [Google Scholar] [CrossRef] [Green Version]
  31. Webster, C.; Westoby, M.; Rutter, N.; Jonas, T. Three-dimensional thermal characterization of forest canopies using UAV photogrammetry. Remote Sens. Environ. 2018, 209, 835–847. [Google Scholar] [CrossRef] [Green Version]
  32. Baratchi, M.; Meratnia, N.; Havinga, P.J.; Skidmore, A.; Toxopeus, B.A. Sensing solutions for collecting spatio-temporal data for wildlife monitoring applications. Sensors 2013, 13, 6054–6088. [Google Scholar] [CrossRef] [Green Version]
  33. Sykes, D.J.; Couvillion, J.S.; Cromiak, A.; Bowers, S.; Schenck, E.; Crenshaw, M.; Ryan, P.L. The use of digital infrared thermal imaging to detect estrus in gilts. Theriogenology 2012, 20278, 147–152. [Google Scholar] [CrossRef]
  34. Shariq, M.H.; Hughes, B.R. Revolutionising building inspection techniques to meet large-scale energy demands: A review of the state-of-the-art. Renew. Sustain. Energy Rev. 2020, 130, 109979. [Google Scholar] [CrossRef]
  35. Haichao, Z.; Xue, Z.; Junru, Y.; Lihua, Z.; Xintian, W. A Thermal Performance Detection Method for Building Envelope Based on 3D Model Generated by UAV Thermal Imagery. Energies 2020, 13, 6677. [Google Scholar]
  36. Rakha, T.; Gorodetsky, A. Review of Unmanned Aerial System (UAS) applications in the built environment: Towards automated building inspection procedures using drones. Autom. Constr. 2018, 93, 252–264. [Google Scholar] [CrossRef]
  37. Yalong, M.; Xinkai, W.; Guizhen, Y.; Yongzheng, X.; Yunpeng, W. Pedestrian Detection and Tracking from Low-Resolution Unmanned Aerial Vehicle Thermal Imagery. Sensors 2016, 16, 00446. [Google Scholar]
  38. Ishimwe, R.; Abutaleb, K.; Ahmed, F. Applications of Thermal Imaging in Agriculture—A Review. Adv. Remote Sens. 2014, 3, 128–140. [Google Scholar] [CrossRef] [Green Version]
  39. Vadivambal, R.; Jayas, D.S. Applications of thermal imaging in agriculture and food industry—A review. Food Bioprocess Technol. 2011, 4, 186–199. [Google Scholar] [CrossRef]
  40. Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A Technical Study on UAV Characteristics for Precision Agriculture Applications and Associated Practical Challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  41. Krishna, G.; Sahoo, R.N.; Singh, P.; Patra, H.; Bajpai, V.; Das, B.; Kumar, S.; Dhandapani, R.; Vishwakarma, C.; Pal, M.; et al. Application of thermal imaging and hyperspectral remote sensing for crop water deficit stress monitoring. Geocarto Int. 2021, 36, 481–498. [Google Scholar] [CrossRef]
  42. Han, Y.; Tarakey, B.A.; Hong, S.J.; Kim, S.Y.; Kim, E.; Lee, C.H.; Kim, G. Calibration and Image Processing of Aerial Thermal Image for UAV Application in Crop Water Stress Estimation. J. Sens. 2021, 2021, 5537795. [Google Scholar] [CrossRef]
  43. Zhu, W.; Chen, H.; Ciechanowska, I.; Spaner, D. Application of infrared thermal imaging for the rapid diagnosis of crop disease. IFAC-Pap. 2018, 51, 424–430. [Google Scholar] [CrossRef]
  44. Banerjee, K.; Krishnan, P.; Mridha, N. Application of thermal imaging of wheat crop canopy to estimate leaf area index under different moisture stress conditions. Biosyst. Eng. 2018, 166, 13–27. [Google Scholar] [CrossRef]
  45. Quebrajo, L.; Perez-Ruiz, M.; Pérez-Urrestarazu, L.; Martínez, G.; Egea, G. Linking thermal imaging and soil remote sensing to enhance irrigation management of sugar beet. Biosyst. Eng. 2018, 165, 77–87. [Google Scholar] [CrossRef]
  46. Roopaei, M.; Rad, P.; Choo, K.K.R. Cloud of things in smart agriculture: Intelligent irrigation monitoring by thermal imaging. IEEE Cloud Comput. 2017, 4, 10–15. [Google Scholar] [CrossRef]
  47. Egea, G.; Padilla-Díaz, C.M.; Martinez-Guanter, J.; Fernández, J.E.; Pérez-Ruiz, M. Assessing a crop water stress index derived from aerial thermal imaging and infrared thermometry in super-high density olive orchards. Agric. Water Manag. 2017, 187, 210–221. [Google Scholar] [CrossRef] [Green Version]
  48. Tucci, G.; Parisi, E.; Castelli, G.; Errico, A.; Corongiu, M.; Sona, G.; Viviani, E.; Bresci, E.; Preti, F. Multi-Sensor UAV Application for Thermal Analysis on a Dry-Stone Terraced Vineyard in Rural Tuscany Landscape. ISPRS Int. J. Geo-Inf. 2019, 8, 87. [Google Scholar]
  49. Ortiz-Sanz, J.; Gil-Docampo, M.; Arza-García, M.; Cañas-Guerrero, I. IR thermography from UAVs to monitor thermal anomalies in the envelopes of traditional wine cellars: Field test. Remote Sens. 2019, 11, 1424. [Google Scholar] [CrossRef]
  50. Costa, J.M.; Grant, O.M.; Chaves, M.M. Use of thermal imaging in viticulture: Current application and future prospects. In Methodologies and Results in Grapevine Research; Springer: Dordrecht, The Netherlands, 2010; pp. 135–150. [Google Scholar]
  51. Zhou, Z.; Majeed, Y.; Naranjo, G.D.; Gambacorta, E. Assessment for crop water stress with infrared thermal imagery in precision agriculture: A review and future prospects for deep learning applications. Comput. Electron. Agric. 2021, 182, 106019. [Google Scholar] [CrossRef]
  52. Ivushkin, K.; Bartholomeus, H.; Bregt, A.K.; Pulatov, A.; Franceschini, M.H.; Kramer, H.; van Loo, E.N.; Roman, V.J.; Finkers, R. UAV based soil salinity assessment of cropland. Geoderma 2019, 338, 502–512. [Google Scholar] [CrossRef]
  53. Estrada-Pérez, L.V.; Pradana-López, S.; Pérez-Calabuig, A.M.; Mena, M.L.; Cancilla, J.C.; Torrecilla, J.S. Thermal imaging of rice grains and flours to design convolutional systems to ensure quality and safety. Food Control 2021, 121, 107572. [Google Scholar] [CrossRef]
  54. Nguyen, T.X.B.; Rosser, K.; Chahl, J. A Review of Modern Thermal Imaging Sensor Technology and Applications for Autonomous Aerial Navigation. J. Imaging 2021, 7, 217. [Google Scholar] [CrossRef]
  55. Berni, J.; Zarco-Tejada, P.; Suárez, L.; González-Dugo, V.; Fereres, E. Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. Int. Arch. Photogramm. Remote Sens. Spatial Inform. Sci. 2009, 38, 6. [Google Scholar]
  56. Młynarczyk, A.; Królewicz, S. Radiometric calibration of the E10T thermal camera. In The Natural Environment as an Area of Research; Młynarczyk, A., Ed.; Bogucki Wydawnictwo Naukowe: Poznań, Poland, 2021; pp. 49–58. ISBN 978-83-7986-357-0. [Google Scholar]
  57. López, A.; Jurado, J.M.; Ogayar, C.J.; Feito, F.R. An optimized approach for generating dense thermal point clouds from UAV-imagery. ISPRS J. Photogramm. Remote Sens. 2021, 182, 78–95. [Google Scholar] [CrossRef]
  58. User Manual Beurer FT65. Available online: https://www.beurer.com/web/pl/products/medical/fever-thermometers/ft-65.php (accessed on 17 January 2022).
  59. Quick Start Guide Yuneec E10T. Available online: https://temporalwebdownload.s3.eu-central-1.amazonaws.com/CAMERAS/E10T/E10T%2BQuick%2BStart%2BGuide%2B(EN%2C%2BDE%2C%2BFR%2C%2BIT%2C%2BES%2C%2BFIN%2C%2BCN).zip (accessed on 18 January 2022).
  60. Agisoft. Thermal Imagery Processing. Modified on: Wed, 24 March 2021. 2021. Available online: https://agisoft.freshdesk.com/support/solutions/articles/31000158942-thermal-imagery-processing (accessed on 28 May 2022).
  61. Kelly, J.; Kljun, N.; Olsson, P.O.; Mihai, L.; Liljeblad, B.; Weslien, P.; Eklundh, L. Challenges and best practices for deriving temperature data from an uncalibrated UAV thermal infrared camera. Remote Sens. 2019, 11, 567. [Google Scholar] [CrossRef] [Green Version]
  62. Leblanc, G.; Kalacska, M.; Arroyo-Mora, J.P.; Lucanus, O.; Todd, A. A Practical Validation of Uncooled Thermal Imagers for Small RPAS. Drones 2021, 5, 132. [Google Scholar] [CrossRef]
  63. Olsson, P.O.; Vivekar, A.; Adler, K.; Garcia Millan, V.E.; Koc, A.; Alamrani, M.; Eklundh, L. Radiometric correction of multispectral uas images: Evaluating the accuracy of the parrot sequoia camera and sunshine sensor. Remote Sens. 2021, 13, 577. [Google Scholar] [CrossRef]
  64. Tan, D. Image enhancement based on adaptive median filter and Wallis filter. In Proceedings of the National Conference on Electrical, Electronics and Computer Engineering, Xi’an, China, 12–13 December 2015; pp. 12–13. [Google Scholar]
Figure 1. Examples of various pairs of RGB and thermal images taken with the E10T camera. Below the images some information is given about the geographic and meteorological parameters (obtained from the closest station).
Figure 2. H520 drone with the E10T camera. The drone is mounted on a frame above the heated bed of the Omni3d Factory 1.0 printer for radiometric calibration.
Figure 3. Radiometric calibration with the use of a 3D printer base: (A) thermal images recorded during the first calibration series, while heating the base of the 3D printer printout from 18 to 100 °C, arranging the images along with the temperature increase from the upper left corner to the lower right corner; in the central part, the area for which mean temperature values were calculated is marked with a black dot. DN values for all TIFF images recorded in 16-bit coding varied from 24,000 to 34,500; in this range, linear contrast stretching was used and visualization was with the same color palette. (B) The same thermal images with the changed method of contrast stretching to the method normalizing the shape of the histogram, separately for each image. (C) Graphs of image statistics, mean thermal brightness and standard deviation and their variability over time for three calibration series.
Figure 4. Radiometric calibration of the E10T camera obtained using the heated base of the Omni3d Factory 1.0 printer. The upper part of the illustration shows the relationship between the temperature measured on the surface of the heated base and the DN values read on the thermal images. In the lower part of the figure, the difference between the measured temperature and the one estimated using the established dependence is presented.
Figure 5. Thermal orthophotomaps computed for the sets of images taken with the E10T camera using Agisoft Metashape software for the location L1 (1 term, subfigure (A)) and locations L2 (five terms, subfigures (BF)). Orthophotomaps are presented in two ways: using the contrast normalized for each orthophotomap (upper part of the illustration) and linearly defined contrast for full variation DN values of the orthophotomap set (lower part of the illustration). The pictures were taken from the height set to 100 m AGL. Orthophotos are oriented in real geographic directions.
Figure 6. Two sets of thermal images arranged according to the order of registration for location L3 (rape field). The graphs of the statistical parameters marked with the letters G and H correspond to them in Figure 7. The same contrast table was set up for both sets using the linear method, using the minimum and maximum DN values of the entire set.
Figure 7. Graphs showing the statistical parameters (average, minimum and maximum values) of consecutive photos in individual sets, according to the order of taking. The markings of the subsequent charts (A–H) correspond to the photo set identifiers used in Figure 5 and Figure 6.
Figure 8. Temperature graphs of the reference surface obtained from the photos taken in vertical profiles up to a height of 120 m above this surface, in duplicate (profile 1: up/down/up; profile 2: up/down/up/down). The first profile was made on 4 January and the second on 6 January 2022 (for the first flight, no thermal images were taken during the second descent). Temperatures were determined using the radiometric calibration equations.
Figure 9. An example of a series of photos taken above the reference surface (black polygon) and the method of measuring its temperature, taking into account the shrinkage of its image in subsequent photos recorded from higher altitudes (from 10 to 120 m AGL).
Table 1. Parameters of the E10T thermal camera (Yuneec H520) [59].

Parameter | RGB | Thermal
Resolution (pixels) | 1920 × 1080 | 320 × 256
Field of view (FOV) | 89.6° | 34°
Focal length (mm) | 3.5 | 4.3
Physical dimension of a pixel (μm) | 2.3 | 12 (6 in enhanced JPG)
Wavelength | 0.45–0.77 μm | 8–14 μm
Sensitivity | ISO range: 100–3200; shutter speed: 1/30–1/8000 s | <50 mK @ f/1.0
Sensor type | CMOS 1/2.8″ | Uncooled VOx microbolometer (FLIR)
Scene temperature range | – | High gain: −25 to 100 °C; low gain: −40 to 550 °C
Calibration options | n/a | Atmospheric parameters: scene emissivity, conversion coefficient, atmospheric temperature
Color space and recording data format | RGB 24-bit, JPG | TIFF 16-bit (not radiometric); palette color JPEG (enhanced resolution 640 × 512)
Operating temperature range | −10 to 40 °C | −10 to 40 °C
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
