Title: Urban Remote Sensing
Author: Group of authors
Publisher: John Wiley & Sons Limited
Genre: Geography
ISBN: 9781119625858
An exception to the simplicity of this stage of data collection is when high-accuracy ground control validation is required for a project. For many projects, especially those where locational data are not needed, image georeferencing is not required. However, many UAS utilize Global Navigation Satellite System (GNSS) receivers that provide positional information to the operator in real time. The inclusion of GNSS receivers in UAS also enables the imagery collected by the onboard sensors to be georeferenced. This process of image georeferencing requires the GNSS receiver to measure the coordinates of the UAS at the exact moment an image was collected. Because of the high speed of the UAS and of the image collection process, it is often difficult for the GNSS receiver to synchronize perfectly with the camera. According to Sanz-Ablanedo et al. (2018), georeferencing images with the onboard GNSS receiver alone can only achieve accuracies in the decimeter-to-meter range. These accuracies can be improved, however, by including independent ground control points (GCPs) in the 3D dataset. For instance, Turner et al. (2012) demonstrated that when only the UAS's onboard GNSS was used for georeferencing, the mean absolute total error at their two study locations was 1.247 and 0.665 m, respectively. When independent GCPs were included in the same image datasets, the location error dropped to 0.129 and 0.103 m, respectively. This suggests that including independent GCPs can significantly improve locational accuracies in 2D and 3D datasets (Cryderman et al., 2014; Agüera-Vega et al., 2017; Liu et al., 2018). In recent years, more research has also been conducted on integrating real-time kinematic (RTK) and post-processing kinematic (PPK) technology with GNSS receivers onboard UAS, and some researchers have demonstrated the possibility of obtaining sub-decimeter spatial accuracies without using independent GCPs (Forlani et al., 2018; Gabrlik et al., 2018; Zhang et al., 2019). Whether independent GCPs (or a high-quality onboard GNSS receiver) should be included in a remote sensing project depends on the accuracy required for the project. Not all projects require sub-decimeter accuracies, so it might be more feasible (and cost-effective) for individuals to utilize UAS with only standard-quality GNSS receivers that provide lower positional accuracies.
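To make the accuracy figures above concrete, the following is a minimal Python sketch of how such locational accuracies are commonly quantified: model-derived checkpoint coordinates are compared against independently surveyed GCPs and summarized as mean absolute and root-mean-square errors. The coordinates are hypothetical, and this illustrates only the metric, not the workflow of any of the cited studies.

```python
import numpy as np

# Hypothetical example: surveyed check point coordinates (X, Y, Z in metres)
# versus the coordinates of the same points measured in the UAS-derived model.
surveyed = np.array([
    [482310.12, 4306122.45, 212.31],
    [482355.78, 4306140.02, 213.07],
    [482390.41, 4306098.66, 211.88],
])
modelled = np.array([
    [482310.25, 4306122.31, 212.49],
    [482355.60, 4306140.18, 212.86],
    [482390.55, 4306098.80, 212.04],
])

residuals = modelled - surveyed                          # per-point error vectors
horizontal = np.linalg.norm(residuals[:, :2], axis=1)    # 2D (XY) error per point
total = np.linalg.norm(residuals, axis=1)                # 3D (XYZ) error per point

print(f"Mean absolute horizontal error: {horizontal.mean():.3f} m")
print(f"Mean absolute total error:      {total.mean():.3f} m")
print(f"RMSE (3D):                      {np.sqrt((total ** 2).mean()):.3f} m")
```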
3.3.3 DATA PROCESSING (POSTFLIGHT)
Once all image data have been collected and organized from a UAS, they can be processed into other geospatial products depending on the sensor type and flight plans used. For an RGB sensor, individuals can utilize the images collected to generate a multitude of photogrammetric outputs, such as orthophotos (Strecha et al., 2012; Hardy et al., 2017), DEMs (Kršák et al., 2016; Agüera-Vega et al., 2017), and 3D point clouds (Siebert and Teizer, 2014; Trhan et al., 2016). The methods to generate these products from UAS imagery have improved significantly over the last decade with the introduction of photogrammetric software for use with UAS data (Colomina and Molina, 2014; Yao et al., 2019). Built on the concepts of Structure from Motion (SfM) and multi-view stereo (MVS) photogrammetry, these software packages can process UAS data to derive 2D/3D geospatial outputs. Examples of these software packages include Pix4DMapper (www.pix4d.com), DroneDeploy (www.dronedeploy.com), Agisoft Metashape (www.agisoft.com), ESRI Drone2Map (www.esri.com/en-us/arcgis/products/drone2map), and 3DF Zephyr (www.3dflow.net/3df-zephyr-pro-3d-models-from-photos/). In addition to the commercial software packages, there are open-source options providing similar photogrammetric capabilities, such as OpenDroneMap (www.opendronemap.org) and MICMAC (www.micmac.ensg.eu), although they are not as user-friendly as the commercial software packages. Because most of these packages have become viable for generating robust geospatial outputs from UAS images only in recent years, there is not yet a firm understanding of the impact of each package's specific algorithms on model outputs (Lavecchia et al., 2017). In addition, there has been a notable lack of cross-comparisons of the output quality generated by different photogrammetric software packages. There have been some efforts targeting this issue by comparing specific packages (Alidoost and Arefi, 2017; Barbasiewicz et al., 2018; Gagliolo et al., 2018; Brach et al., 2019; Forsmoo et al., 2019), but there is still a need to continue investigating the performance of these different software platforms.
Although a variety of UAS-oriented photogrammetric software packages are available, they generate several common outputs, such as 3D point clouds, orthophoto maps, DEMs (surface and/or terrain), and 3D textured models. Fortunately, photogrammetric software packages also follow relatively common conceptual workflows, although their specific terms and steps may vary. By utilizing SfM and MVS algorithms, photogrammetric software packages can automatically extract 3D information from the UAS images by finding common feature points between overlapping images (Lowe, 2004). According to Remondino et al. (2011), a typical UAS photogrammetry workflow includes the following: mission planning, ground control measurements (if needed), image acquisition, camera calibration, image orientation, and image processing to generate 3D data.
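As an illustration of the feature-matching (tie-point) step that underlies SfM, the following minimal Python sketch uses OpenCV's SIFT detector (the method described by Lowe, 2004) on two overlapping UAS frames. The image file names are placeholders, and commercial and open-source photogrammetric packages implement far more elaborate, optimized versions of this step across entire image blocks before estimating camera poses and dense 3D geometry.

```python
import cv2

# Minimal sketch of the tie-point (feature matching) step that SfM pipelines
# automate across an entire UAS image block. File names are placeholders.
img1 = cv2.imread("uas_frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uas_frame_002.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)   # keypoints + descriptors, image 1
kp2, des2 = sift.detectAndCompute(img2, None)   # keypoints + descriptors, image 2

# Match descriptors and keep only matches that pass Lowe's ratio test,
# which rejects ambiguous correspondences between the overlapping images.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} candidate tie points between the two overlapping images")
```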
3.4 UAS FOR URBAN APPLICATIONS
The unique ability of UAS to collect data at user-defined spectral, spatial, and temporal resolutions offers new possibilities for urban applications that require rapid mapping, assessment, or management. Noor et al. (2018) summarized several types of information that can be provided by UAS for urban studies, including urban location and extent, land cover and land use information, transportation networks and infrastructure, census-related statistics, and land change information. Although the use of UAS in urban areas is currently limited by regulations and safety concerns, numerous attempts have been made to explore the potential of UAS to support urban studies. This section discusses four common types of urban applications using UAS: disaster response, infrastructure inspection, hyper-local ecological information collection, and the construction of smart cities.
3.4.1 DISASTER RELIEF EFFORTS
Remotely sensed imagery provides invaluable information for rescue and relief efforts, damage assessment, and the planning of remedial measures (Hussain et al., 2011). With UAS, disaster zones can be observed and assessed promptly. Aljehani and Inoue (2019) designed a system that incorporates UAS scanning and UAS tracking to generate a safe map with potential routes for refugee evacuation. They successfully utilized UAS to detect and track pedestrians with a Haar-cascade and Kalman filter classification method, which can serve as a substitute for participatory tracking (e.g. mobile tracking) when pedestrians are not carrying mobile devices. In practical terms, they proposed using mobile networks for communication when the standard communication channels are completely lost. They also stressed the importance of human operators in the evaluation process and mission management.
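As a rough illustration of the general detect-and-track idea mentioned above, the following Python sketch combines an OpenCV Haar-cascade pedestrian detector with a constant-velocity Kalman filter that predicts the target position between detections. The video path, cascade choice, and tuning values are hypothetical assumptions for this sketch and are not taken from Aljehani and Inoue (2019), whose operational system is considerably more sophisticated.

```python
import cv2
import numpy as np

# Hypothetical sketch: a Haar-cascade detector finds a pedestrian in each frame,
# and a constant-velocity Kalman filter smooths and predicts the track between
# detections. The full-body cascade ships with OpenCV; paths may differ by install.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_fullbody.xml")

kf = cv2.KalmanFilter(4, 2)                       # state: [x, y, vx, vy]; measurement: [x, y]
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

cap = cv2.VideoCapture("uas_footage.mp4")         # placeholder UAS video file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    prediction = kf.predict()                     # predicted pedestrian position for this frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
    if len(detections) > 0:
        x, y, w, h = detections[0]                # track only the first detection (sketch)
        centre = np.array([[x + w / 2], [y + h / 2]], np.float32)
        kf.correct(centre)                        # update the filter with the measurement
    print("predicted pedestrian position:", prediction[:2].ravel())
cap.release()
```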