# Research on Smart Construction Education Training Contents Using a Drone Simulator

Jinhee Won1, Giho Hong1,*
1Research and Development Center, D2Innovation, Seongnam, Korea, gniwon@d2innovation.co.kr, zeno@d2innovation.co.kr
*Corresponding Author: Giho Hong, +82-10-3038-2625, zeno@d2innovation.co.kr

© Copyright 2022 Korea Multimedia Society. This is an Open-Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Received: Nov 30, 2022; Revised: Dec 19, 2022; Accepted: Dec 21, 2022

Published Online: Dec 31, 2022

## Abstract

Recently, the application fields of drones have been expanding alongside the growth of drone technology amid the society-wide structural change known as the Fourth Industrial Revolution. The construction field already uses drone video shooting in construction design, and efforts to introduce drones, one of the smart construction technologies, are increasing in facility maintenance and management as well. Training the drone experts required in the smart construction field takes considerable time and cost, so studies on training contents for smart construction education are needed to impart professional skills and expand education opportunities using a drone simulator. Therefore, we present real-world images based on spatial information, apply the flight dynamics model of a quad-rotor drone, and implement a smart construction drone simulator for training drone pilots to inspect facility safety.

Keywords: High Precision 3D Spatial Information; Drone Simulator; Smart Construction

## I. INTRODUCTION

The utility of the Unmanned Aerial Vehicle (UAV), commonly called a drone, was first proven in the military applications for which it was developed. Recently, with the growth of hobby drones, its use has been expanding into a variety of fields such as video shooting, the construction industry, and agriculture. The application of drones is increasing throughout industry along with the intellectualization, digitization, and diverse technology fusions of Industry 4.0, led by developments in AI, robotics, and bioscience [1-2].

The introduction of smart construction, which fuses traditional construction skills with high technologies such as BIM, IoT, big data, and robotics, has been progressing actively. In particular, video shooting by drones is used for surveying and construction site management. Despite the increasing demand for drones, their use has been slow to spread on construction sites because of the time and high cost of labor-intensive management, an aging workforce, risk-response burdens, and a shortage of drone experts and utilization capability. Research on training contents using drone simulators is therefore necessary to prevent the safety concerns raised by drone flights and to reduce time and cost, with training scenarios developed for each construction step (design, construction, and maintenance) to train drone experts.

This research presents a drone simulator that applies flight dynamics characteristics to the drone's operational environment and a method that renders real-world images by modeling real terrain from high-precision 3D spatial information. We developed smart construction drone training scenarios using this drone simulator, and then verified the efficacy of drone simulator education through practical demonstrations at a smart construction education center.

## II. RELATED WORKS

There has been a study on terrain construction for an unmanned aerial vehicle simulator based on spatial information [3]. It presented a method that models real terrain using high-precision 3D spatial information such as aerial images, DEMs, vector data, and 3D building modeling data published by public institutions. The terrain surface was classified according to defined standards, regions with the same quality characteristics were color-indexed, and then a layer was extracted for each terrain characteristic and converted to a mask image using a land cover map provided in vector form.

From military drones to hobby drones, interest in drones has increased with the development of drone technology in both public and private fields, and interest in drone simulators for drone education and training has grown with it. Training with drone simulators has the advantages of preventing drone safety accidents caused by inexperienced pilots and reducing human and property damage. There are commercial drone simulators such as Flight Simulator [4], which evolved from traditional flight simulators to cover UAVs, and DJI FLIGHT SIMULATOR [5], which reproduces the flight experience of the manufacturer's drones. These simulators provide a wide range of flight environments and drone airframes; however, rather than reflecting real topography, they only allow pilots to acquire drone control skills in a virtual space. Studies have progressed from drone national certificate test simulators [6] for basic control training to special-purpose piloting contents for certification tests, spurred by drone prevalence and the introduction of the drone pilot qualification system, with some adding game factors [7]. Furthermore, due to increased drone utilization, special-purpose simulators have been developed to help users understand and master work skills and to analyze work efficacy in each field. For example, the nozzle flow characteristics and simulation of a pesticide-spraying drone [8] have been studied for agriculture, and there is a study on the practicality of drones in disaster areas [9]. Drone simulators can be used across the whole of industry.

Smart construction is a technology for productivity innovation based on information sharing and connectivity in all steps of construction, such as design, construction, and maintenance, using Industry 4.0 technologies such as robots, drones, and big data. The smart construction industry has been advancing and running demonstrations based on the Ministry of Land, Infrastructure and Transport's smart construction technology roadmap [10] as a government policy project. Drone applications and simulator introduction are progressing at construction sites and on roads in both private enterprises and public offices [11-12].

## III. SMART CONSTRUCTION DRONE SIMULATOR

This study delineates the process of making real-world images from high-resolution image information and the techniques that implement the drone simulator using flight dynamics characteristics.

3.1. Construction of Topographic Images Based on High-Precision 3D Spatial Information

Currently, high-resolution orthophotos, DEMs, and high-precision 3D building models are published as public data thanks to the development of close-range aerial photography and survey equipment. Modeling real terrain from these sources is an efficient way to improve realism in drone simulators. Here, aerial photographs, DEMs, land cover maps, and 3D models are used for rendering real-world images (Table 1).

Table 1. High-precision 3D spatial information for topographic image construction.
| Item | Aerial photograph | DEM | Land cover map | 3D model |
| --- | --- | --- | --- | --- |
| Resolution | 0.25 cm | 5 m | 1:5,000 | N/A |
| Format | GeoTIFF | GeoTIFF | SHP | FBX |
| Coordinate system | UTM-K | UTM-K | Korea 2000 / central belt 2010 | Korea 2000 / central belt 2010 |

The original purposes of these spatial information sources differ, and they are composed of different coordinate systems, resolutions, and file formats, so we implemented a tool that can convert them over the topographic scope we need to build (Fig. 1). The tool creates a map to which real-world-image processing is applied through coordinate transformation, resolution transformation, region extraction, and mask image creation.

Fig. 1. Implementation of a spatial information conversion tool.

The main functions of the spatial information conversion tool are as follows.

Acquired spatial information, such as aerial photographs, DEMs, and the Korea digital map, is imported and then converted to UTM coordinates for use in the real-world images. The original DEM pixel resolution is 5 m, but a resolution of 1 m or finer is needed when forming a height map for the real-world images, so an interpolation algorithm is applied when the resolution changes. The boundaries of the terrain are extracted from the aerial photographs, DEM, and land cover maps for region extraction. A mask image is created from the rivers/streams, bushes, mountains, etc. classified and extracted from the land cover maps to reflect real-video effects such as flowing water and grass and trees swaying in the wind (Fig. 2).

Fig. 2. Realistic video production using the spatial information conversion tool.
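The resolution transformation and mask creation steps above can be sketched as follows. This is a minimal illustration, not the authors' tool: it upsamples a DEM height map with bilinear interpolation (e.g. 5 m to 1 m pixels) and builds a binary mask image for one land-cover class. The function names and the use of raw NumPy arrays are assumptions for illustration.

```python
import numpy as np

def resample_dem(dem: np.ndarray, scale: int) -> np.ndarray:
    """Upsample a DEM height map by `scale` using bilinear interpolation
    (e.g. 5 m -> 1 m pixels with scale=5)."""
    h, w = dem.shape
    # Target pixel centers mapped back into source coordinates.
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = (ys - y0)[:, None]          # vertical interpolation weights
    wx = (xs - x0)[None, :]          # horizontal interpolation weights
    top = dem[np.ix_(y0, x0)] * (1 - wx) + dem[np.ix_(y0, x1)] * wx
    bot = dem[np.ix_(y1, x0)] * (1 - wx) + dem[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def class_mask(cover: np.ndarray, class_id: int) -> np.ndarray:
    """Binary mask image for one land-cover class (e.g. rivers or bushes),
    used to place water/vegetation effects on the terrain."""
    return (cover == class_id).astype(np.uint8) * 255
```

A production pipeline would typically delegate these steps to a GIS library, but the sketch shows the underlying operations the conversion tool performs.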

The 3D model meta-file consists of a model layout information file and 3D model files. The model layout information file is in JSON format and consists of attribute information such as the center coordinates, building classification, and building names; the 3D models are FBX files. To arrange the 3D models on the simulator topography, the center coordinates in the model layout information file are converted into a coordinate system suitable for the simulator, and each model file name and its layout information are merged into a single meta file. The 3D model files are then arranged on the terrain from the generated meta-file information using the model arrangement tool (Fig. 3).

Fig. 3. Create meta files and place 3D models.
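The meta-file generation step can be sketched as below. The field names (`center`, `class`, `name`) and the identity-like coordinate transform are hypothetical, since the paper does not publish the actual schema; the sketch only shows the merge of converted coordinates and model file names into one JSON meta file.

```python
import json

def build_meta(layout: list[dict], to_sim_xy) -> str:
    """Merge model-layout records and converted center coordinates into a
    single meta file (JSON).  Field names here are illustrative."""
    entries = []
    for rec in layout:
        x, y = to_sim_xy(*rec["center"])       # e.g. lon/lat -> simulator coordinates
        entries.append({
            "file": rec["name"] + ".fbx",       # matching FBX model file
            "class": rec["class"],
            "position": [x, y],
        })
    return json.dumps({"models": entries}, indent=2)

# Usage with a placeholder coordinate conversion:
meta = build_meta(
    [{"center": [127.0, 37.5], "class": "building", "name": "tower_a"}],
    lambda lon, lat: (lon * 1000, lat * 1000),  # placeholder, not a real projection
)
```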
3.2. Modeling of Flight Dynamic Characteristics

This simulator applies a scalable structure, as in Fig. 4, for realism [13-14]. A vehicle class is defined as having a number of actuators that generate force and torque. This class includes parameters such as mass, inertia, and linear and angular drag coefficients, which the physics engine uses to calculate drone kinetics. Drone motion states are expressed by the physics engine class as six quantities: position, orientation, linear velocity, linear acceleration, angular velocity, and angular acceleration. The physics engine's goal is to take the force and torque of each motor and calculate the resulting kinematic state.

Fig. 4. Flight dynamic characteristic modeling.

The drone's frame model holds parameters such as the number of propellers, frame size, and weight, and transmits them to the physics engine. Drone motion is expressed by calculating the force and torque of each propeller. In the quad-rotor drone airframe example at the bottom left of Fig. 5, the four blue dots mark the points where the force F and the resulting torque τ are generated. The control input ui controls the rotational speed of the propeller located at each of the four dots. The force and torque produced by the propellers are calculated by equation (1).

Fig. 5. Quad-rotor drone airframe modeling and dynamics modeling.
${F}_{i}={C}_{T}\,\rho\,{\omega}_{max}^{2}\,{D}^{4}\,{u}_{i},\qquad {\tau}_{i}=\frac{1}{2\pi }{C}_{pow}\,\rho\,{\omega}_{max}^{2}\,{D}^{5}\,{u}_{i}.$
(1)

Here, CT and Cpow are the thrust and power coefficients, respectively, based on the physical properties of the propeller; ρ is the air density, D is the diameter of the propeller, and ωmax is the maximum angular velocity in revolutions per minute. Vertical take-off and landing (VTOL) simulations are possible by allowing these vertices to move during flight.
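As a minimal sketch of this propeller model, following the AirSim-style relations the text describes (thrust proportional to CT ρ ω² D⁴, torque to Cpow ρ ω² D⁵), with coefficient values that are illustrative rather than the authors' calibration:

```python
import math

RHO = 1.225  # air density at sea level (kg/m^3)

def rotor_forces(u, c_t, c_pow, diameter, omega_max):
    """Per-rotor thrust F and torque tau for a normalized control input
    u in [0, 1].  A sketch of the propeller model, not the authors' exact
    implementation; omega_max is taken in revolutions per second here."""
    omega = u * omega_max                                  # commanded rotation speed
    thrust = c_t * RHO * omega**2 * diameter**4            # thrust along rotor axis
    torque = c_pow * RHO * omega**2 * diameter**5 / (2 * math.pi)  # reaction torque
    return thrust, torque
```

Summing the four per-rotor contributions with their lever arms gives the net force and torque the physics engine integrates each step.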

3.3. Weather Environment Modeling

Unlike weather effects that are merely visualized in the real-world images, we simulate flight being affected by environmental factors such as wind and gravity through physics modeling [14]. We apply the binomial theorem to Newton's law of gravity and approximate the gravitational acceleration g at height h, where Re is the radius of the Earth and g0 is the gravitational acceleration measured at the surface.

$g={g}_{0}\cdot \frac{{R}_{e}^{2}}{{\left({R}_{e}+h\right)}^{2}}\approx {g}_{0}\cdot \left(1-2\frac{h}{{R}_{e}}\right).$
(2)
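Equation (2) can be checked numerically; at drone altitudes the first-order approximation is essentially indistinguishable from the exact inverse-square law. A minimal sketch:

```python
R_E = 6.371e6    # mean Earth radius (m)
G0 = 9.80665     # standard surface gravity (m/s^2)

def g_exact(h):
    """Inverse-square gravity at height h above the surface."""
    return G0 * R_E**2 / (R_E + h) ** 2

def g_approx(h):
    """First-order binomial approximation, as in equation (2)."""
    return G0 * (1 - 2 * h / R_E)
```

At the 140 m photography altitude used later in this paper, the two expressions differ by far less than sensor noise, which is why the linear form suffices for the simulator.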

We simulate more realistic drone flight characteristics by applying the physical properties of wind to the drone motion states, because drones are strongly influenced by wind while flying in reality [15].

${p}_{d}=\frac{1}{2}\rho {v}^{2},\qquad F={p}_{d}\cdot A.$
(3)

In equation (3), A is the surface area (m²), pd is the dynamic pressure (Pa), ρ is the air density (kg/m³), and v is the wind speed (m/s). When moving air is stopped by a surface, the kinetic energy of the wind is converted into pressure, and the pressure acting on the surface becomes a force.
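Equation (3) translates directly into code; the sketch below computes the wind force on an exposed surface, assuming sea-level air density as a default.

```python
def wind_force(v, area, rho=1.225):
    """Force (N) on a surface from wind stopped at it: dynamic pressure
    p_d = 1/2 * rho * v^2 times the exposed area A, per equation (3)."""
    p_d = 0.5 * rho * v**2   # dynamic pressure (Pa)
    return p_d * area        # force (N)
```

In the simulator this force, resolved along the wind direction, is added to the drone's motion state each physics step.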

Fig. 6. Simulation of the physical characteristics of wind.
Fig. 7. Camera control and overlay function.
Fig. 8. Camera (EO/IR) image simulation.
Fig. 9. Camera image output function.
3.4. Sensor Simulation

Drones are commonly equipped with sensors for flight navigation, such as barometers, GPS, and gyros, as well as distance sensors and vision sensors for autonomous flight, and the characteristics of these sensors differ according to price and function. The simulator reproduces each sensor's location, position, and characteristics (Table 2).

Table 2. Sensor simulation.
| Sensor | Simulation example |
| --- | --- |
| GPS | Vehicle location |
| IMU | Vehicle posture |
| Barometer | Altitude |
| Magnetometer | Azimuth |
| LiDAR | Distance |
| Camera | Image |

The smart construction drone simulator uses a camera sensor for routine photography and requires gimbal control and zoom magnification control of the camera sensor during drone flight. The simulator therefore provides controls such as camera viewing angle and zoom magnification, as well as simulation of camera overlays and electro-optical (EO) / infrared (IR) images.

A specialized smart construction simulator can be used for topographic 3D models at the construction design step, for 3D condition surveys and ready-made supply estimates at the construction step, and for inspection diagnosis and 3D CAD at the maintenance step. To support this, the images output by the sensors must be formatted as files. This simulator has a function that outputs image files according to camera image storage options, such as changing the image size and shooting cycle.

## IV. TRAINING CONTENTS USING SMART CONSTRUCTION DRONE SIMULATOR

4.1. Drone Safety Inspection Training Contents

Drone utilization keeps increasing in the construction field as smart technology is applied. The construction design stage needs dronography education and training to make 3D topographic models and foundations. The construction step needs education and training to capture the working situation, obstructions, and 3D digital mapping data for automation plans such as work methods and coordination between construction equipment. The maintenance step needs close-flight and photography education and training to plan dronography and analyze videos of construction facilities. This research implemented drone safety inspection training contents through the simulator described in Chapter 3, with facility safety inspection training scenarios using drones for training smart construction experts at the Korea Expressway Corporation (EX).

EX checks the safety of expressway facilities such as bridges and slopes using dronography. Bridges need close flights to check for damage to bridge expansion joints, bridge-deck surfacing, barriers, drainage facilities, etc. within 8 m, and to the piers, abutments, girders, and bridge bearing parts within 3 m. Slopes, including shotcrete, banquette drains, anchor reinforcement facilities, reinforced earth retaining walls, and stepped reinforced earth retaining walls, need close flights within 3 m to check for damage such as cracks, frost heave, joint defects, and faults. Close dronography education and training must produce effective images without safety accidents or damage to the drone and its gear. We modeled the Geumgok Bridge (GB) training center for drone safety inspection training. GB is the site of the Smart Construction Challenge 2020 Smart Maintenance Competition on real-time surveying and drone-based defect detection held by EX [16], and it is located at Yongsan-myeon, Yeongdong-gun, Chungcheongbuk-do on the Gyeongbu Expressway.

Fig. 10. Geumgok Bridge, a drone safety inspection training center located on Gyeongbu Expressway.
Fig. 11. Construction of terrain model based on spatial information.

We modeled GB from acquired 3D spatial information such as aerial images, a DEM, and a land cover map, and implemented the terrain within a 2 km radius of GB as the background using the spatial information conversion tool. We made a detailed 3D model of GB from high-resolution dronography images. A 3D terrain mesh was created for the training area (4 km × 4 km) through resolution transformation and region extraction. The area was mapped into topographic textures by quartering the aerial photograph, which has a spatial resolution of 50 cm/pixel. Mask images were extracted by classifying mountains, rivers, and fields/rice paddies into layers from the vector data (land cover maps). We implemented the background using tree, bush, and water 3D models and effects in the masked areas to provide realism.

We rendered the detailed GB model from dronography for realistic drone safety inspection training. It was shot with an average ground sampling distance (AGSD) of 3 cm, in WGS 84 coordinates, from a set altitude of 140 m (Fig. 12).

Fig. 12. Detailed 3D modeling of the GB by dronography.

There are two types of drone safety inspection training: one is close flight for each bridge part, such as the pier, coping, guardrail, and floorboard; the other is contents for acquiring dronography and mastering drone control skills. The flight patterns for facility safety inspection are the left-right grid, up-down grid, double grid, and circular patterns, with on-screen guidelines for each bridge part.

Table 3. Flight method and photographing rules for each bridge member.
| Part | Start position | End position | Flight mode | Shooting interval |
| --- | --- | --- | --- | --- |
| Pier | Ground | Top of the bridge | Circular | 2 to 3 seconds apart at a flight speed of 3 m/s (shooting rule: 80% overlap between pictures) |
| Coping | Front left | Right side of the back | Double grid (up-down, left-right) | |
| Guardrail | Left guardrail | Right guardrail | Grid (left-right) | |
| Floorboard | Upper left | Lower right | Grid (up-down) | |

To make 3D models by dronography, certain rules must be followed for facility safety analysis: keeping the proper distance from the facility, flying close, and shooting with 80% overlap. We indicated the distance to the facility, the flight speed, and the shooting conditions on the screen so trainees can check the close-flight and shooting rules. The flight speed must be kept steady through fine control of the controller to maintain the 80% overlapped shooting. The controller was professionally tuned to make the pilot feel like they are flying an actual drone, and this tuning was applied in the drone simulator's flight dynamics model.
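The link between flight speed and overlap can be sketched with a simple camera footprint model. This is an illustration, not the authors' calibration: assuming the along-track footprint is 2·d·tan(fov/2) at distance d, the interval between shots that preserves a given overlap follows directly.

```python
import math

def shot_interval(distance_m, fov_deg, overlap, speed_mps):
    """Seconds between shots that keep `overlap` (e.g. 0.8 for 80%)
    between consecutive images.  Assumes a pinhole-style footprint of
    2*d*tan(fov/2) along the flight direction; parameters illustrative."""
    footprint = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    advance = (1 - overlap) * footprint   # ground distance between exposures
    return advance / speed_mps
```

In practice the interval also depends on the camera's actual sensor geometry and gimbal angle, which is why the simulator displays distance, speed, and shooting conditions on screen rather than fixing a single value.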

4.2. Demonstration of the Training Content Education Use

To evaluate the educational use of the drone simulator, we conducted a survey demonstrating the training contents with EX trainees and trainers at the smart construction education center. The survey covered drone control experience, education satisfaction, and improvement requests. Of the trainees who participated in the facility safety inspection education at the smart construction maintenance step, 68% had flown drones in the real world, and 62% of those had more than 6 months of experience. In addition, 37% of the trainees had used drone simulators to master drone control skills.

The survey included five types of questions covering the quality of the contents, such as the simulator imagery and the piloting feel, and education satisfaction. The results are as follows:

78% of respondents were satisfied with the description level of the terrain and facilities in the simulator, and 84% answered that it helps improve work understanding. On acquiring drone skills, 73% replied that it is useful. Even inexperienced respondents showed 84% satisfaction with its efficacy for beginners. There were individual differences in the sense of realism of the drone controller: 35% answered 'neutral' and 12% answered 'totally different'.

Fig. 13. The survey on drone pilot experience.

## V. CONCLUSION

Existing commercial drone simulators were intended for acquiring control skills, but smart construction drone simulators now need to be developed for cultivating smart construction drone experts as well as teaching drone control skills for the construction field. We presented a simulator that builds real-world images based on high-precision spatial information and applies quad-rotor drone dynamics, camera sensors, gravity, wind, and so on for the realism of construction sites, and we implemented training contents to cultivate drone experts in facility safety inspection. In these contents, the drone shoots close to the facilities, so fine-control education is needed for inspecting GPS-shadowed areas such as the underside of bridges. For that, training scenarios were constructed to build close-flight and dronography skills on GB parts such as the pier, guardrail, coping, and floorboard.

This research presents a drone simulator based on high-precision spatial information. It can describe drone application fields such as firefighting, rescue, disaster response, searches for missing people, and agriculture more realistically. Thus, this research proposes developing virtual environments focused on training scenario development for each field.

Drone simulators provide weather effects such as rain and snow as visual factors, but these effects do not influence the drone flight dynamics model. Because drone flight is affected by wind, this research simulated drone flight characteristics with the speed and direction of the wind added. However, it remains difficult to predict and model wind effects on drones flying between buildings and piers.

Henceforth, a study will be needed to simulate wind effects at specific sites. For example, the wind between piers and under the floorboard can differ from the global wind, and the wind effect can also be changed by cars on the bridge. If a bridge site is subdivided and wind effects are simulated for each situation, training opportunities can be provided to cope with the risk of drone accidents in advance.

Table 4. The education satisfaction survey results.
| Question | Very unsatisfied | Unsatisfied | Neutral | Satisfied | Very satisfied | Total |
| --- | --- | --- | --- | --- | --- | --- |
| Satisfaction with terrain and facility description level | 0 (0%) | 0 (0%) | 4 (21.1%) | 10 (52.6%) | 5 (26.3%) | 19 (100%) |
| Contribution to improving work understanding | 0 (0%) | 0 (0%) | 3 (15.8%) | 12 (63.1%) | 4 (21.1%) | 19 (100%) |
| Usability of acquiring drone skills | 0 (0%) | 0 (0%) | 5 (26.3%) | 10 (52.6%) | 4 (21.1%) | 19 (100%) |
| Efficacy for beginner education | 0 (0%) | 0 (0%) | 3 (15.8%) | 9 (47.3%) | 7 (36.8%) | 19 (100%) |
| Sense of realism of the drone controller | 0 (0%) | 2 (11.8%) | 6 (35.3%) | 5 (29.4%) | 4 (23.5%) | 17* (100%) |
| Total | 0% | 2.2% | 22.6% | 49.5% | 25.8% | 100% |

* Two nonrespondents out of 19 respondents were excluded.

We collected improvement requests for the training contents from the EX trainers and trainees in the survey (Table 5). The requests include functional improvements such as the image quality and the drone controller sensitivity, ease of use, a greater variety of drones, flight guideline marks, and, additionally, more slopes and bridges.

Table 5. Improvement requests.
| Improvement request | Cases | Remarks (cases) |
| --- | --- | --- |
| Functional improvement | 6 | Simulation image quality (2); controller sensitivity differs from the actual one (4) |
| Ease of use | 3 | Instructions, Korean terms, etc. (3) |
| Additional functions | 11 | Training for diverse drones (2); hovering, ATTI mode, 3rd-person mode, etc. (4); vibration and wind effects (1); collision detection (1); show past paths in different colors (1); show drone flight subline (1); controller directional guide output for beginners (1) |
| Additional contents | 4 | More realism by adding cracks on GB (1); checking slopes of green areas (1); floorboard control practice when GPS is poor (1); drone control tutorial (1) |
| Others | 3 | Map too wide (2); very satisfied (1) |
| Total | 27 | |

Henceforward, we are planning to make training contents for dronography flight techniques, covering the creation of 3D topography-based models at the construction design step and construction site monitoring at the construction step of smart construction. As 3D modeling by drone shooting progresses, further research and development will study providing a simulator environment connected to GCS software for dronography flights with automatic flight path planning, and efficient rendering of real terrain using high-resolution, high-capacity topographic 3D model data.

## ACKNOWLEDGMENT

This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 22DRMS-C147026-05).

## REFERENCES

[1].

Y. Lee, “Development of unmanned aircraft in the fourth Industrial revolution,” Convergence Security Journal, vol. 18, no. 5, pp. 3-10, 2018.

[2].

J. Chong, S. Lee, S. Shin, S. Hwang, Y. Lee, and J. Kim, et al., “Research on the meteorological technology development using drones in the fourth industrial revolution,” Journal the Korea Contents Association, vol. 19, no. 11, pp. 12-21, 2019.

[3].

S. Park, G. Hong, J. Won, and Y. Heo, “A study on terrain construction of unmanned aerial vehicle simulator based on spatial information,” Journal of Korea Multimedia Society, vol. 22, no. 9, pp. 1122-1131, Sep. 2019.

[4].

Flight Simulator, 2022. https://www.flightsimulator.com/.

[5].

DJI Flight Simulator, 2022. https://www.dji.com/kr/simulator.

[6].

[7].

M. Park, C. Yun, and T. Woo, “Developing a virtual drone flight training contents for beginner,” Journal of Korea Game Society, vol. 19, no. 5, pp. 53-62, 2019.

[8].

K. Kang, S. Chang, I. Ra, S. Kim, and H. Kim, “Nozzle flow characteristics and simulation of pesticide spraying drone,” Smart Media Journal, vol. 8, no. 4, pp. 38-45, 2019.

[9].

Y. Yang and D. Kim, “Study on practicality of the drone at disaster area,” in Korea Disaster Information Society Regular Academic Conference and Special Seminar, 2018, pp. 69-70.

[10].

MOLIT, 2018. Infrastructure and Transport's Smart Construction Technology Roadmap, http://www.molit.go.kr/USR/NEWS/m_71/dtl.jsp?lcmspage=2&id=95081506.

[11].

M. Song, “A study on the utilization of drone in construction site for smart highway implementation,” in Korean Society of Telecommunications Conference, 2021, pp. 1174-1174.

[12].

K. Choi and K. Chong, “Development of prototype simulator for road and traffic information monitoring drone operation,” Journal of the Korea Academia-Industrial Cooperation Society, vol. 23, no. 5, pp. 206-216, 2022.

[13].

Unreal Engine, 2022. https://www.unrealengine.com/ko.

[14].

S. Shah, D. Dey, and A. Kapoor, “AirSim: High-fidelity visual and physical simulation for autonomous vehicles,” in Field and Service Robotics. Cham: Springer, pp. 621-635, 2018.

[15].

## AUTHORS

Jin-Hee Won

received his BS in the Department of Software Engineering from SangMyung University, Korea, in 2004. In 2015, he joined the Technology Research Institute of D2 Innovation Co., Ltd.

His research interests include Geographic Information System, Real-time rendering and Drone simulator.

Gi-Ho Hong

received his BS and MS degrees in the Department of Computer Engineering from KangWon University, Korea, in 2000 and 2002, respectively. In 2009, he joined the Technology Research Institute of D2 Innovation Co., Ltd.

His research interests include Geographic Information System, Real-time rendering and Drone simulator.