Display and analysis of geospatial information for UAS, autonomous BVLOS flights and camera paths
Having an up-to-date operational picture is essential to efficiently and safely plan and manage UAS missions in real time. This can be achieved by combining different types of geospatial information, sensor data and dynamic events. The same data can furthermore feed a number of run-time analyses, such as line-of-sight computation, projected video and generation of UAV routes.
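As an illustration of one such run-time analysis, the following is a minimal sketch of a line-of-sight test against a digital surface model (DSM) of the kind a high-detail city model provides. The grid representation, function name and sampling scheme are illustrative assumptions, not part of any specific system.

```python
def line_of_sight(dsm, p0, p1, samples=100):
    """Return True if the straight segment p0 -> p1 clears the surface model.

    dsm      -- dict mapping (row, col) grid cells to surface height in metres
    p0, p1   -- (row, col, altitude) endpoints of the sight line
    samples  -- number of points sampled along the segment
    """
    r0, c0, h0 = p0
    r1, c1, h1 = p1
    for i in range(samples + 1):
        t = i / samples
        # Interpolate position and altitude along the segment.
        r = r0 + t * (r1 - r0)
        c = c0 + t * (c1 - c0)
        h = h0 + t * (h1 - h0)
        cell = (round(r), round(c))
        # A surface at or above the sight line blocks visibility.
        if dsm.get(cell, 0.0) >= h:
            return False
    return True
```

A production system would instead ray-cast against the full 3D mesh, but the principle of sampling the sight line against surface heights is the same.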
Using a high-detail 3D model of a city environment, a system for urban 3D flight planning can operate multiple UAVs BVLOS, integrated with a UTM system, to fly safely in high-intensity airspace in close proximity to the surface and buildings while remaining deconflicted from other UAVs and flight corridors. UAS routes need to be autonomously calculated based on mission parameters and inputs from the UAS platform and the UTM, such as safe flight zones, weather data, sensor coverage, restricted airspaces and deconfliction with other UAS flight corridors.
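The core of such autonomous route calculation is constraint-aware path search. The sketch below uses A* over a 2D grid where cells inside restricted airspace or other UAS corridors are blocked; the grid abstraction, 4-connectivity and unit step cost are illustrative assumptions, and a real planner would search in 3D with richer cost terms (wind, energy, corridor clearances).

```python
import heapq

def plan_route(start, goal, blocked, size):
    """A* shortest path on a size x size grid, 4-connected, avoiding blocked cells."""
    def h(cell):
        # Manhattan distance: admissible heuristic on a 4-connected grid.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < size and 0 <= nc < size and nxt not in blocked:
                heapq.heappush(frontier, (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no deconflicted route exists under the given constraints
```

Dynamic inputs from the UTM (a newly restricted zone, another UAS corridor) simply add cells to `blocked` and trigger a replan.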
Since the system should be mission-centered, the concept of a camera path is a central element. Traditionally the user starts by defining waypoints that build up a flight path. Here we instead generate a camera path based on the desired mission objectives and the field of view of the camera. The user should think primarily from the perspective of what the camera “can see” (field of view) and only secondarily about where the drone will be physically located (flight path). From this, an optimal route together with camera movements and angles is calculated.
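The camera-first idea can be sketched as simple geometry: given a ground target, the camera's horizontal field of view and the desired footprint width, derive where the drone must hover and how the gimbal must pitch. The function name, parameters and flat-ground assumption are all illustrative, not the actual system's API.

```python
import math

def drone_position(target, fov_deg, footprint_width, elevation_deg, bearing_deg=0.0):
    """Return ((x, y, z), gimbal_pitch) realising the requested camera view.

    target          -- (x, y) ground point the camera should centre on
    fov_deg         -- horizontal field of view of the camera, in degrees
    footprint_width -- desired width of the imaged area at the target, in metres
    elevation_deg   -- look-up angle from the target to the drone
    bearing_deg     -- direction from the target to the drone in the ground plane
    """
    # Slant range at which the FOV exactly spans the requested footprint.
    slant = (footprint_width / 2) / math.tan(math.radians(fov_deg) / 2)
    elev = math.radians(elevation_deg)
    brg = math.radians(bearing_deg)
    horiz = slant * math.cos(elev)          # horizontal standoff distance
    x = target[0] + horiz * math.cos(brg)
    y = target[1] + horiz * math.sin(brg)
    z = slant * math.sin(elev)              # hover altitude above the target
    gimbal_pitch = -elevation_deg           # camera pitches down toward the target
    return (x, y, z), gimbal_pitch
```

Sampling such view poses along the mission objective and then routing between them (e.g. with the planner above) yields the flight path as a derived product of the camera path, rather than the other way around.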