

Objective
Businesses must develop and validate an automatic Ground Control Point (GCP) methodology for orthomosaic generation, while advancing object identification and mapping (KAZE algorithms [2]) for multiscale feature detection.
Description
Small Unmanned Aerial Systems (UAS) platforms have redefined squad-level intelligence, surveillance, and reconnaissance (ISR) collection. They provide an overmatch capability, aiding Soldier lethality and maneuverability for both dismounted and mounted off-road mobility platforms. Beyond the challenges of operating small UAS in Arctic environmental conditions, post-processing imagery into orthomosaics and digital surface models (DSMs, photogrammetric surface models derived from Structure from Motion) is complex, because identifiable ground features are scarce and the highly reflective snow surface offers little contrast.
Traditional photogrammetry methods rely on GCPs and distinct terrain features to align and process imagery. Arctic environments characterized by flat, snow-covered terrain, however, lack these critical features, degrading the accuracy of traditional methods and often making their solutions incalculable.
Recent computer vision and remote sensing advancements in object identification and mapping (KAZE algorithms [2]), which perform multiscale feature detection in nonlinear scale space, will enhance photogrammetric accuracy and provide a basis for feature-matching algorithms that localize in the absence of a Global Navigation Satellite System (GNSS).
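As a point of reference, KAZE is already available in OpenCV. The minimal sketch below detects and matches KAZE features between two overlapping sUAS frames; the file names are placeholders, and the 0.7 ratio threshold is a common but illustrative choice.

```python
# Minimal sketch: KAZE feature detection and matching between two
# overlapping sUAS frames using OpenCV's built-in KAZE implementation.
import cv2

img1 = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)  # placeholder files
img2 = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# KAZE builds a nonlinear scale space, preserving edges that a Gaussian
# (linear) pyramid would blur -- useful on low-contrast snow scenes.
kaze = cv2.KAZE_create()
kp1, des1 = kaze.detectAndCompute(img1, None)
kp2, des2 = kaze.detectAndCompute(img2, None)

# Brute-force L2 matching with Lowe's ratio test to reject ambiguous
# matches, which are common over self-similar snow texture.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.7 * n.distance]
print(f"{len(good)} putative tie points between frames")
```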
This topic will further define photogrammetric processes unique to polar environments using small UAS imagery collections. Those processes will yield automated, near real-time product generation that aids ground maneuverability, UAS maneuverability (obstacle avoidance), and visual terrain referencing for operations in GNSS-denied environments.
Phase I
Businesses should integrate and advance the photogrammetric process of automatically defining and matching ground control points (pre-bundle adjustment) for accurate terrain modelling and imagery creation. Existing Auto-GCP algorithms will be integrated into the photogrammetric process to collect imagery and assess model performance on commercial hardware.
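One plausible Auto-GCP step, sketched below under stated assumptions: project surveyed GCP coordinates into an image through the approximate GNSS/IMU camera pose, then associate each projection with the nearest detected keypoint. The intrinsics, pose, GCP coordinates, and search radius are all illustrative values, not a prescribed method.

```python
# Hedged sketch: associate surveyed GCPs with detected image features
# prior to bundle adjustment. All numeric values are placeholders.
import numpy as np
import cv2

K = np.array([[3000.0, 0.0, 2000.0],   # assumed pinhole intrinsics (px)
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])
rvec = np.zeros(3)                      # assumed nadir-looking attitude
tvec = np.array([0.0, 0.0, 120.0])      # assumed 120 m above ground

# Surveyed GCPs in a local ENU frame (metres) -- placeholder values.
gcps_world = np.array([[10.0, 5.0, 0.0], [-20.0, 30.0, 0.2]])

proj, _ = cv2.projectPoints(gcps_world, rvec, tvec, K, None)
proj = proj.reshape(-1, 2)

def associate(proj_px, keypoints, radius_px=25.0):
    """Match each projected GCP to the nearest keypoint within a radius."""
    kp_xy = np.array([kp.pt for kp in keypoints])
    obs = []
    for i, p in enumerate(proj_px):
        d = np.linalg.norm(kp_xy - p, axis=1)
        j = int(d.argmin())
        if d[j] < radius_px:
            obs.append((i, kp_xy[j]))   # (GCP index, image observation)
    return obs
```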
The Army will assess identified ground control points for accuracy and for inclusion in visual terrain referencing for future resection algorithms focused on localization in GNSS-denied environments. The vendor will produce a detailed site survey with known ground control point horizontal and vertical accuracies, comparing automatically identified control points against benchmarked control points.
Vendors must determine the atmospheric conditions in an Arctic environment that will support the desired final products at an absolute accuracy and minimum resolution of 10 centimeters, with absolute geolocation accuracy of <5.0 m CE90/LE90 and vertical accuracy of <10 meters. An initial summary of results should cover existing and derived findings on weather effects on sUAS operations in Arctic environments.
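For clarity on the accuracy metrics: CE90 is the 90th percentile of horizontal radial error and LE90 the 90th percentile of absolute vertical error. The sketch below computes both empirically from identified-versus-benchmarked control point residuals; the residual arrays are placeholder values.

```python
# Empirical CE90/LE90 check against the Phase I thresholds.
import numpy as np

# Residuals: identified minus benchmarked coordinates, in metres (placeholders).
de = np.array([0.8, -1.2, 0.4, 2.1, -0.6])   # easting error
dn = np.array([-0.5, 0.9, 1.4, -0.3, 0.7])   # northing error
dz = np.array([1.1, -2.4, 0.6, 3.0, -1.8])   # vertical error

ce90 = np.percentile(np.hypot(de, dn), 90)   # horizontal, radial
le90 = np.percentile(np.abs(dz), 90)         # vertical, absolute

# Phase I thresholds from this topic: <5.0 m CE90, <10 m LE90.
print(f"CE90 = {ce90:.2f} m (pass: {ce90 < 5.0})")
print(f"LE90 = {le90:.2f} m (pass: {le90 < 10.0})")
```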
The business will establish an approach, using surrogate or derived sUAS imagery or Full Motion Video, to demonstrate the efficacy of advanced high-resolution terrain models and photogrammetric processes suitable for tactical-level integration in military applications.
Phase II
The vendor will develop a near real-time computational process, running either on-board or via a direct downlink to an End User Device. It should generate an orthomosaic of a pre-defined area and a photogrammetrically derived DSM at a minimum resolution of 5 centimeters, with absolute geolocation accuracy of <2.0 m CE90/LE90 (matching Arctic DEM products) and vertical accuracy of <5 meters [1].
These 3D models will incorporate existing Structure from Motion and computer vision techniques employed by commercial and Army systems to derive ultra-high-resolution 3D models. In addition to these products, the near real-time processing will need to identify hazardous features in the environment that could impact UAS operations and flight mission planning.
The Auto-GCP algorithms will be fully integrated into the photogrammetric process and run in real time during the collection phase. Existing algorithms will be incorporated into the photogrammetric processing, including recent advancements in feature detection and alignment at multiple scales (KAZE features [9]) available from existing automated geo-registration techniques.
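One way such geo-registration is commonly done, sketched below: match KAZE features between a new frame and a georeferenced reference orthomosaic, then fit a homography with RANSAC so frame pixels can be mapped into the reference (and from there to ground coordinates via its geotransform). File names are placeholders.

```python
# Sketch of automated geo-registration against a reference orthomosaic.
import cv2
import numpy as np

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)           # placeholder
ref = cv2.imread("reference_ortho.png", cv2.IMREAD_GRAYSCALE)   # placeholder

kaze = cv2.KAZE_create()
kp_f, des_f = kaze.detectAndCompute(frame, None)
kp_r, des_r = kaze.detectAndCompute(ref, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des_f, des_r, k=2)
        if m.distance < 0.7 * n.distance]

src = np.float32([kp_f[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_r[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC rejects outlier matches; H maps frame pixels to reference
# pixels, from which georeferenced coordinates follow via the
# reference image's affine geotransform.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
print(f"{int(inliers.sum())} inlier correspondences")
```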
Phase III
This research will not only pave the way for accurate high-resolution mapping at the squad level in featureless terrain, but also provide methodologies for observing the rapidly changing Arctic environment. This application would aid climate change studies and environmental monitoring while assisting ground mobility (mounted and dismounted), low-altitude sUAS maneuvers, and flight operations where GNSS is limited or non-existent.
Commercial sUAS offerings in Arctic environments are limited by the inability to operate above 60° latitude, where terrain tracking during launch and recovery requires a high-resolution Digital Elevation Model (DEM). Commercial applications of this product will benefit from new data sets (the Arctic DEM Project) as well as advanced elevation models for flight planning and operations.
This phase will establish a near real-time localization algorithm and process for sUAS localization in the absence of GNSS post-initialization. A vision-based navigation or visual terrain referencing software system, encompassing the Phase II Auto-GCP software, will use organically collected imagery and photogrammetrically derived DSMs for feature or horizon matching to determine the aircraft's position. The resulting localization must provide an absolute position sufficient to carry out flight operations for a minimum of 50% of the total flight time.
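The resection step this implies can be illustrated with a Perspective-n-Point solve: given image features matched to known 3D points sampled from the DSM, recover the camera (aircraft) position without GNSS. The sketch below simulates correspondences from an assumed "true" pose and then recovers it; all numeric values are illustrative.

```python
# Hedged sketch of DSM-based resection via PnP (simulated correspondences).
import numpy as np
import cv2

K = np.array([[3000.0, 0.0, 2000.0],   # assumed pinhole intrinsics (px)
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])

# 3D points sampled from the DSM, in a local frame (metres) -- placeholders.
world_pts = np.array([[-50.0, -40.0, 0.0], [50.0, -40.0, 2.0],
                      [-50.0, 40.0, 1.0], [50.0, 40.0, 3.0],
                      [0.0, 0.0, 1.5], [20.0, -10.0, 0.5]])

# Simulate observations from an assumed true pose: nadir-looking, 120 m up.
rvec_true = np.array([np.pi, 0.0, 0.0])
tvec_true = np.array([0.0, 0.0, 120.0])
image_pts, _ = cv2.projectPoints(world_pts, rvec_true, tvec_true, K, None)

# Resection: recover the pose from 2D-3D matches alone.
ok, rvec, tvec = cv2.solvePnP(world_pts, image_pts.reshape(-1, 2), K, None,
                              flags=cv2.SOLVEPNP_EPNP)
R, _ = cv2.Rodrigues(rvec)
cam_pos = (-R.T @ tvec).ravel()   # camera centre in the DSM frame
print("Estimated aircraft position (m):", cam_pos)  # ~ (0, 0, 120)
```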
A real-time object identification and avoidance model, driven by on-board optical camera systems, will aid low-altitude collections and reconnaissance missions. This effort will also aid vision- and terrain-based navigation, allowing ground units to determine sUAS position and ground force localization during denied and degraded GNSS events.
The culmination of Phase III will integrate the Android Team Awareness Kit (ATAK) platform to render the resulting orthomosaics and DSMs locally on devices within 30 minutes of post-flight operations. Businesses will deliver the final products in a data format already supported within the ATAK software suite and properly aligned to a supported geographic data model (WGS 84 – Web Mercator).
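As an illustration of that hand-off, the sketch below writes a finished orthomosaic as a GeoTIFF in Web Mercator (EPSG:3857), a raster format and CRS ATAK supports. The array contents, dimensions, and footprint are placeholder values; note Web Mercator is only defined up to roughly ±85° latitude, which covers most Arctic test sites.

```python
# Sketch: export an orthomosaic as a Web Mercator GeoTIFF for ATAK.
import numpy as np
import rasterio
from rasterio.transform import from_bounds

ortho = np.zeros((3, 2000, 2000), dtype=np.uint8)   # placeholder RGB mosaic

# Assumed product footprint in Web Mercator metres (roughly interior Alaska).
left, bottom, right, top = -16_550_000, 9_330_000, -16_549_000, 9_331_000
transform = from_bounds(left, bottom, right, top, 2000, 2000)

with rasterio.open(
    "ortho_webmercator.tif", "w", driver="GTiff",
    height=2000, width=2000, count=3, dtype="uint8",
    crs="EPSG:3857", transform=transform,
) as dst:
    dst.write(ortho)   # writes all three bands
```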
Submission Information
All eligible businesses must submit proposals by noon ET.
For more information, and to submit your full proposal package, visit the DSIP Portal.
STTR Help Desk: usarmy.rtp.devcom-arl.mbx.sttr-pmo@army.mil
References: