Drone Update
Jane Wyngaard, Fox Peterson, Lindsay Barbieri (Bar)
After winning ESIP's Funding Friday competition, we've been off exploring and researching a range of drone systems and sensor possibilities. With all three of us embedded in a multitude of other projects, development has started slowly and cautiously, but with funding in hand and the ESIP Winter Meeting on the horizon, we'll be using the collaborative opportunity afforded by the American Geophysical Union (AGU) Fall Meeting this December to put some exciting development in place (look out and come talk to us if you're there!).
Testing Methods for Monitoring Agricultural Field Activity:
Thanks to a partnership with the University of Vermont's Spatial Analysis Lab, we've been able to do some experimentation with senseFly's very slick commercial eBee fixed-wing UAS (unmanned aerial system). This professional mapping UAS provides high-resolution DEMs and infrared imagery. For now we are simply pairing this imagery with ground-based photoacoustic gas monitors that provide weekly samples from local corn and hay fields. But, as our ultimate goal is a much more robust, open-source, and cost-effective system, we are marching on towards being able to use UASs for large-scale gas emissions monitoring…
Our First Test Drone:
For our first integrated drone-sensor-software tests we've purchased an off-the-shelf Iris+ from 3DR. Targeted at the hobbyist market, the Iris+ is far from what we'd likely deploy for actual experimentation (specifically, it is underpowered and has a rather limited flight time), but it is relatively cheap and, for a number of reasons, will serve us very well in our goal of exploiting software flexibility to provide a generic UAS solution for earth scientists.
Firstly, being a commercial product, this platform constrains us to operating within the limits that an earth scientist – with no engineering team backing – might face. Secondly, in comparison to the complete commercial solutions, 3DR crucially has its origins in the open-source community. This makes the Iris+ the closest you can get to a fully open-source environment within a ready-to-fly infrastructure (rather than a custom DIY solution that would require said engineers). With 3DR's open autopilot (hardware and software) and DroneKit API, we are able to hack the Iris+ system for our own purposes relatively easily – integrating our own software stack (SEDD – Standardised Embedded Data for Drones) and our custom test sensors. Finally, as a further consequence of those open origins, while the Iris+ is a complete ecosystem, its components are available separately, giving us the freedom to also create our own custom DIY system for greater experimentation.
Greenhouse Gas Sensor Selection:
With this testbed UAS in hand, the harder problem is that of sensor selection. This is, of course, entirely use-case dependent. So while as a cluster we are targeting a generic drones-for-earth-science software stack, our demonstration use case is specific: we're looking at agricultural gas emission monitoring, with a view to facilitating the evaluation of climate mitigation strategies. Focusing particularly on rice farming, our target gases are NO2 and CH4 during the growing season, and CO and CO2 during the burning season.
While highly reliable and accurate gas chamber sensors afford valuable single-location measurements, the temporal lag on data, poor area coverage, and high cost in researcher time make this approach limited. UASs potentially offer a ‘bird's-eye’ view of a field's emissions, with a much more rapid repeat-sample turnaround time and lower cost. Depending on the sensor used, however, this could come at an accuracy cost, although part of this work will examine the differences in accuracy between inferring emissions from high-precision single-point measurements and taking many lower-precision samples.
Any accuracy cost accrued will be due to the limitations of the sensor selected. In using a UAS there are clearly sensor restrictions – such as weight, power, operating environment, and sample time – and, as always, a monetary cost limitation too. For now we are approaching the problem with a plan to evaluate a small range of sensors. Our range will include both direct-sample sensors, flown on a multirotor UAS in multiple vertical profiles or at varying specific altitudes, and remote-sensing approaches, flown on fixed-wing UASs at sufficient altitude to capture large regions instantaneously.
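To make the direct-sampling idea concrete, here is a minimal sketch of how a vertical-profile sampling plan might be encoded as data. This is purely illustrative – the field names, site identifier, and altitudes are hypothetical, not part of our actual SEDD stack:

```javascript
// Hypothetical vertical-profile sampling plan for a multirotor UAS.
// All field names and values here are illustrative placeholders.
const samplingPlan = {
  site: "test-field-01",               // assumed site identifier
  profiles: [
    { lat: 44.4759, lon: -73.2121,     // profile location (decimal degrees)
      altitudes_m: [2, 5, 10, 20, 40], // hover altitudes for direct samples
      dwell_s: 30 },                   // seconds to hover at each altitude
    { lat: 44.4762, lon: -73.2115,
      altitudes_m: [2, 5, 10, 20, 40],
      dwell_s: 30 }
  ]
};

// Flatten the plan into a simple waypoint list for the autopilot.
const waypoints = samplingPlan.profiles.flatMap(p =>
  p.altitudes_m.map(alt =>
    ({ lat: p.lat, lon: p.lon, alt_m: alt, dwell_s: p.dwell_s })));
console.log(waypoints.length, "waypoints");
```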
Data Management (Once We Have Our Sensor):
Following the collection of our first test run's data, we'll be using widely available and scalable open-source tools to evaluate and visualize our results. Our initial goal is to gather data and process it in a framework that can ultimately be deployed as a large-scale, near-real-time service, but that can also be used by the independent scientist or researcher wanting to gather some lower-cost data.
Our UAS is equipped with onboard GPS, which enables us to record its location in three dimensions. For our initial test, we'll deploy the UAS on a known test site and gather streaming data to store in JavaScript Object Notation (JSON) format on a cloud server. We plan to collect not only location data, but also the sensor readings around ground-truthed points.
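For illustration, a single streamed record might look something like the following – the schema and field names here are hypothetical placeholders, not a finalized format:

```javascript
// Hypothetical shape of one streamed sample record (schema not finalized).
const sampleRecord = {
  timestamp: "2015-11-03T14:22:08Z", // UTC time of sample
  position: {                        // from the onboard GPS
    lat: 44.4759,                    // decimal degrees
    lon: -73.2121,
    alt_m: 12.4                      // altitude above ground, metres
  },
  readings: {                        // raw sensor values (units sensor-dependent)
    ch4_ppm: 2.1,
    co2_ppm: 412.7
  },
  ground_truth_id: "gt-03"           // nearest ground-truthed point, if any
};
```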
We will then parse this data into a GeoJSON object using the leaflet.js framework and layer it on top of a custom-constructed Mapbox template. The flight path of the UAS can be animated in a downscaled version of real time and ground emissions visualized accordingly. Using turf.js, we can construct spatial interpolations for the hypothetical dispersion of our experimental emissions, and compare the values read by the on-UAS sensors to the expected values within given distances of the ground points. As we animate the UAS through its flight path concurrently with the suspected emission concentrations, we can detect and assess how the UAS readings compare to the “truth”, both spatially and temporally. This will form the baseline for our UAS performance assessment. If the sensors behave within reasonable expectations (to be determined after this initial assessment), a next step is to use the literature to set thresholds for “too high” gas concentrations and attempt to detect natural emissions.
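As a rough sketch of part of this pipeline – assuming Leaflet and Turf are loaded on the page, a `records` array shaped like the hypothetical record above, and a placeholder Mapbox style and token – the GeoJSON conversion and ground-point comparison might look like this:

```javascript
// Sketch: convert records to GeoJSON, map them with Leaflet, and use Turf
// to compare UAS readings against a ground-truthed point within a radius.
// Assumes Leaflet (L), Turf (turf), and a `records` array are available.
const geojson = {
  type: "FeatureCollection",
  features: records.map(r => ({
    type: "Feature",
    geometry: { type: "Point", coordinates: [r.position.lon, r.position.lat] },
    properties: { ch4_ppm: r.readings.ch4_ppm, alt_m: r.position.alt_m }
  }))
};

// Placeholder Mapbox style template; substitute your own style id and token.
const tileUrl = "https://api.mapbox.com/styles/v1/{id}/tiles/{z}/{x}/{y}?access_token={token}";
const map = L.map("map").setView([44.4759, -73.2121], 16); // assumes <div id="map">
L.tileLayer(tileUrl, { id: "your-username/your-style-id", token: "your-token" }).addTo(map);
L.geoJSON(geojson).addTo(map);

// Compare readings taken within 50 m of a (hypothetical) ground-truthed point.
const groundPoint = turf.point([-73.2121, 44.4759], { ch4_ppm: 2.0 });
const nearby = geojson.features.filter(f =>
  turf.distance(groundPoint, f, { units: "kilometers" }) < 0.05);
nearby.forEach(f =>
  console.log("residual:", f.properties.ch4_ppm - groundPoint.properties.ch4_ppm));
```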
It is likely that ground emissions are also associated with environmental factors, namely temperature and wind speed. From this framework, we can also associate our JSON data with meteorological observations and use the D3 library to create data visualizations that combine robust statistical models with well-supported geographical libraries.
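For example, a minimal D3 (v4+ API) sketch of a concentration-versus-temperature scatter plot might look like the following; it assumes an `obs` array of `{ temp_c, ch4_ppm }` pairs built by joining our sample records with meteorological observations (the join and field names are hypothetical):

```javascript
// Sketch: scatter of CH4 reading vs. air temperature with D3 (v4+ API).
// Assumes `obs` is an array of { temp_c, ch4_ppm } objects.
const width = 480, height = 320, margin = 40;
const svg = d3.select("body").append("svg")
  .attr("width", width).attr("height", height);

const x = d3.scaleLinear()                       // temperature scale
  .domain(d3.extent(obs, d => d.temp_c)).nice()
  .range([margin, width - margin]);
const y = d3.scaleLinear()                       // concentration scale
  .domain(d3.extent(obs, d => d.ch4_ppm)).nice()
  .range([height - margin, margin]);

svg.append("g").attr("transform", `translate(0,${height - margin})`)
  .call(d3.axisBottom(x));                       // temperature axis
svg.append("g").attr("transform", `translate(${margin},0)`)
  .call(d3.axisLeft(y));                         // concentration axis

svg.selectAll("circle").data(obs).enter().append("circle")
  .attr("cx", d => x(d.temp_c))
  .attr("cy", d => y(d.ch4_ppm))
  .attr("r", 3);
```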
Best of all, this entire data-visualization framework (apart from the data storage and transfer) is completely free and open source to build and work in, and requires only knowledge of CSS, SVG, and JavaScript, all of which are very common web-application technologies. Furthermore, products created in this way could be served either as static pages for reports, or the JavaScript functions could be run over a WebSocket, such that streaming data is (almost) immediately converted into JSON and rendered onto the map in near real time, with concurrent graphics.
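A sketch of that streaming variant, assuming a WebSocket endpoint that emits one JSON record per message (the URL is a placeholder, and `map` is the Leaflet map from the earlier sketch):

```javascript
// Sketch: render streamed records onto the Leaflet map in near real time.
// Assumes `map` from the earlier sketch; the endpoint URL is a placeholder.
const socket = new WebSocket("wss://example.org/uas-stream");
socket.onmessage = event => {
  const r = JSON.parse(event.data);              // one record per message
  L.circleMarker([r.position.lat, r.position.lon], {
    radius: 4,
    // Hypothetical styling: more opaque markers for higher CH4 readings.
    fillOpacity: Math.min(1, r.readings.ch4_ppm / 5)
  }).addTo(map);
};
```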
Conclusion (and come find us at AGU!):
We're excited to be prototyping a data-driven future for UASs in earth science, both physically and virtually. If you'll be at AGU and are interested in the above, or in ESIP Drone Cluster work in general, please come find us!