Editor's note: This post was submitted by Brian Wee of NEON as a guest post from a Type I ESIP. Thanks Brian! NEON has a blog too.
A fair number of us have spent at least parts of our professional lives fretting over data representation. How do we accurately capture real-world processes in a way that preserves the relationships between entities, and manage the data so that it is amenable to processing? Can we represent the state of ‘reality’ with sufficient fidelity that we can resuscitate a representation of that reality on demand? How do we enable technical interoperability to facilitate data discovery and retrieval, semantic interoperability to facilitate data integration, plus all the other forms of interoperability that one may choose to define?
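To make the semantic-interoperability point concrete, here is a minimal sketch (the provider names, field names, and vocabulary term are all invented for illustration): two data providers report the same quantity under different local field names, and mapping both onto one controlled-vocabulary term is what lets the records be integrated.

```python
# Hypothetical sketch of semantic interoperability via a shared vocabulary.
# All names below are invented for illustration.

# A single controlled-vocabulary term both providers agree to map onto
CANONICAL = "air_temperature_celsius"

# Each provider's local field name mapped to the shared term
PROVIDER_MAPPINGS = {
    "site_a": {"temp_c": CANONICAL},
    "site_b": {"airTemp": CANONICAL},
}

def harmonize(provider, record):
    """Rename a record's fields to their controlled-vocabulary terms."""
    mapping = PROVIDER_MAPPINGS[provider]
    return {mapping.get(key, key): value for key, value in record.items()}

# Records from two providers, now integrable under one common key
merged = [
    harmonize("site_a", {"temp_c": 21.5}),
    harmonize("site_b", {"airTemp": 19.8}),
]
```

Real implementations lean on community standards (controlled vocabularies, ontologies, metadata conventions) rather than hand-written mappings, but the underlying move is the same: agree on shared terms so that independently collected observations can be compared.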
In the sci-fi movie “Another Earth,” released to high acclaim at the 2011 Sundance Film Festival, a duplicate Earth appears in our skies, apparently populated by duplicates of this Earth's inhabitants. Moral implications aside (even if there were no humans!), an Earth replica would be excellent for conducting large-scale experiments to answer questions about climate mitigation, adaptation, and geo-engineering. In the absence of any capability to create Earth 2.0, we are nevertheless on a long-term trajectory toward observing and virtualizing the environment at different geospatial scales: from the soil microbial community, to intensely measured sites, to high-resolution landscape-scale measurements, to global satellite-based observations. The NSF NEON infrastructure, currently under construction, encompasses observations at microbial to landscape scales (with over 500 primary measurements acquired at each of 60 sites), using satellite-based observations to interpolate variables of interest across the United States.
Reality has been happily humming along just fine for eons; the difficulty lies in modeling it at the relevant scales of time and space. This is especially challenging when developing models that integrate natural processes operating at various temporal and spatial scales. When we measure and capture environmental data, are we doing it at the relevant temporal and spatial resolution? How do we formulate archival policies that do not inadvertently eliminate data that may at first seem irrelevant but that we later discover to be useful? The business and military intelligence communities invest resources to deal with unstructured data because reality is inherently messy and there is a great need to obtain analyses and assessments in a timely fashion: to what extent are these approaches applicable to observational and experimental environmental data?
This community is, knowingly or not, involved in the business of virtualizing reality. In the Tron universe, “Users” were digitized into a virtual world by a laser (my favorite Tron object: the troop carrier!). This community is collectively building an equivalent instrument that comprises physical infrastructure, cyberinfrastructure, standards, applications, tools, and best practices to digitize slices of the “reality cube” (with reference to spectral data cubes and the proposed NSF Earth Cube) into a form that can be manipulated for scientific understanding and forecasting. At the end of the day I would really like to see us build something (with a benign MCP please... we know what happened in the Tron universe...) that will give us a virtual troop carrier for cruising around on Earth 2.0!