Feeds
10885 items (4 unread) in 53 feeds

- Décryptagéo, l'information géographique
- Cybergeo
- Revue Internationale de Géomatique (RIG)
- SIGMAG & SIGTV.FR - Un autre regard sur la géomatique (1 unread)
- Mappemonde
- Imagerie Géospatiale
- Toute l’actualité des Geoservices de l'IGN
- arcOrama, un blog sur les SIG, ceux d'ESRI en particulier
- arcOpole - Actualités du Programme
- Géoclip, le générateur d'observatoires cartographiques
- Blog GEOCONCEPT FR
- Géoblogs (GeoRezo.net)
- Geotribu
- Les cafés géographiques
- UrbaLine (le blog d'Aline sur l'urba, la géomatique, et l'habitat)
- Séries temporelles (CESBIO)
- Datafoncier, données pour les territoires (Cerema)
- Cartes et figures du monde
- SIGEA: actualités des SIG pour l'enseignement agricole
- Data and GIS tips
- Neogeo Technologies
- ReLucBlog
- L'Atelier de Cartographie
- My Geomatic
- archeomatic (le blog d'un archéologue à l’INRAP)
- Cartographies numériques (3 unread)
- Veille cartographie
- Makina Corpus
- Oslandia
- Camptocamp
- Carnet (neo)cartographique
- Le blog de Geomatys
- GEOMATIQUE
- Geomatick
- CartONG (actualités)
Open Geospatial Consortium (OGC)
-
15:49
OGC SensorThings API for European Green Deal Data Spaces
A critical component of Data Spaces – those supporting the European Green Deal strategy or beyond – will be their management of dynamic data series coming from many different sources, such as various sensors. To enable interoperable communication between such data series, as well as ease their interpretation and (re-)use, the Open Geospatial Consortium (OGC) long ago developed the SensorThings API Standard. In this post, the Standard is introduced together with the planned developments and initial results from two Horizon Europe projects on Green Deal Data Spaces in which OGC is involved: ‘All Data for Green Deal’ (AD4GD) and ‘Urban Data Space for Green Deal’ (USAGE).
The European Green Deal and digital data strategies
The European Green Deal aims to transition Europe to a sustainable and resource-efficient economy and governance while improving the well-being of citizens. Several objectives shall contribute to the main goal, including: transforming the economy and societies; making transport sustainable for all; innovating industry; cleaning energy systems; renovating buildings for greener lifestyles; working with nature to preserve the planet; and boosting global climate actions. European Green Deal objectives are, in turn, in line with the United Nations Sustainable Development Goals.
Data, especially dynamic data from sensors, plays an essential role in monitoring, analysing, and understanding the status and evolution of the aforementioned sectors and the influence of proposed policies and decisions, as well as in simulating and measuring the effectiveness of newly applied measures, models, or changed conditions.
Further, using open protocols and data management practices aligned with the FAIR data principles of Findability, Accessibility, Interoperability, and Reusability increases efficiency and better supports democracy through data transparency and accessibility.
Digitalisation is recognised as a powerful support to the European Green Deal and Sustainable Development Goals – on the condition that a strategy is followed that ensures efficient and optimised use and management of digital data and associated tools. The European Strategy for Data and the European Data Spaces policies are intended to govern, define, and support such digitalisation towards fair, aware, and efficient data sharing and use.
OGC SensorThings API: an Open model for sensor data
The OGC SensorThings API was developed by the OGC SensorThings Standard Working Group as an open Standard that enables the web-based management, storage, sharing, and analysis of Internet of Things (IoT)-based sensor observation data.
The OGC SensorThings API consists of a data model and an API split across two documents: (1) the Sensing part and (2) the Tasking part (which is currently in the planning phase). The Sensing part allows IoT devices and applications to create, read, update, and delete IoT data and metadata in a SensorThings service.
The SensorThings entities data model is based on the OGC/ISO Observations and Measurements (O&M) model [OGC 10-004r3 and ISO 19156:2011]. The model can be understood as follows: an “Observation” belongs to a “Datastream” that connects a “Sensor” and a “Thing”, and each Observation produces a “result” whose value is an estimate of a property (the “ObservedProperty”) of the observation target (the “FeatureOfInterest”).
For example, when measuring the temperature in a room: the “Observation” is each measurement or data value at a certain moment in time; the “Sensor” is the thermometer; the “Thing” is the device that reads and transmits the measurement from the thermometer; the “ObservedProperty” is the temperature; and the “result” is the temperature value measured by the sensor. The ObservedProperty is observed on the “FeatureOfInterest” – in this case, the part of the room in which the sensor resides.
An Observation instance is classified by its event time, FeatureOfInterest, ObservedProperty, and the procedure used (often a Sensor). Moreover, “Things” are also modelled in the SensorThings API, and their definition follows the ITU-T definition: “an object of the physical world (physical things) or the information world (virtual things) that is capable of being identified and integrated into communication networks” [ITU-T Y.2060]. The geographical locations of Things are useful in almost every application (in particular in remote sensing, where the object measured is far from the actual device) and, as a result, are included as well.
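To make the entity model above more concrete, here is a minimal, hedged sketch of reading SensorThings v1.1 entities over plain HTTP in Python. The base URL is a placeholder for any conforming service; $expand, $top, and $orderby are standard SensorThings (OData-style) query options.

```python
# Minimal sketch: browsing a SensorThings v1.1 service.
# BASE is a placeholder; any conforming endpoint exposes the same resource paths.
import requests

BASE = "https://example.org/FROST-Server/v1.1"  # hypothetical SensorThings endpoint

# List a few Things together with their Locations ($expand follows navigation links).
things = requests.get(
    f"{BASE}/Things",
    params={"$expand": "Locations", "$top": 5},
    timeout=30,
).json()["value"]

for thing in things:
    print(thing["name"], [loc["name"] for loc in thing.get("Locations", [])])

# Fetch the most recent Observations of the first Datastream of the first Thing.
first_id = things[0]["@iot.id"]
datastream = requests.get(f"{BASE}/Things({first_id})/Datastreams", timeout=30).json()["value"][0]
observations = requests.get(
    f"{BASE}/Datastreams({datastream['@iot.id']})/Observations",
    params={"$orderby": "phenomenonTime desc", "$top": 3},
    timeout=30,
).json()["value"]

for obs in observations:
    print(obs["phenomenonTime"], obs["result"])
```

Each Observation carries its own phenomenonTime and result, while the Datastream ties it back to the Thing, Sensor, and ObservedProperty described above.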
SensorThings API for Green Deal Data Spaces: the USAGE and AD4GD projects
Recent examples of adoption of the SensorThings API Standard include two Horizon Europe projects – in which OGC participates as a consortium partner – that intend to provide solutions for the Green Deal Data Space: ‘Urban Data Space for Green Deal’ (USAGE) and ‘All Data for Green Deal’ (AD4GD).
By using standards for data/service interoperability and integration, USAGE will provide solutions that make city-level environmental & climate data Findable, Accessible, Interoperable, and Reusable (FAIR). This means innovating governance mechanisms, consolidated arrangements, AI-based tools, and data analytics to better share, access, and use the city-level data generated from Earth Observation, the Internet of Things, and authoritative and crowd-sourced data.
USAGE use cases will range from air pollution and mobility to mitigation strategies for climate change (such as minimising the impact of urban heat islands and increased floods) as well as smart traffic management. USAGE pilots and use cases include the cities of Ferrara (Italy), Zaragoza (Spain), Graz (Austria), and Leuven (Belgium). In Ferrara, citizen science initiatives with high schools and volunteers are already scheduled to support USAGE use cases and biodiversity.
AD4GD aims to co-create a Green Deal Data Space with stakeholders and end-users. One of the main objectives of the project is the integration of Citizen Science data together with in-situ observations, results from artificial intelligence models, and remote sensing data. AD4GD will leverage standards and semantic technologies to reach this objective.
AD4GD pilots and use cases include water quality in Berlin’s lakes (Germany), biodiversity in the Metropolitan Area of Barcelona (Spain), and air quality with low-cost sensors in Northern Italy.
The implementation of SensorThings API is a critical part of both projects. Sensors provide essential data for several of the use cases addressed, including air quality, water quality, traffic status, animal species tracking, weather & environmental parameters, and more.
By using an open and shared protocol to communicate data, such as the SensorThings API Standard, the data required for the projects can come from different types of sensors from different vendors, and the sensors can be managed by the government, private industry, or single citizens. Additionally, SensorThings API’s capacity to link observations to definitions of variables and units of measurement facilitates the automatic aggregation and integration of data while increasing its semantic interoperability.
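As a hedged illustration of that linking, the sketch below registers a Datastream whose unitOfMeasurement points at a shared unit definition, so that clients can aggregate or convert observations automatically. The endpoint, the entity ids, and the QUDT unit URI are illustrative assumptions, not values from the projects.

```python
# Hedged sketch: creating a Datastream whose unit of measurement references a shared definition.
import requests

BASE = "https://example.org/FROST-Server/v1.1"  # hypothetical SensorThings endpoint

datastream = {
    "name": "Air temperature, station 42",
    "description": "Outdoor air temperature sampled every 10 minutes",
    "observationType": "http://www.opengis.net/def/observationType/OGC-OM/2.0/OM_Measurement",
    "unitOfMeasurement": {
        "name": "degree Celsius",
        "symbol": "°C",
        "definition": "http://qudt.org/vocab/unit/DEG_C",  # shared semantic definition of the unit
    },
    # Link to already-registered entities by their ids (illustrative values).
    "Thing": {"@iot.id": 1},
    "Sensor": {"@iot.id": 1},
    "ObservedProperty": {"@iot.id": 1},
}

response = requests.post(f"{BASE}/Datastreams", json=datastream, timeout=30)
response.raise_for_status()
print("Created Datastream at:", response.headers.get("Location"))
```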
The two projects plan to further develop the SensorThings API Standard by using it in the provided solutions and recommended architecture while experimenting with its integration with other sensors and data within the developed software. In USAGE specifically, the SensorThings API plugin for QGIS will be refactored to embed access to SensorThings API endpoints in the core part of QGIS and extend it to support data analytics.
SensorThings API Plus is also being further implemented and tested in AD4GD and USAGE. To more effectively manage data produced by multiple actors with different licenses (such as citizen science data), SensorThings API Plus extends the OGC SensorThings API Part 1: Sensing Version 1.1 (STA) Standard data model and improves the security aspects of the Standard. Its development started in the COS4CLOUD project and was further developed in CitiObs.
Promoting the SensorThings API with stakeholders
Both AD4GD and USAGE additionally recognise that a fundamental challenge facing the uptake of effective solutions is firstly making stakeholders and end users aware of their existence, and then training them in their use. Therefore, within the USAGE project, some workshops were organised by OGC and project partner DedaNext.
Figure 1 – Responses to the question “What obstacles most hinder the SensorThings API uptake in your organisation?”
The first workshop was organised on February 1, 2023, in Ferrara (and online). The workshop opened with an introduction to the SensorThings API Standard along with several examples of Open Source implementations. Marco Minghini, from the Joint Research Centre group working on the INSPIRE Directive framework and implementation, presented the INSPIRE Good Practice recommendation of OGC SensorThings API. The API4INSPIRE project was also presented, which assessed the implications of using APIs to exchange data, including the SensorThings API. A recording of this initial part of the workshop is available on the OGC YouTube Channel.
Next up was a round table where stakeholders from national and regional Italian bodies discussed current solutions for sharing dynamic data as well as challenges facing the adoption of SensorThings API for improving data management quality and FAIRness. The audience was prompted to answer two questions through interactive questionnaire tools: 1) What obstacles most hinder the SensorThings API uptake in your organisation? And 2) What could better support the adoption of the Standard?
To facilitate the participation of a wider audience, the questions and related answers could be given in both Italian and English. A lack of knowledge about the Standard was recognised as a primary factor blocking its adoption, as seen in Figure 1. For this reason, the recording of the first, informative part of the workshop was published online and advertised as part of the Location Innovation Academy and beyond. Additionally, a second SensorThings API webinar, organised by the Italian Agency for Digitalisation (AgID) and held in Italian, took place on April 20, 2023.
Figure 2 – Responses to the question “What can mostly support the adoption of the Standard?”
Responses to the second question indicated that the biggest driver for uptake of the SensorThings API Standard would be clear policy directives. The next-ranked factors – digital skills and human resources, respectively – confirm once again the need for training and new professional skills. This need is the impetus behind the launch of the Location Innovation Academy, developed in the European GeoE3 project and hosted by OGC.
The interest around the SensorThings API Standard continues to grow, with more developments, applications, and implementations expected in the future.
OGC Members interested in staying up to date on the progress of the SensorThings API Standard, or contributing to its development, are encouraged to join the Connected Systems / SensorThings Standards Working Group via the OGC Portal. Non-OGC members who would like to know more about participating in the SWG are encouraged to contact the OGC Standards Program.
The USAGE project has received funding from the European Union’s Horizon Europe Framework Programme for Research and Innovation under the Grant Agreement no 101059950.
The AD4GD project has received funding from the European Union’s Horizon Europe Framework Programme for Research and Innovation under the Grant Agreement no 101061001.
-
12:52
OGC Forms new GeoDCAT Standards Working Group
The Open Geospatial Consortium (OGC) is excited to announce the formation of the OGC GeoDCAT Standards Working Group (GeoDCAT SWG).
The GeoDCAT SWG will revise, publish, and maintain GeoDCAT – a spatio-temporal profile of the W3C DCAT Recommendation – and provide guidance about its use and further specialization. The larger geospatial community will benefit from the standardization of descriptions of geospatial data and access services in DCAT-based data catalogs.
DCAT, a vocabulary to describe datasets and services, is the primary means to catalog datasets on the web. Some basic temporal and geographic properties have been adopted within DCAT v2 and the planned v3; however, these do not address the full range of requirements identified in the 2019 OGC GeoDCAT-AP Discussion Paper.
GeoDCAT will provide a standardized vocabulary and encoding for spatial dataset descriptions and service descriptions (metadata records), based on general Web standards as described in the OGC/W3C Spatial Data on the Web Best Practices. GeoDCAT could in future be used as an encoding in catalog API standards, such as OGC API – Records and STAC.
GeoDCAT enables spatial data to abide by FAIR (Findable, Accessible, Interoperable, and Reusable) Principles in a web-native environment, just as so many other datasets do. GeoDCAT’s European profile (GeoDCAT-AP) is used to make spatial datasets, dataset series, and services discoverable on general data portals, thereby making geospatial information better findable across borders and sectors. The EU references GeoDCAT-AP as a “Good Practice”.
Portals that describe their data catalogs using either GeoDCAT or GeoDCAT-AP will be interoperable with each other as well as with general data catalogs that use DCAT.
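As a rough, hedged sketch of what a GeoDCAT-style record can look like, the Python snippet below uses rdflib to describe a dataset with DCAT terms and a spatial extent expressed as a WKT geometry on a dct:Location, a pattern used in GeoDCAT-AP practice. The identifiers, bounding box, and access URL are invented for illustration.

```python
# Hedged sketch: a minimal DCAT/GeoDCAT-style dataset description built with rdflib.
from rdflib import Graph, Literal, URIRef, BNode, Namespace
from rdflib.namespace import DCAT, DCTERMS, RDF

LOCN = Namespace("http://www.w3.org/ns/locn#")
GSP = Namespace("http://www.opengis.net/ont/geosparql#")

g = Graph()
ds = URIRef("https://example.org/datasets/air-quality-2023")  # hypothetical identifier

g.add((ds, RDF.type, DCAT.Dataset))
g.add((ds, DCTERMS.title, Literal("Hourly air quality observations, 2023", lang="en")))
g.add((ds, DCAT.keyword, Literal("air quality", lang="en")))

# Spatial coverage as a Location carrying a WKT bounding box.
loc = BNode()
g.add((ds, DCTERMS.spatial, loc))
g.add((loc, RDF.type, DCTERMS.Location))
g.add((loc, LOCN.geometry,
       Literal("POLYGON((5.9 47.3, 15.0 47.3, 15.0 55.1, 5.9 55.1, 5.9 47.3))",
               datatype=GSP.wktLiteral)))

# A distribution pointing at an API endpoint where the data can be accessed.
dist = BNode()
g.add((ds, DCAT.distribution, dist))
g.add((dist, RDF.type, DCAT.Distribution))
g.add((dist, DCAT.accessURL, URIRef("https://example.org/ogcapi/collections/air-quality")))

print(g.serialize(format="turtle"))
```

A record like this can sit alongside non-spatial DCAT records in the same catalog, which is what makes the cross-portal interoperability described above possible.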
OGC Members interested in staying up to date on the progress of this standard, or contributing to its development, are encouraged to join the GeoDCAT SWG via the OGC Portal. Non-OGC members who would like to know more about participating in this SWG are encouraged to contact the OGC Standards Program.
-
12:50
OGC Forms new GeoDataCube Standards Working Group
The Open Geospatial Consortium (OGC) is excited to announce the formation of the OGC GeoDataCube SWG.
The GeoDataCube SWG will improve interoperability between existing datacube solutions, simplify the interaction with different datacubes, and facilitate the integration of data from multiple datacube sources. By following a user-centric approach, the SWG will develop solutions that meet the needs of scientists, application developers, and API integrators.
The goal of the OGC GeoDataCube SWG is to create a new API specifically to serve the core functionalities of GeoDataCubes, such as access and processing, and to define exchange format recommendations, profiles, and a metadata model. The SWG also aims to analyze usability of existing Standards and identify use cases.
Similar to other OGC APIs, the GeoDataCube SWG will create this new Standard from existing building blocks – such as existing geospatial Standards, outputs from previous OGC Collaborative Solutions & Innovation Initiatives, and other developer resources – following a use-case-driven approach: a small core with possible extensions. This will allow for interoperability across future OGC Standards.
With regard to existing and emerging OGC Standards, the working group may look specifically at:
- OGC API – Environmental Data Retrieval: A family of lightweight interfaces to access Environmental Data resources.
- OGC API – Coverages: Defining a Web API for accessing coverages that are modeled according to the Coverage Implementation Schema.
- OGC Analysis Ready Data SWG products: proposed Standards to describe specific product types that are often implemented as GeoDataCubes.
- OGC API – Processes: Supporting the wrapping of computational tasks into executable processes that can be offered by a server through a Web API.
- Zarr: An OGC Community Standard for the storage of multi-dimensional arrays of data.
- GeoTIFF and Cloud Optimized GeoTIFF: A format used to share geographic image data.
- Hierarchical Data Format (HDF5): A set of formats designed to store and organize large amounts of data.
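As a small, hedged illustration of the datacube idea behind these building blocks, the sketch below opens a (hypothetical) Zarr store – one of the encodings listed above – with xarray and slices it along its dimensions. The store path, variable name, and coordinate names are assumptions for illustration only.

```python
# Hedged sketch: treating a Zarr store as a geospatial datacube with xarray.
import xarray as xr

# Open a (hypothetical) Zarr store lazily; data is only read when actually needed.
cube = xr.open_zarr("ndvi-cube.zarr", consolidated=True)

# Slice along the cube's dimensions: a time range and a spatial window
# (variable and coordinate names are assumptions for illustration).
subset = cube["ndvi"].sel(
    time=slice("2022-06-01", "2022-08-31"),
    lat=slice(48.0, 47.0),   # descending latitude axis assumed
    lon=slice(11.0, 12.0),
)

# Reduce along the time dimension to produce a summer-mean composite.
summer_mean = subset.mean(dim="time")
print(summer_mean.sizes)
```

A GeoDataCube API would aim to expose this kind of dimension-oriented access and processing behind a common Web interface, independent of the underlying encoding.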
The GeoDataCube SWG will follow an agile methodology with the goal of creating a first core Standard within the first year. Subsequent iterations may add additional functionality. The GeoDataCube SWG will start with a use-case collection and analysis phase that further informs the selection of additional starting points or other work to be considered. The targeted use cases shall reflect real-world scenarios while still allowing a rapid implementation of the GeoDataCube Standards without adding unnecessary complexity.
OGC Members interested in staying up to date on the progress of this standard, or contributing to its development, are encouraged to join the GeoDataCube SWG via the OGC Portal. Non-OGC members who would like to know more about participating in this SWG are encouraged to contact the OGC Standards Program.
-
12:44
OGC Forms new Agriculture Information Model Standards Working Group
The Open Geospatial Consortium (OGC) is excited to announce the formation of the OGC Agriculture Information Model Standards Working Group (AIM SWG).
The purpose of the AIM SWG is to develop, publish, and maintain an Agriculture Information Model (AIM) to support interoperability of information in the Agriculture Domain, with emphasis on the re-use of generic OGC standards as appropriate.
AIM will provide a common language for agriculture applications to harmonize and improve data and metadata exchange by defining the required data elements, including concepts, properties, and relationships relevant to agriculture applications, as well as their associated semantics/meaning for information exchange.
AIM will be a multi-tier and modular domain model that aligns, profiles, and/or extends well known agriculture-related and generic ontologies, including those published by OGC. Publishing such a domain model as a modular ontology is a new and innovative approach. As such, the SWG will identify best practices for this approach, as well as develop a series of complementary models.
AIM will be published as both human and implementation-ready machine-actionable resources. Machine-actionable resources include the canonical ontology representation of the AIM in the Web Ontology Language (OWL) as well as other related artifacts to support implementation.
The SWG will develop implementations of the AIM model compatible with OGC APIs, including:
- JSON schemas supported by OGC APIs;
- JSON-LD contexts allowing identification and validation of AIM-compliant data; and
- SHACL shapes that enable the validation of data against AIM semantics.
In addition, other forms may be derived or supplied to support the reuse of the AIM model according to requirements identified by the SWG. For example:
- UML representation of the AIM conceptual model;
- UML representation of one or more logical models for AIM implementation; and
- Formal profiles for implementation of AIM using GeoJSON, FG-JSON, CoverageJSON, and other relevant generic schemas.
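To show how the SHACL shapes listed among the machine-actionable deliverables above could be used in practice, here is a small, hedged sketch with pyshacl. The aim: namespace, class, and property names are invented placeholders, not the actual AIM vocabulary.

```python
# Hedged sketch: validating a data graph against a SHACL shape with pyshacl.
from rdflib import Graph
from pyshacl import validate

shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix aim: <https://example.org/aim#> .

aim:FieldShape a sh:NodeShape ;
    sh:targetClass aim:Field ;
    sh:property [
        sh:path aim:cropSpecies ;       # every Field must state at least one crop species
        sh:minCount 1 ;
        sh:datatype xsd:string ;
    ] .
"""

data_ttl = """
@prefix aim: <https://example.org/aim#> .

aim:field42 a aim:Field .              # no aim:cropSpecies, so validation should fail
"""

conforms, _report_graph, report_text = validate(
    Graph().parse(data=data_ttl, format="turtle"),
    shacl_graph=Graph().parse(data=shapes_ttl, format="turtle"),
)
print("Conforms:", conforms)
print(report_text)
```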
In line with OGC policies and FAIR principles, the AIM will be published using persistent and resolvable URI identifiers, consistent with OGC Naming Authority processes for publishing semantic resources.
OGC Members interested in staying up to date on the progress of the Agriculture Information Model, or contributing to its development, are encouraged to join the AIM SWG via the OGC Portal. Non-OGC members who would like to know more about participating in this SWG are encouraged to contact the OGC Standards Program.
-
12:41
OGC Forms new Analysis Ready Data Standards Working Group
The Open Geospatial Consortium (OGC) is excited to announce the formation of the OGC Analysis Ready Data Standards Working Group (ARD SWG).
The ARD SWG, in partnership with ISO/TC 211, will develop a multi-part Standard for geospatial Analysis Ready Data that builds upon work undertaken in the Committee on Earth Observation Satellites (CEOS) Land Surface Imaging Virtual Constellation (LSI-VC) and Analysis Ready Data (ARD) Oversight Group, OGC Disaster Pilot 2021, and OGC Testbed-16.
The concept of ARD was initially developed by CEOS, which defines ARD as “satellite data that have been processed to a minimum set of requirements and organized into a form that allows immediate analysis with a minimum of additional user effort and interoperability both through time and with other datasets.” Adopting the CEOS-ARD definition as a starting point, the OGC ARD SWG will extend the scope of ARD from satellite data to all geospatial data.
A major strength of geospatial and location technologies is their ability to integrate and analyze data from diverse providers concerning many different phenomena so as to better understand or predict what is happening in a given area. However, this diversity of data means that preparing the acquired data for integration and analysis remains a time-consuming task. Furthermore, many geospatial data users lack the expertise, infrastructure, and internet bandwidth to efficiently and effectively access, preprocess, and utilize the growing volume of geospatial data available for local, regional, and national decision-making.
The charter supporters recognize that formal Standardization of the concepts developed through CEOS-ARD is necessary to achieve broad uptake, particularly by the commercial sector. Defining ARD through international standards bodies, such as OGC and ISO, will also help promote the concept and help avoid the divergence that can be caused by various groups working towards different interpretations of the concept.
As such, the OGC ARD SWG, which includes a number of CEOS representatives, will work jointly with the planned ISO/TC 211 ARD Standard project team to define a multi-part Standard that specifies a set of minimum requirements that a geospatial data product shall meet in order for the product to qualify as an ARD product.
The CEOS-ARD concept and Specifications were tested, evaluated, and assessed by OGC during Testbed-16, the results of which were published in the ARD Engineering Report in 2020. Building on this, space agencies participating in the OGC Disaster Pilot 2021 introduced a number of CEOS-ARD products into the disaster decision-making process, greatly simplifying the use of satellite data in disaster-related decision making.
The Disaster Pilot 2021 concluded with a need to broaden the ARD concepts to cover other types of geospatial data and to create international ARD standards through the formal standard-setting processes of ISO and OGC. Therefore, the OGC Disaster Pilot 2022 set an action item to start an OGC ARD Standards Working Group to work jointly with the project team in ISO TC 211 to develop joint ISO-OGC standards on geospatial ARD.
OGC Members interested in staying up to date on the development of the Analysis Ready Data Standard, or contributing to its development, are encouraged to join the Analysis Ready Data SWG via the OGC Portal. Non-OGC members who would like to know more about participating in this SWG are encouraged to contact the OGC Standards Program.
-
15:00
OGC to host European Innovation Days and more at Data Week Leipzig ‘23
The Open Geospatial Consortium (OGC), working with the Digital Unit of the City of Leipzig, is excited to host several sessions at Data Week Leipzig ‘23, which runs from June 26-30, 2023, in Leipzig, Germany.
Data Week Leipzig is an innovative networking and exchange event that highlights scientific, economic, and social perspectives of data and its use, and where industry, citizens, science, and public authorities can enter into dialogue. Special topics of Data Week Leipzig 2023 are the European Green Deal, NetZero Cities, and sustainable, resilient development. Digital strategies will be presented and discussed from the European to the local Leipzig City level.
From Monday to Wednesday, OGC will present across topics such as Data Spaces, The European Green Deal, Urban Platforms, Semantics, FAIR data and services, and some of the numerous EC-funded projects that OGC is involved with.
In addition to discussions surrounding EU Data Spaces, you are invited to extend your abstract and presentation and submit a scientific paper for a special issue of the open access journal Remote Sensing with the theme “Earth Observation Data in Environmental Data Spaces.”
Monday at Data Week Leipzig ‘23 has the theme of “European Innovation Day” and will paint the Data Spaces vision that Europe is moving towards with the EU Digital Strategy. Across keynotes, demos, and podium discussions, political frames will be discussed in line with technical possibilities and challenges. Findings of the GeoE3 project will be one of the central topics of this day. Several sessions on Data Spaces and Digital Twins will be held across Monday and Tuesday, while Wednesday will focus on Semantics and the “OGC Rainbow” – which represents the state-of-the-art in the context of semantic interoperability.
OGC and the other Data Week Leipzig organisers welcome attendees to Leipzig to learn and share experiences about the concepts and visions of Data Spaces, Urban Platforms, and other technical solutions that follow the FAIR principles that ensure data remains Findable, Accessible, Interoperable, and Reusable.
A special session is dedicated to the recently launched Location Innovation Academy – a free online training program based on the knowledge and ideas generated by the European GeoE3 project. The academy offers a comprehensive set of modules that support modern and cross-border management, integration, processing, and sharing of geospatial data. Courses are available for beginners through to experienced technical experts who want to deepen their skills in, for example, the OGC API family of Standards. The current content of the modules has been produced by an international team of experts from across Europe.
See the full event schedule and register at the Data Week Leipzig website.
-
15:00
OGC Adopts 3D Tiles v1.1 as Community Standard
The Open Geospatial Consortium (OGC) is excited to announce that the OGC Membership has approved version 1.1 of 3D Tiles for adoption as an official OGC Community Standard. 3D Tiles is used for sharing, visualizing, fusing, and interacting with massive heterogeneous 3D geospatial content across desktop, web, mobile – and now metaverse – applications.
Previously referred to as “3D Tiles Next,” Version 1.1 of the 3D Tiles Community Standard is designed for streaming high-resolution, semantically-rich 3D geospatial data to the metaverse. 3D Tiles 1.1 promotes several 3D Tiles 1.0 extensions to ‘core’ and introduces new glTF™ extensions for fine-grained metadata storage. The OGC Community Standard is identical to the Cesium release of version 1.1 of the 3D Tiles specification.
“The collective community experience building with 3D Tiles since 2015, combined with the continued growth of 3D geospatial data availability, especially semantic metadata, and increasing user interest in digital twins and the metaverse has led to this next generation of the 3D Tiles specification,” said Patrick Cozzi, CEO of Cesium. “3D Tiles 1.1 being recognized as an OGC Community Standard encourages an open ecosystem for the benefit of all where the community can leverage the open specification to advance the 3D geospatial industry.”
The primary enhancements in the 3D Tiles version 1.1 include:
- Semantic metadata at multiple granularities;
- Implicit tiling for improved analytics and random access to tiles;
- Multiple contents per tile to support layering and content groupings; and
- Direct references to glTF™ content for better integration with the glTF™ ecosystem.
3D Tiles 1.1 is backwards compatible with 3D Tiles 1.0: valid 1.0 tilesets are also valid 1.1 tilesets.
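For orientation, here is a hedged sketch of a minimal 3D Tiles 1.1 tileset description assembled in Python; the bounding volume values and the glTF content URI are placeholders rather than a real dataset.

```python
# Hedged sketch: writing a minimal 3D Tiles 1.1 tileset.json with placeholder values.
import json

tileset = {
    "asset": {"version": "1.1"},
    "geometricError": 500.0,
    "root": {
        # Bounding region: [west, south, east, north] in radians, then min/max height in metres.
        "boundingVolume": {"region": [0.19, 0.83, 0.21, 0.85, 0.0, 120.0]},
        "geometricError": 50.0,
        "refine": "REPLACE",
        # 3D Tiles 1.1 allows glTF/GLB content to be referenced directly.
        "content": {"uri": "buildings.glb"},
    },
}

with open("tileset.json", "w", encoding="utf-8") as f:
    json.dump(tileset, f, indent=2)
```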
3D Tiles was first announced at SIGGRAPH in 2015, and was published as an OGC community standard in 2019. Since then, the community has built apps, exporters, APIs, and engines with 3D Tiles to grow an open and interoperable 3D geospatial ecosystem. This collective experience building with 3D Tiles, combined with the continued growth of 3D geospatial data availability, especially semantic metadata, and increasing user interest in digital twins and the metaverse, has led to this revision of the 3D Tiles specification.
A Community Standard is an official standard of OGC that is developed and maintained external to the OGC. The originator of the standard brings to OGC a “snapshot” of their work that is then endorsed by OGC membership as a stable, widely implemented standard that becomes part of the OGC Standards Baseline.
As with any OGC standard, the open 3D Tiles v1.1 OGC Community Standard is free to download and implement. Interested parties can learn more about the standard on OGC’s 3D Tiles Community Standard Page.
-
15:00
OGC API – Common – Part 1: Core Adopted as Official OGC Standard
The Open Geospatial Consortium (OGC) is excited to announce that the OGC Membership has approved Version 1.0 of the OGC API – Common – Part 1: Core specification for adoption as an official OGC Standard.
In recent years, OGC has extended its suite of Standards to include Resource Oriented Architectures and Web APIs. In the course of developing these Standards, some practices proved to be common across multiple OGC Web API Standards. These common practices are documented in the multi-part OGC API – Common Standard for use as foundational building blocks in the construction of other OGC Standards that relate to Web APIs.
The Standard seeks to establish a solid foundation that can be extended by other resource-specific Web API Standards. This consistent foundation for Standards development will result in a modular suite of coherent API standards that can be adapted by a system designer for the unique requirements of their system.
The OGC API – Common – Part 1: Core Standard provides the fundamental rules for implementing a Web API that conforms to OGC design parameters. First, this OGC Standard establishes rules for the use of HTTP protocols and Uniform Resource Identifiers (URIs), regardless of the resources being accessed. It then enables discovery operations directed against a Web API implementation, such as identifying the hosted resources, defining conformance classes, and providing both human- and machine-readable documentation of the API design.
The requirements specified in the Standard are envisaged to be applicable to any Web API implementation. Indeed, at the time of publication of OGC API – Common – Part 1: Core, the Standard had already been validated by several other OGC API Standards, including OGC API – Environmental Data Retrieval and OGC API – Tiles. This ‘validate-first’ approach has ensured that OGC API – Common – Part 1: Core presents a harmonized view of the building blocks that are common to all OGC API Standards.
The OGC API – Common – Part 1: Core Standard document is specified as a series of building blocks advertised through a building blocks register. Future parts of the Standard will provide further building blocks that extend the functionality. For example, future parts will document how to organize and describe collections of resources, or how to define operations for the discovery and selection of individual collections.
To enable software developers to rapidly implement products that support OGC API – Common, example API definition files and associated schemas are available on the OGC API – Common website. The API definition files conform to Version 3.0 of the OpenAPI Specification, and thus can be easily integrated into many of the Web APIs that are described using the OpenAPI Specification.
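As a brief, hedged sketch of the discovery operations described above, the snippet below queries a placeholder OGC API deployment for its landing page and conformance declaration; the base URL is an assumption.

```python
# Hedged sketch: discovery against an OGC API implementation (landing page + conformance).
import requests

BASE = "https://example.org/ogcapi"  # hypothetical OGC API deployment

# Landing page: links to the API definition, conformance declaration, and hosted resources.
landing = requests.get(BASE, headers={"Accept": "application/json"}, timeout=30).json()
for link in landing.get("links", []):
    print(link.get("rel"), "->", link.get("href"))

# Conformance declaration: the conformance classes the server claims to implement.
conformance = requests.get(
    f"{BASE}/conformance", headers={"Accept": "application/json"}, timeout=30
).json()
for uri in conformance.get("conformsTo", []):
    print(uri)
```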
Anyone interested in following the future development of OGC API – Common is welcome to engage with the OGC API – Common GitHub Repository. OGC Members interested in staying up to date on the progress of this Standard, or contributing to its development, are encouraged to join the OGC API – Common SWG via the OGC Portal.
As with any OGC Standard, Version 1.0 of the OGC API – Common – Part 1: Core Standard is free to download and implement.
-
17:49
OGC to Present at EGU General Assembly 2023
The Open Geospatial Consortium (OGC) is excited to attend and present at the upcoming European Geosciences Union General Assembly 2023 (EGU23), taking place next week, 23-28 April, in Vienna, Austria.
OGC staff will present in various sessions at EGU23, discussing our Members’ and Partners’ work – including several European Commission and Horizon 2020 projects – conducted under OGC’s Collaborative Solutions and Innovation (COSI) Program (formerly the Innovation Program).
In addition to these presentations, OGC Staff will also be on-hand at the Open Data Help desk, where they will be available to answer all kinds of questions about Open Data, Open Standards, and the applications & services that are made possible by them.
OGC Staff at the Open Data Help Desk will also demonstrate the Location Innovation Academy, which launched earlier this month. The Location Innovation Academy is a free online e-learning platform, developed in the context of the Geospatially Enabled Ecosystem for Europe (GEOE3) project, that offers many tutorials on how to handle data and implement services according to FAIR principles and using OGC technologies.
A session at EGU23 specifically dedicated to FAIR principles will be co-chaired by OGC on Wednesday afternoon: ESSI1.8 – Challenges and Opportunities for Findable, Accessible, Interoperable and Re-usable Training Dataset.
Attendees will have the opportunity to see and discuss the latest developments in such OGC-related projects as:
- e-shape: the flagship European project bringing together key European actors to ensure the optimal implementation of EuroGEO and the delivery of EO-based benefits to a wide range of stakeholders;
- CLINT: developing an AI/ML framework for processing large climate datasets, helping climate scientists better identify the causes of extreme events;
- FMSDI, the multi-phase OGC Initiative with the goal of making Marine Spatial Data Infrastructures more powerful and user-oriented;
- Iliad, an interoperable, data-intensive, and cost-effective Digital Twin of the Ocean;
- InCASE, how FAIR principles can be applied to in-situ data to support European environmental monitoring activities, especially around climate adaptation policies;
- and more.
For those not able to attend the EGU conference, the upcoming OGC Member Meeting in Huntsville, Alabama, USA, 5-9 June, will be of interest. The next European conference with a strong OGC presence will be Data Week Leipzig, Germany, on 26-28 June, where OGC is organizing the European Innovation Days event in the context of GeoE3. There, the ongoing discussion around the European data spaces will be continued and refined.
When and where to find OGC at EGU23:
Monday, April 24
- 16:15-18:00 Session GI6.2 – The Remote Sensing and UASs approaches in Geoscience Research Platforms for the 21st century.
- Talk: Common mission planning and situation awareness model for UxS Command and Control systems, by Teodor Hanchevici, Piotr Zaborowski et al.
Tuesday, April 25
- 14:00-15:45 Session ESSI3.1 – In-situ Earth observation and geospatial data sharing and management as key basis for the climate emergency understanding.
- Poster in Hall X4 at board number X4.229: EGU23-6359: G-reqs: How a user requirements system in GEO can improve the in-situ data availability? by Alba Brobia et al.
- Poster in Hall X4 at board number X4.226: EGU23-11480: New Resources promoting the GEO Data Sharing and Management, FAIR, and CARE principles by Marie-Francoise Voidrot et al.
Wednesday, April 26
- 8:30-10:10 Session ESSI 2.2 – Data Spaces: Battling Environmental and Earth Science Challenges with Floods of Data.
- Talk: Environmental data value stream as traceable linked data – Iliad Digital Twin of the Ocean case, by Piotr Zaborowski et al.
- 16:15-18:00 Session ESSI1.8 – Challenges and Opportunities for Findable, Accessible, Interoperable and Re-usable Training Dataset. The entire session is chaired by OGC (Nils Hempelmann et al.).
-
15:00
Developers Invited to the 2023 OGC Tiling Interfaces Code Sprint
The Open Geospatial Consortium (OGC) invites software developers to the OGC Tiling Interfaces Code Sprint, to be held online and in-person from June 12-14, 2023, at the Moonshot Labs of the National Geospatial-Intelligence Agency (NGA) in St Louis, Missouri, USA. Participation in the code sprint is free. Registration for in-person participation closes at 5pm EDT on May 10. Registration for remote participation will remain open throughout the code sprint.
An OGC Code Sprint is a collaborative and inclusive event to support the development of new applications and open standards, as well as to enable software developers to focus on projects that implement open geospatial standards.
OGC Code Sprints experiment with emerging ideas in the context of geospatial standards, help improve interoperability of existing standards by experimenting with new extensions or profiles, and are used for building or enhancing software products to implement the standards.
Newcomers to OGC Standards are welcomed at the sprint. The mentor stream is designed to bring developers up-to-speed on the Standards and Projects that form the focus of the sprint, while also providing them with know-how and experience that they can use and build upon even after the sprint has concluded.
The code sprint will focus on the following Standards and specifications:
- OGC API – Tiles defines building blocks for creating Web APIs that support the retrieval of geospatial information as tiles.
- OGC API – Maps describes an API that can serve spatially referenced and dynamically rendered electronic maps.
- Changesets API is based on outcomes from OGC Testbed-15 and provides the foundation for a ‘Transactional Tiles API Extension’ for OGC API – Tiles.
- Vector Tiles Extension to GeoPackage that describes a prototype extension of the OGC GeoPackage Standard to support the use of vector tiles technology.
- Variable Width Tile Matrix is a grid suited for the whole globe that keeps the data in a geographic Coordinate Reference System.
- Web Map Tile Service (WMTS) defines a web service that can serve map tiles of spatially referenced data using tile images with predefined content, extent, and resolution. Developers of implementations of the OGC WMTS Standard, DGIWG WMTS profile, and the NSG WMTS profile are particularly encouraged to attend.
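As a hedged sketch of what working with OGC API – Tiles (listed above) looks like in practice, the snippet below requests a single vector tile. The server URL and collection id are placeholders, while the path follows the Standard's collections/{collectionId}/tiles/{tileMatrixSetId}/{tileMatrix}/{tileRow}/{tileCol} pattern.

```python
# Hedged sketch: fetching one vector tile via OGC API - Tiles (placeholder server and collection).
import requests

BASE = "https://example.org/ogcapi"          # hypothetical OGC API deployment
collection = "buildings"                     # hypothetical tiled collection
tms, z, row, col = "WebMercatorQuad", 12, 1410, 2150

url = f"{BASE}/collections/{collection}/tiles/{tms}/{z}/{row}/{col}"
response = requests.get(
    url, headers={"Accept": "application/vnd.mapbox-vector-tile"}, timeout=30
)
response.raise_for_status()

with open(f"{collection}_{z}_{row}_{col}.mvt", "wb") as f:
    f.write(response.content)
print("Saved", len(response.content), "bytes")
```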
Also encouraged at the Code Sprint are non-coding activities such as testing, working on documentation, or reporting issues.
The code sprint begins at 9am EDT on June 12 with an onboarding session, and ends at 5pm EDT each day. Registration is open now. Registration for in-person participation closes at 5pm EDT on May 10. Registration for remote participation will remain open throughout the code sprint.
Learn more about the Code Sprint, including venue information and how to register on the 2023 OGC Tiling Interfaces Code Sprint website.
To learn more about future and previous OGC code sprints, visit the OGC Code Sprints webpage or join the OGC-Events Discord Server.
-
18:34
A recap of the 125th OGC Member Meeting, Frascati, Italy
From February 20-24, 2023, more than 200 geospatial experts from industry, government, and academia traveled to Frascati, just outside of Rome, Italy, for OGC’s 125th Member Meeting – with over a hundred more attending virtually. Hosted and sponsored by the European Space Agency (ESA), the meeting had the theme of “Space and Geospatial.”
The week saw the usual abundance of Standards Working Group and Domain Working Group meetings, networking & social events, and the annual Gardels Award presentation, as well as a wealth of special sessions, including a Climate Resilience Pilot demonstration; a Marine Special Session; an Intelligent Transportation Systems ad hoc; a Space Standards ad hoc; a Health Summit; a GeoDatacube ad hoc; a Geotech Interoperability Experiment (IE) session; an OGC Startups and Scaleups session; and the OGC Europe Forum.
Two areas of focus during the meeting were Space Standards and Connected Systems.
Space Standards: OGC members are discussing the requirements for standardization of space data and environments of operation. Such Standards could represent location reference systems beyond the Earth, help track and locate space debris, and provide a reference for space weather data. Expect to see refinement of work from Testbed-18 in the coming year as reflected in both experimentation and standardization. See below for a brief overview of the Space Standards ad hoc, as well as a link to presentations and a recording.
Connected Systems: OGC has long published Standards and Best Practices associated with sensors and sensor networks. Some of these Standards have long been used in government and private sector systems, but may be less responsive to current trends in IT. OGC’s SensorThings API – developed as a lightweight interface to Internet of Things devices and the OGC APIs – shows a pattern for RESTful integration of geospatial data. The SensorML SWG has been rechartered as the Connected Systems SWG and will be working with legacy and emerging OGC sensor Standards and concepts to create a framework for modern sensor integration.
The full agenda for the 125th OGC Member Meeting is available here. Read on for an overview of the best bits, below.
OGC CEO Dr. Nadine Alameh presenting her CEO update
It Begins
The Monday morning Kick-off Session welcomed attendees and provided an overview of what’s been happening at OGC since the previous meeting, the 124th OGC Member Meeting, as well as an insightful presentation by Dr. Rune Floberghagen from meeting hosts, ESA.
Dr. Rune Floberghagen, Head of the Science, Applications and Climate Department in the Directorate for Earth Observation Programmes, ESA, provided an overview of ESA programs, with a heavy focus on the use of and contribution to OGC Standards and activities as well as highlighting numerous cases of the practical application of OGC Standards and practices.
Dr. Rune Floberghagen, Head of the Science, Applications and Climate Department in the Directorate for Earth Observation Programmes, ESA
OGC’s Dr. Nadine Alameh then provided her CEO Update, which included the announcement that OGC’s Innovation Program has been renamed to better describe its activities: it is now the OGC Collaborative Solutions and Innovation (COSI) Program.
Nadine then held her regular fireside chat, this time chatting with Amanda Morgan, Technical Executive for GEOINT and IT Standards, US NGA. The two discussed the need to highlight the value of Standards and provided some guidance and insight to young professionals looking to progress their careers.
I, Scott Simmons, Chief Standards Officer of OGC, then took to the stage to provide some logistics information for the week and welcomed the delegation from the State Service of Ukraine for Geodesy, Cartography and Cadastre, whose membership has been sponsored by Strategic Member Natural Resources Canada and travel to this Member Meeting offset by Strategic Members UK Hydrographic Office and Ordnance Survey. Many thanks go out to these members for their generosity.
Trevor Taylor, Senior Director, Member Success and Development at OGC, welcomed new members and detailed the new OGC membership model now in place.
Finally, Jonathan Fath, OGC’s Director, Marketing and Promotions, revealed the new OGC public website, which, as you’re here reading this, I’m going to guess you’re aware of.
Networking, Dinner, and the Gardels Award
On top of the conversations in the hallways, over lunch, and during meetings, no OGC Member Meeting is complete without the Monday evening networking drinks, nor Wednesday night’s Networking Dinner. Held at Ristorante Cacciani in Frascati, the Networking Dinner also included the presentation to Steve Liang of the 2022 Gardels Award. Steve was chosen for the award for his work in sensor technologies for geospatial and pioneering web standards for publishing spatial data. Congratulations, Steve!
OGC 2022 Gardels Award winner, Steve Liang (L), OGC CEO Dr. Nadine Alameh (C), and OGC Chief Standards Officer, Scott Simmons (R)
Meeting Special Sessions
Special sessions at this Member Meeting included a Climate Resilience Pilot demonstration; Marine Special Session; Intelligent Transportation Systems ad hoc; Space Standards ad hoc; Health Summit; GeoDatacube ad hoc; Geotech Interoperability Experiment (IE) session; OGC Startups and Scaleups session, and the OGC Europe Forum. Read on for an overview of each – as well as a link for members to access the recording and slides for each.
Climate Resilience Pilot Demo: A demonstration session was held by the participants in the OGC Climate Resilience Pilot to highlight the topics that will be addressed in the newly started Pilot. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Marine Special Session: Marine topics continue to be of high interest to OGC members. A half-day special session was held to discuss “Connecting Land and Sea.” The first half of the session focused on past Innovation projects such as the Phases I & II of the Federated Marine Spatial Data Infrastructure (FMSDI) Pilot and marine matters addressed in OGC Testbed-18. The second half summarized the work of Phase III of the FMSDI Pilot and presented an Engineering Report documenting the work. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Intelligent Transportation Systems ad hoc: Discussions about intelligent infrastructure have occurred at several past OGC Member Meetings, mostly concerning road networks. This effort continued in Frascati with an Intelligent Transportation Systems session that focused on the ISO / TC 204 Geographic Data Files Standard and how OGC might become more involved in working on that Standard in the future. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Space Standards ad hoc: OGC’s Testbed-18 included work on “Space Standards.” As might seem obvious, the scope and nature of Standards that apply to space beyond the Earth is open to quite broad interpretation. A special session was held to identify the landscape of likely standardization activities in the space domain from US NGA, ESA, European Community Defence, and supporting researchers in Europe and North America. The outputs of this session will be documented to help identify priorities for further investigation. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Health Summit: Wednesday saw a day-long Health Summit. Organized by the OGC Health DWG, the summit sought to “provide a forum for geo/location experts to learn what clinicians need in order to advance the practice of healthcare, and for clinicians to explore the possibilities unleashed when location/geo data is incorporated as part of solutions.”
Also included in the summit was a “Startup, space, and geospatial panel” consisting of Pablo Fuentes from MakePath, Ugo Celestino from the European Commission, Gianluigi Baldesi, PhD from ESA, Steve Liang from SensorUp, and moderated by Anilkumar Dave, VC Fund Partner & Space Economy Advisor. OGC Members can access the presentations and a recording on this page in the OGC Portal.
The Startup, Space and Geospatial panel, consisting of: Pablo Fuentes, MakePath (not shown), Ugo Celestino, EC (L), Gianluigi Baldesi, PhD, ESA (M), Steve Liang, SensorUp (R), and moderated by Anilkumar Dave, VC Fund Partner & Space Economy Advisor (R)
GeoDatacube ad hoc: Multidimensional data associated with geospatial information (referred to here as GeoDataCubes) can be encoded and utilized in a number of formats, including existing OGC Standards such as NetCDF, HDF5, Coverage Implementation Schema, Zarr, and more. A GeoDataCube SWG has been proposed to identify a possible API for accessing such data, along with the Standards and metadata required to bridge across encodings. OGC Members can access the presentations and a recording on this page in the OGC Portal. You may also want to read a blog post outlining the 2021 Towards Data Cube Interoperability Workshop on the OGC Blog here: A User-centric Approach to Data Cubes.
Geotech Interoperability Experiment (IE) session: OGC members have organized the GeoTech Interoperability Experiment (IE) to develop a common model for describing geotechnical engineering data that is spatially-referenced. This experiment is also intended to link the information communities from the OGC with those from buildingSMART International and a variety of geotechnical engineering organizations. A special session of the Geoscience DWG was held to discuss the results to date of this IE. OGC Members can access the presentations and a recording on this page in the OGC Portal.
OGC Startups and Scaleups session: OGC Member Meetings include a session that allows Startup members to highlight their capabilities. The “Startups and Scaleups” session in Frascati included presentations from SpaceSense, Prométhée, Duality Robotics, and Terrsigna. OGC Members can access the presentations and a recording on this page in the OGC Portal.
OGC Europe Forum: Most OGC Member Meetings include a session dedicated to the local regional forum. The OGC Europe Forum met on Friday with a wide variety of presentation topics from throughout the region. Keep your eyes out for an upcoming blog post dedicated to this session and OGC’s work in Europe more broadly. In the meantime, OGC Members can access the presentations and a recording on this page in the OGC Portal.
OGC’s regular “Fireside Chat” with Amanda Morgan, Technical Executive for GEOINT and IT Standards, US NGA (L), and OGC CEO Dr. Nadine Alameh (R)
Today’s Innovation/Tomorrow’s Tech and Future Directions
OGC proactively assesses emerging technologies as they apply to the geospatial community. Some technologies will directly impact geospatial standards and interoperability, others lead to new operational environments that will drive requirements and use cases. These tech outlooks and assessments are briefed to OGC Members during the Today’s Innovation/Tomorrow’s Tech and Future Directions session, which runs unopposed on Tuesday morning’s schedule so that all meeting participants can attend.
The session focused on “Future Space Technologies.” OGC’s Dr. Gobe Hobona introduced the session, and was followed by several presentations and a panel.
Tony San José of Satellogic described the use of new satellite constellations for monitoring and assisting response to severe flooding in Albania. The use of cloud storage of imagery products allowed for rapid analysis. The presentation also summarized work from the OGC Disaster Pilot ‘21 where Satellogic performed flood analysis in South America.
Dr. Samantha Lavender of Pixalytics reflected on “the importance of customer perceptions and expectations.” She noted that a goal is to have people use data without knowing that they are using data. An example was provided on the use of machine learning to detect plastic waste, acknowledging that the algorithm required retraining for each region in which it is used.
OGC’s Rob Atkinson revisited the OGC Reference Model (ORM). He stated that “interoperability is not a one size fits all” and that it has many aspects. The current form of the ORM is no longer useful, but the concept of such a model is very much needed. A new approach must be adaptive and is planned to include a Knowledge Graph to identify and link concepts.
The speakers then joined a panel for open discussion on a diverse range of topics related to their presentations and trends in Space. Dr. Lavender and Mr. San José discussed the increasing ability to process information on the satellite that is more ready for use than raw content.
OGC Members can access the presentations and a recording of the session on this page in the OGC Portal.
The Big Hall at the ESA Centre for Earth Observation (ESRIN)
Closing Plenary
The Closing Plenary now runs as two sessions: Important things, and the formal Closing Plenary.
Important Things: this session started with my rapid, 15-minute summary of the entire meeting week. Slides and content from a large number of Working Group sessions were included. The presentation is available for OGC Members on this page in the OGC Portal.
The Important Things session then featured a discussion on the future of OGC’s “Definitions Server” suite of registries and linked data capabilities. OGC Staff and Members considered the capabilities under development and debated where these capabilities may be more relevant in OGC outputs. OGC Members can access notes from the session recorded in the Etherpad Important-Things-2023-02.
The Closing Plenary itself is now focused on presentations and voting. Günther Landgraf from ESA provided an inspirational speech on ESA activities that have impacted and been impacted by OGC. The remainder of the session advanced a number of Standards, SWGs, and documents toward vote or publication. Summary slides from the 125th OGC Member Meeting Closing Plenary are available publicly via the OGC Portal.
OGC Chief Standards Officer, Scott Simmons, addresses the audience
Thank you to our community
All in all, our 125th Member Meeting was a big success. It was wonderful seeing members interacting, collaborating, and driving technology and standards development forward. It’s especially exciting as it comes at a time when geospatial is truly everywhere – even the far reaches of space. Once again, thank you to our members for their time and energy, as well as their dedication to making OGC the world’s leading and most comprehensive community of location experts.
Be sure to join us for the 126th Member Meeting, happening June 5-9, 2023, at GEOHuntsville in the Huntsville Botanical Garden, AL, USA. Registration and further info are available now on ogcmeet.org. Sponsorship opportunities are also available: see this brochure or contact OGC for more info. Be sure to subscribe to the OGC Update Newsletter to stay up to date on all things OGC, including future OGC Member Meetings, funding opportunities, and how to contribute to our open Standards.
-
12:56
OGC SensorThings API and OGC API – Features: Good Practice for European Spatial Information
sur Open Geospatial Consortium (OGC)Post contributed by: Dr.-Ing. Jürgen Moßgraber
Deputy Head of Department “Information Management and Production Control”
Deputy Speaker of Business Unit “Energy, Environmental and Security Systems”
Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB
The European Commission has a long history of promoting open access to public data across Europe, breaking down electronic barriers at national borders through the creation of common data and service models, as well as through the provision of accompanying legislation to facilitate such endeavours. The INSPIRE Directive, which entered into force in May 2007, has been a core building block in this work.
INSPIRE established an infrastructure for spatial information in Europe in support of Community environmental policies and policies or activities that may impact the environment. INSPIRE is based on the infrastructures for spatial information established and operated by the Member States of the European Union. The Directive addresses 34 spatial data themes needed for environmental applications, with key components specified through technical implementing rules.
The API4INSPIRE Project
The API4INSPIRE Project investigated new developments in geospatial standards and technologies, foremost the new OGC API – Features and OGC SensorThings API Standards (both now INSPIRE Good Practice), together with the outcomes of the INSPIRE MIG Action 2017.2 on alternative encodings for INSPIRE data, evaluating their suitability for use in the European spatial data landscape. The study was funded within the framework of the European Location Interoperability Solutions for e-Government action (ELISE), part of the ISA Programme, and was conducted by researchers from Fraunhofer IOSB, GeoSolutions, and DataCove, all of whom have extensive experience with OGC standards and services.
The Methodology
First, an evaluation strategy was developed to determine how these new and emerging standards can best be utilized to leverage existing investments by EU Member States in the INSPIRE implementation, while also supporting new developments in e-Government and the Digital Single Market. The evaluation methodology was designed to weigh costs and benefits against each other, highlighting both strengths and weaknesses of the APIs being evaluated. On the benefits side, this includes flexibility, developer friendliness, and ease of discoverability, access, and use. From a technical point of view, alignment with the current architecture of the Web and the Spatial Data on the Web Best Practices should be assured. Cost considerations were included as well, such as infrastructural changes, the need for additional expertise, updated tooling, training, the re-engineering of existing practices, and security aspects. While quantifiable metrics would have been preferable, the effort entailed in gaining truly representative values would require a level of complexity that outweighs the assumed benefits of such quantification; it was therefore decided to focus on qualitative metrics that can be easily abstracted to different operational environments.
The Data Providers
Six data providers from Germany, France, and Austria contributed data, personnel, and infrastructure to this project and were involved in all of its phases. Together, these data providers manage data from 14 INSPIRE themes (including air and water quality, and traffic data). Use cases were defined based on the available datasets and the data providers’ experiences with their current data consumers. Priority was given to use cases involving APIs from different vendors where possible. Numerous datasets were made available online as part of the project, and further steps were taken to evaluate the usability of the data via these new APIs. This provides insights into both the usability of each API standard and the interoperability between API standards.
The Results
The evaluation focused on various facets of usability, ranging from the configuration and deployment aspects of service provision to ease of uptake of the API, and included a wide range of stakeholders within the evaluation process. The outcomes of this investigation were analysed, relevant guidance materials were created based on the insights gained, and these were widely disseminated to interested stakeholders. In addition, various Open Source software solutions and extensions were developed where gaps were identified (e.g. for GeoServer). For the provision of the SensorThings API, the Fraunhofer Open Source SensorThings Server (FROST) is already available as a free solution and allows for a much faster implementation of real-world applications. One of the use cases combined data from both the German and French sides of the river Rhine, integrating data from the German State of Baden-Württemberg (LUBW) with data from the French Geological Survey (BRGM) and the French Office for Biodiversity (OFB) to provide alternative perspectives on the river (see image).
By integrating data from the German State of Baden-Württemberg (LUBW) with that stemming from the French Geological Survey (BRGM) and the French Office for Biodiversity (OFB) users are able to see a more holistic perspective of the Rhine
A central result of the project is the recommendation of both OGC API – Features and OGC SensorThings API Standards as “good practice” for the provision of INSPIRE data. The documentation of the evaluation method and deployment strategies developed for standards-based APIs, as well as the practical experiences of deploying and using the APIs in the context of the defined use cases, can be accessed on the API4INSPIRE project page.
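For readers unfamiliar with the SensorThings API recommended above, its query model follows an OData-style pattern over entity sets such as Things, Datastreams, and Observations, using parameters like $filter, $orderby, $expand, and $top. The minimal Python sketch below retrieves recent observations from a SensorThings v1.1 endpoint such as a FROST deployment; the server URL and Datastream id are illustrative placeholders, not services published by the API4INSPIRE project.

```python
import requests

# Hypothetical SensorThings API v1.1 root; substitute a real FROST-Server deployment.
STA_ROOT = "https://example.org/FROST-Server/v1.1"

# Fetch the ten most recent observations of a Datastream (id 42 is a placeholder),
# expanding the Datastream so its name and unit of measurement come back inline.
params = {
    "$top": 10,
    "$orderby": "phenomenonTime desc",
    "$expand": "Datastream($select=name,unitOfMeasurement)",
}
resp = requests.get(f"{STA_ROOT}/Datastreams(42)/Observations", params=params, timeout=30)
resp.raise_for_status()

for obs in resp.json().get("value", []):
    print(obs["phenomenonTime"], obs["result"])
```

Larger result sets can be paged through by following the @iot.nextLink property that the service returns alongside each page of results.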
The post OGC SensorThings API and OGC API – Features: Good Practice for European Spatial Information appeared first on Open Geospatial Consortium.
-
15:00
Announcing the Launch of the Location Innovation Academy
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC) is pleased to announce the launch of the Location Innovation Academy – a free online training program based on the knowledge and ideas generated by the European GeoE3 project. A webinar launching the academy is scheduled for April 5; additional webinars will follow.
Make sure you register early!
The academy offers a comprehensive set of modules that support modern and cross-border management, integration, processing, and sharing of geospatial data. Courses are available for beginners through to experienced technical experts who want to deepen their skills in, for example, the OGC API family of Standards. The current content of the modules has been produced by an international team of experts from across Europe.
The constantly evolving online training package currently includes three different courses, from which the learner can choose the modules to develop their skills: Data Management; Service Management; and Data and Service Integration.
The Location Innovation Academy is currently targeted to national mapping agencies, meteorological institutions, and other organizations producing or using geospatial data from different countries. However, in practice anyone can join the academy and start studying. You can start your studies by signing up to the Location Innovation Academy for free.
The Location Innovation Academy, developed by the European GeoE3 project, serves as the experimentation platform for a future OGC Academy. OGC is involved in GeoE3 as a project partner and will now enhance the academy step by step together with the Location Innovation Hub.
OGC encourages everyone to explore the academy and provide feedback that allows us to enhance usability, content, and look and feel of the academy. Other projects or organizations are welcomed and encouraged to produce new learning modules as part of the Location Innovation Academy in the future. Feedback can be provided directly on the Location Innovation Academy website.
More information on the academy and the courses offered is available on GeoE3’s Location Innovation Academy information page. The Location Innovation Academy is hosted by OGC at academy.ogc.org.
About GeoE3
GeoE3 is a project co-financed by the Connecting Europe Facility of the European Union that has provided the vital connection between existing and emerging National, Regional, and Cross-Border digital services. The action provides dynamic integration of high-value data sets and services (e.g. meteorological or statistical data) with geospatial features from existing national geospatial data platforms (e.g. building data or road network data). This simplifies meaningful analysis and visualization in a national and cross-border context. By enabling Open Public Data Digital Service Infrastructure, GeoE3 develops fundamental services to better serve the European citizen through the enhanced availability, interoperability, and integration of services.
Learn more at geoe3.eu
About the Location Innovation Hub
The Location Innovation Hub is a center of excellence in location information coordinated by the Finnish Geospatial Research Institute. LIH services are produced in conjunction with a partner network.
The Hub’s activities are aimed at making efficient use of data spaces, for example on developing digital twins. The goal is to create a functional, permanent location innovation ecosystem for seeking funding, testing services, and developing new innovations in Europe.
Learn more at locationinnovationhub.eu
The post Announcing the Launch of the Location Innovation Academy appeared first on Open Geospatial Consortium.
-
12:28
The new v5.5 of TEAM Engine on the OGC Validator
sur Open Geospatial Consortium (OGC)OGC has recently released to production version 5.5 of TEAM Engine on the OGC Validator. TEAM Engine is the open source software product used for validating compliance to OGC Standards. The production release of this software follows a period of more than 8 months of Beta testing during which the geospatial community thoroughly tested TEAM Engine and shaped it with their feedback.
Key improvements in this release of TEAM Engine include:
- Enhancements to the REST API, including improvements to the access control mechanism
- Improvements to the user interface and report generation facility
- Improvements to error handling
- Addition of Form Validation for Executable Test Suites
- Updated and improved documentation
Anyone looking to test a product for compliance to OGC Standards can do so, for free, using the hosted OGC Validator. Those implementers looking to run TEAM Engine locally can access the source code on GitHub and either install the software locally or run the software through Docker.
Implementers are encouraged to submit their products for certification once they have confirmed that their products pass the compliance tests. Prices for certification can be found on the Compliance section of the OGC website.
What’s underneath the hood?
TEAM Engine is built as a Java web application that runs on Apache Tomcat. The modules containing compliance tests are referred to as Executable Test Suites (ETS), to distinguish them from the Abstract Test Suites (ATS) that are found in the associated Standards documents. An ETS provides the executable code that implements the testing method specified in the corresponding ATS.
Newer compliance tests in TEAM Engine are implemented using the TestNG framework, whereas older compliance tests were implemented using the Compliance Test Language (CTL). TestNG is one of the most popular automated testing frameworks available. CTL is an XML grammar for documenting and scripting test suites for verifying compliance to a specification. TEAM Engine 5.5 supports both TestNG and CTL.
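For implementers who want to script validation, for example in a CI pipeline, TEAM Engine’s REST API can be invoked over plain HTTP. The Python sketch below is a hypothetical illustration only: the base URL assumes a local deployment, and the suite identifier and the ‘iut’ (instance under test) parameter differ between Executable Test Suites, so consult the TEAM Engine and ETS documentation for the exact paths and parameters.

```python
import requests

# Hypothetical local TEAM Engine deployment (e.g. started from the Docker image).
TE_ROOT = "http://localhost:8080/teamengine"

# List the Executable Test Suites this deployment exposes (path is illustrative).
suites = requests.get(f"{TE_ROOT}/rest/suites", timeout=30)
print(suites.status_code)

# Run a suite against an instance under test. The suite code and the 'iut'
# parameter name are placeholders that vary by Executable Test Suite.
run = requests.get(
    f"{TE_ROOT}/rest/suites/ogcapi-features-1.0/run",
    params={"iut": "https://example.org/ogcapi"},
    headers={"Accept": "application/xml"},
    timeout=600,
)
run.raise_for_status()
print(run.text[:500])  # test report (format depends on the Accept header)
```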
What’s next for TEAM Engine?
Now that TEAM Engine v5.5 has been released, multiple outreach events are planned to support the geospatial community with the deployment and use of this new release. The following have been confirmed and several more are in planning:
- TEAM Engine session at the 2023 Open Standards and Open Source Software Code Sprint, April 25-27, 2023. Register on the Code Sprint website.
- Compliance Interoperability & Testing Evaluation (CITE) session at the 126th OGC Member Meeting in Huntsville, Alabama, June 5-9, 2023.
If you have any questions regarding TEAM Engine or the OGC Validator, please contact the OGC Compliance Program.
The post The new v5.5 of TEAM Engine on the OGC Validator appeared first on Open Geospatial Consortium.
-
11:52
Steve Liang receives OGC’s 2022 Gardels Award
sur Open Geospatial Consortium (OGC)Last night at the 125th Open Geospatial Consortium (OGC) Member Meeting, held in Frascati, Italy, Steve Liang was presented the OGC’s prestigious Kenneth D. Gardels Award. The Gardels Award is presented each year to an individual who has made an outstanding contribution to advancing OGC’s vision of building the future of location with community and technology for the good of society.
Steve Liang, Professor and Rogers IoT Research Chair at University of Calgary and Founder and CTO of SensorUp, was selected as the 2022 recipient because of his pioneering work in sensor technologies in the geospatial context.
A member of the nominating committee noted that Steve was “the first within OGC to come up with a more RESTful API and thus one of the pioneers of leveraging web standards for publishing spatial data.” Another committee member noted that Steve has “been a promoter for OGC for some time and has done an exceptional job promoting the OGC to the IoT world. Steve does a lot internally as well as externally as part of this promotion.”
“We thank Steve for chairing the SensorThings SWG, the Sensor Web Enablement DWG, the CDB SWG, and the University DWG,” commented Jeffrey Harris, OGC Chair. “We also appreciate Steve’s efforts as the principal editor for the two SensorThings API Standards and for applying those Standards to numerous real-world installations in a variety of domains. Steve has been instrumental in forging strong links with the geospatial community across the globe. Steve’s skill in bridging communities exemplifies the values associated with the Gardels Award.”
In all this work, Steve exemplifies the highest values of OGC, and has demonstrated the principles, humility, and dedication in promoting spatial technologies to address the needs of humanity that characterized Kenn Gardels’ career and life.
About the OGC Gardels Award
The Kenneth D. Gardels Award is a gold medallion presented each year by the Board of Directors of the Open Geospatial Consortium, Inc. (OGC) to an individual who has made exemplary contributions to OGC’s consensus standards process. Award nominations are made by members – the prior Gardels Award winners – and approved by the Board of Directors. The Gardels Award was conceived to memorialize the spirit of a man who dreamt passionately of making the world a better place through open communication and the use of information technology to improve the quality of human life.
Kenneth Gardels, a founding member and a director of OGC, coined the phrase “Open GIS.” Kenn died of cancer in 1999 at the age of 44. He was active in popularizing the open source Geographic Information System (GIS) ‘GRASS’, and was a key figure in the Internet community of people who used and developed that software. Kenn was well known in the field of GIS and was involved over the years in many programs related to GIS and the environment. He was a respected GIS consultant to the State of California and to local and federal agencies, and frequently attended GIS conferences around the world.
Kenn is remembered for his principles, courage, and humility, and for his accomplishments in promoting spatial technologies as tools for preserving the environment and serving human needs.
More information on the OGC Gardels Award, including previous winners, can be found at: ogc.org/about/gardels-awards.
The post Steve Liang receives OGC’s 2022 Gardels Award appeared first on Open Geospatial Consortium.
-
14:51
Developers Invited to the 2023 Open Standards and Open Source Software Code Sprint
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC), the Apache Software Foundation (ASF), and the Open Source Geospatial Foundation (OSGeo) invite software developers to attend the annual Open Standards and Open Source Software Code Sprint. The hybrid virtual and in-person event will be hosted by Open Source service company Camptocamp SA in Bussigny, Switzerland (near Lausanne). The event will take place April 25-27, 2023. Participation is free and open to the public.
The OGC/ASF/OSGeo Open Standards and Open Source Software Code Sprint is a collaborative and inclusive event that aims to support the development of new applications and open standards, and encourage software developers to focus on projects that implement open geospatial standards.
The Code Sprint will cover multiple ASF and OSGeo projects, as well as related OGC Standards – including OGC API Standards. The Code Sprint is open to the general public, including those who are not active members of, or contributors to, the hosting organizations.
These annual code sprints experiment with emerging ideas in the context of geospatial standards, help improve interoperability of existing standards by experimenting with new extensions or profiles, and provide a space to learn more about open standards and open source software while actively building or enhancing software products that implement the standards.
Non-coding activities, such as testing, working on documentation, or reporting issues, are also welcomed at the Code Sprint. In addition, the Code Sprint’s mentor stream provides an opportunity for developers new to open standards and software to learn how to use the standards & projects and build an understanding that will serve them beyond the code sprint.
The Code Sprint is generously sponsored at the Gold Level by OGC Strategic Member Ordnance Survey. Catering is sponsored collectively by HEIG-VD (School of Engineering and Management), EPFL, University of Lausanne, State of Neuchâtel, State of Vaud, and Camptocamp.
Registration for in-person participation closes at 17:00 CEST on April 18. Registration for remote participation will remain open throughout the code sprint. Registration is available on The Code Sprint website.
A detailed schedule of this code sprint can be found on the event Wiki.
Event Sponsorship
Organizations are invited to sponsor the Code Sprint. A range of packages are available offering different opportunities for organizations to support the geospatial development community while promoting their products or services. Visit the Event Sponsorship page for more information.
To learn more about this, future, and previous OGC code sprints, visit the OGC Code Sprints webpage, the OGC developer events wiki, or join the OGC-Events Discord Server.
About The Apache Software Foundation
Since 1999, The Apache Software Foundation has been shepherding, developing, and incubating Open Source innovations “The Apache Way”. The ASF’s all-volunteer community comprising 816 individual Members and 8,500 Committers on six continents steward 227M+ lines of code, oversee 350+ Apache projects and their communities, and provide $22B+ worth of software to the public at 100% no cost.
About OSGeo
The Open Source Geospatial Foundation (OSGeo) is a not-for-profit organization whose mission is to foster global adoption of open geospatial technology by being an inclusive software foundation devoted to an open philosophy and participatory, community-driven development. Since the renewal of the OSGeo/OGC Memorandum of Understanding (MOU) in 2022, OSGeo operates as a Community Member in the OGC Technical Committee.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of experts from more than 500 businesses, government agencies, research organizations, and universities driven to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practice; collaborating on agile innovation initiatives; engaging in community meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
About Camptocamp
Camptocamp is an Open Source IT service company that is committed to implementing the most pragmatic and sustainable software solutions to allow its clients to thrive in the fast-paced technological landscape. As pioneers in applying Open Source technologies to support our customers’ digital strategies, Camptocamp is recognized for its expertise offering innovative Open Source solutions in the areas of Enterprise Resource Planning, IT Infrastructure Management, and Geographic Information Systems.
The post Developers Invited to the 2023 Open Standards and Open Source Software Code Sprint appeared first on Open Geospatial Consortium.
-
15:00
CAE upgrades OGC Membership to Principal Level
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC) is pleased to announce that CAE has upgraded its OGC membership from Voting Member to Principal level, demonstrating its continued commitment to establishing and maintaining consensus-based standards available to all.
As a Principal Member of OGC, CAE will participate across OGC activities and serve on OGC’s Planning Committee to help OGC advance geospatial interoperability and open systems.
“It’s great to see CAE upgrading their OGC membership to Principal Level,” said OGC CEO Dr. Nadine Alameh. “CAE has been a champion of OGC since its leadership in making Common DataBase (CDB) an OGC Standard. CAE has helped show the world how internationally-recognized open standards are critical for creating and maintaining the simulation-based synthetic environments used in geospatial applications, training, mission rehearsal, and decision-support. Leadership such as shown by CAE has helped OGC accelerate its impact in modeling & simulation, digital twins, and has set the stage for the OGC Geo for Metaverse Domain Working Group.”
“Synthetic Environments and Digital Twins provide us with a comprehensive understanding of the world around us and improve our ability to manage, plan and decide,” said Joe Armstrong, Vice President of Synthetic Environments and Immersive Technologies at CAE. “These technologies rely on the fundamental components of valid and accessible data, highlighting how crucial it is to implement standards on how data can be shared and distributed. OGC embodies the best practices of standard setting and interoperability.”
OGC Principal Members participate in final approval decisions for all OGC standards and nominations to the Board of Directors, and ensure OGC’s policies and procedures remain effective and agile in a changing technological environment.
As an OGC Principal Member, CAE will work alongside other world-leading Principal Member organizations looking to advance the location information industry, including Airbus Defence & Space, Amazon Web Services, Defence Science & Technology Laboratories (Dstl), South African Department of Agriculture, Land Reform and Rural Development (DALRRD), Indian Department of Science and Technology, Esri, Feng Chia University, Saudi Arabian General Authority for Survey and Geospatial Information (GASGI), Google, Hexagon, Maxar, Microsoft, Oracle USA, Trimble, US Army Geospatial Center, and the US Census Bureau. A list of all current OGC Members is available at ogc.org/ogc/members.
About CAE
At CAE, we equip people in critical roles with the expertise and solutions to create a safer world. As a technology company, we digitalize the physical world, deploying simulation training and critical operations support solutions. Above all else, we empower pilots, airlines, defence and security forces, and healthcare practitioners to perform at their best every day and when the stakes are the highest. Around the globe, we’re everywhere customers need us to be with more than 13,000 employees in more than 200 sites and training locations in over 40 countries. CAE represents 75 years of industry firsts—the highest-fidelity flight, future mission, and medical simulators, and personalized training programs powered by artificial intelligence. We’re investing our time and resources into building the next generation of cutting-edge, digitally immersive training and critical operations solutions while keeping positive environmental, social and governance (ESG) impact at the core of our mission. Today and tomorrow, we’ll make sure our customers are ready for the moments that matter.
Visit cae.com for more information.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of more than 550 experts representing industry, government, research and academia, collaborating to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practices; collaborating on problem solving in agile innovation initiatives; participating in member meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
The post CAE upgrades OGC Membership to Principal Level appeared first on Open Geospatial Consortium.
-
17:52
Ordnance Survey GB and Next Generation OGC API Standards
sur Open Geospatial Consortium (OGC)Contributed by: Michael Gordon, Senior Product Manager, Ordnance Survey
As the National Mapping Agency for Great Britain, Ordnance Survey (OS) works with a wide array of customers and partners across many industries and we know that our data, when combined with other datasets, often provides the solutions to many challenges. This means that our data, products, and services need to be interoperable, integration-ready and easy to use in a wide array of software. This is why we heavily use OGC Standards to publish our data and services and we have been a Strategic Member of the OGC for some time.
As the use of the internet in general and APIs in particular has exploded over the last 2 to 3 decades, we have placed a high priority on ensuring that developers can easily access authoritative geospatial data – especially mainstream developers who may not have the depth of geospatial knowledge of those directly in the geospatial industry.
At the same time, we were also aware that geospatial innovation would require changes to our large-scale authoritative data, remodelling and redesigning it to make it easier to use, allow faster, more agile development, and to make it more interoperable with modern software.
We wanted to pair this new, easier-to-use data with easy-to-use APIs, using mainstream technologies and encodings to access this next generation of data from OS. To achieve this we’ve been sponsoring OGC Collaborative Solutions and Innovation Program (formerly the OGC Innovation Program) work for several years – including previous Testbeds and multiple code sprints such as the 2022 Joint OGC OSGeo ASF Code Sprint; 2022 Joint OGC and ISO/TC 211 Metadata Code Sprint, and the 2022 OGC Web Mapping Code Sprint.
These all helped contribute – along with lots of hard work and contributions from other OGC members – to the publication of the OGC API – Features Standard along with the wider new generation of OGC API Standards.
We launched the OS NGD API – Features product at the end of September 2022, covering the core standard as well as the Coordinate Reference System (CRS) by Reference and Common Query Language (CQL) modules, providing access to our next generation of data along with powerful filtering capabilities to allow our customers to select just the data they need. Combining this powerful functionality with the new, richer, easier-to-understand data, as well as a daily update cycle, means our customers and partners can power their applications and innovate at pace with rich and consistent information about the real world to improve decision-making.
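As a rough illustration of what this looks like for a client, the Python sketch below queries an OGC API – Features endpoint using the standard /collections/{collectionId}/items path with a CQL2 text filter and a requested output CRS. The base URL, collection id, property name, and API-key parameter are hypothetical placeholders and are not taken from OS documentation; the OS Data Hub describes the actual endpoints and authentication.

```python
import requests

# Placeholder values: substitute the real OS NGD API - Features base URL,
# a collection id, and an OS Data Hub API key.
BASE_URL = "https://example.org/features/v1"
COLLECTION = "example-collection"
API_KEY = "YOUR_API_KEY"

params = {
    # CQL2 text filter (OGC API - Features Part 3); the property name is illustrative.
    "filter": "description = 'Building'",
    "filter-lang": "cql2-text",
    # Ask for coordinates in British National Grid (Part 2: CRS by Reference).
    "crs": "http://www.opengis.net/def/crs/EPSG/0/27700",
    "limit": 10,
    "key": API_KEY,  # illustrative authentication mechanism
}
resp = requests.get(f"{BASE_URL}/collections/{COLLECTION}/items", params=params, timeout=30)
resp.raise_for_status()

for feature in resp.json().get("features", []):
    print(feature.get("id"), feature["properties"].get("description"))
```

The same request pattern works against any conformant OGC API – Features service; only the filterable property names and the authentication mechanism are service-specific.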
We’re currently building an OGC API – Tiles based product for launch in 2023 that will provide contextual basemap information to complement the OS NGD API – Features. We look forward to continuing to collaborate with OGC and its members to advance the OGC API Standards family and bring both ease-of-use and powerful functionality to authoritative geospatial data.
The OS NGD API – Features is available via the OS Data Hub, along with a variety of other OGC Standards-based products and services via API and download.
The post Ordnance Survey GB and Next Generation OGC API Standards appeared first on Open Geospatial Consortium.
-
15:00
OGC seeks Public Comment on v1.1 of OGC API – Environment Data Retrieval Standard
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC) seeks public comment on the candidate Version 1.1 of the OGC API – Environmental Data Retrieval (EDR) Standard. Comments are due by February 27, 2023.
The OGC API – EDR Standard provides a family of lightweight query interfaces to access spatiotemporal data resources by requesting data at a Position, within an Area, along a Trajectory, or through a Corridor. An API compliant with the OGC API – EDR Standard will return only the data needed by the user or client, reducing data transfer time and costs.
The OGC API – EDR Standard makes it easier to efficiently access a wide range of geospatial ‘big data’ through a uniform, well-defined, simple Web interface that shields the user from the complexities of data storage. An example use case of the Standard could be to retrieve, say, weather forecasts for a local area from a much larger national or global forecast dataset – though many other types of data can be accessed through the API.
By defining a small set of query patterns (and no requirement to implement all of them), OGC API – EDR helps to simplify the design and performance-tuning of systems, making it easier to build robust, scalable infrastructures.
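To make these query patterns concrete, the following hypothetical Python sketch issues a Position query against an EDR collection. The coords (Well-Known Text), parameter-name, and datetime query parameters are defined by the EDR Standard; the server URL, collection id, and parameter name are placeholders.

```python
import requests

# Hypothetical EDR deployment and collection id.
EDR_ROOT = "https://example.org/edr"
COLLECTION = "global-forecast"

params = {
    # Well-Known Text point for the location of interest (lon lat).
    "coords": "POINT(-1.47 50.93)",
    # Which parameter(s) to return; the name is illustrative.
    "parameter-name": "air_temperature",
    # Time instant or interval in RFC 3339 form.
    "datetime": "2023-02-01T00:00:00Z/2023-02-02T00:00:00Z",
    "f": "json",
}
resp = requests.get(f"{EDR_ROOT}/collections/{COLLECTION}/position", params=params, timeout=60)
resp.raise_for_status()
data = resp.json()  # often CoverageJSON for gridded sources
print(list(data.keys()))
```

Area, Trajectory, and Corridor queries follow the same pattern, swapping the path segment and supplying a polygon, linestring, or corridor geometry in the coords parameter.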
Version 1.1 of the OGC API – EDR Standard brings the following improvements:
- The optional use of HTTP POST as well as the HTTP GET verb. This allows longer queries, such as when specifying a complicated, detailed area with very many points. Because the query is carried in the request body rather than the URL, it is also kept out of server logs and, when the connection uses HTTPS, encrypted in transit.
- Custom dimensions are allowed. As well as the usual four dimensions of space and time (x, y, z, t), other dimensions, such as frequency, can be offered by a data collection. The dimensions can also be categorical, that is, not necessarily continuous like space and time: for example, a list of wave or frequency bands, or any enumerated type. A meteorological example would be allowing data retrieval from individual ensemble members.
To learn more about how the family of OGC API Standards work together to provide modular “building blocks for location” that address both simple and the most complex use-cases, visit ogcapi.org.
OGC Members interested in staying up to date on the progress of this standard, or contributing to its development, are encouraged to join the Standards Working Group via the OGC Portal.
The candidate OGC API – Environmental Data Retrieval v1.1 standard is available for review and comment on the OGC Portal. Comments are due by February 27, 2023, and should be submitted via the method outlined below.
To Comment:
Comments can be submitted to a dedicated email reflector for a thirty-day period ending on the “Close request date” listed above. Comments received will be consolidated and reviewed by OGC members for incorporation into the document. Please submit your comments using the following link: Click here to submit comments.
Please refer to the following template for the message body: Comments Template.
Subscribe to Comments:
You may wish to be added to the distribution list to receive comments as they are submitted. Subscribe to Distribution List. Subscribing to the list will also allow you to view comments already received, which can be found in the List Archives.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of more than 550 experts representing industry, government, research and academia, collaborating to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practices; collaborating on problem solving in agile innovation initiatives; participating in member meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
The post OGC seeks Public Comment on v1.1 of OGC API – Environment Data Retrieval Standard appeared first on Open Geospatial Consortium.
-
15:00
OGC to form new GeoParquet Standards Working Group; Public Comment sought on Draft Charter
sur Open Geospatial Consortium (OGC)Contact: info@ogc.org
25 January 2023: The Open Geospatial Consortium (OGC) is in the process of forming a new GeoParquet Standards Working Group (SWG). Public comment is sought on its draft charter. Comments are due by 15 February, 2023.
The OGC GeoParquet SWG will work to advance the GeoParquet encoding format to an OGC Encoding Standard for cloud-native vector data. GeoParquet adds geospatial types to Apache Parquet, described by Apache as “an open source, column-oriented data file format designed for efficient data storage and retrieval. It provides efficient data compression and encoding schemes with enhanced performance to handle complex data in bulk.” For an introduction to the GeoParquet format, see this blog post.
GeoParquet started 3 years ago as a community effort by different Open Source Projects and organizations that have committed to its implementation and support.
OGC is advancing a number of Standards to enable cloud-native geospatial ecosystems. GeoParquet fits in the group of data encoding Standards that are highly performant for large, cloud-based data stores, such as Cloud Optimized GeoTIFF for tiled rasters and Zarr for datacubes. GeoParquet will, in time, enable vector datasets to be as readily accessible from the cloud as the other formats already well-used in the community.
The GeoParquet SWG will take the initial efforts incubated in OGC’s GeoParquet GitHub repository as a draft specification from which a candidate Standard will be developed. As with many other recent OGC Standards, the repository will remain open to contributions from outside OGC and documentation will evolve in concert with prototype implementations.
GeoParquet will be another encoding of the OGC Simple Features Standard, and as such will handle all Simple Feature geometries. While other OGC Standards also encode Simple Features, GeoParquet is intended to be optimized for native use in cloud environments. It is expected that GeoParquet will be tested as an encoding to be accessed by the OGC APIs. The proposed SWG expects to have a candidate Standard ready for review and approval within one year of creation of the SWG.
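As a practical illustration of the encoding under discussion, recent GeoPandas releases can already read and write GeoParquet files when pyarrow is installed. The minimal sketch below round-trips a small dataset; the file name and contents are purely illustrative.

```python
import geopandas as gpd
from shapely.geometry import Point

# Build a tiny GeoDataFrame (illustrative data only).
gdf = gpd.GeoDataFrame(
    {"name": ["Frascati", "Lausanne"]},
    geometry=[Point(12.681, 41.806), Point(6.633, 46.520)],
    crs="EPSG:4326",
)

# Write and read GeoParquet (requires pyarrow to be installed).
gdf.to_parquet("places.parquet")
roundtrip = gpd.read_parquet("places.parquet")
print(roundtrip.crs, len(roundtrip))
```

Because the geometries and their CRS travel inside the Parquet file’s column metadata, the same file can be consumed by any Parquet-aware tooling while remaining fully usable as vector geospatial data.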
The draft charter for the GeoParquet Standards Working Group is available for review and comment on the OGC Portal. Comments are due by 15 February, 2023, and should be submitted via the method outlined on the GeoParquet SWG draft charter public comment request page.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of more than 550 experts representing industry, government, research and academia, collaborating to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practices; collaborating on problem solving in agile innovation initiatives; participating in member meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
The post OGC to form new GeoParquet Standards Working Group; Public Comment sought on Draft Charter appeared first on Open Geospatial Consortium.
-
16:19
OGC Announces new Geo for Metaverse Domain Working Group
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC) is excited to announce the formation of the OGC Geo For Metaverse Domain Working Group (DWG), which will serve as a forum for the collective geospatial expertise of the OGC community to gather to help build and grow the open Metaverse. The group is open to OGC Members and non-members alike.
The Metaverse is perhaps the ultimate distributed digital twin of the world. It has the potential to represent everything in the world alongside imagined spaces. The challenges to Standards Development Organizations (SDOs), technologists, artists, and society are huge, but the payoff is believed to be equally tremendous. The Metaverse is not a single thing but, like the internet, is a collection of platforms and technologies: a world of objects that can be navigated and interacted with.
Everything OGC does can be applied to the Metaverse. Our community can contribute expertise in 3D, Modeling & Simulation, Artificial Intelligence, Digital Twins, streaming, Augmented and Virtual Realities, routing, mapping, and more – all at scale.
The OGC Geo For Metaverse DWG will work on pieces of the Metaverse that pertain to geospatial applications and Standards by identifying standardization activities and best practices based on FAIR data principles (making data Findable, Accessible, Interoperable, and Reusable). Given that the Metaverse will be an evolutionary development, the working group will identify both near- and long-term goals that will help ensure interoperability, FAIRness, and openness. Much of the Metaverse is already happening, so collaboration with other partners will be key to its success and will be a grounding principle of this OGC DWG. Specifically, the DWG will be the primary point of contact with the Metaverse Standards Forum, of which OGC is a founding and Principal member.
3D geospatially anchored data is powering a revolution across a range of industries. This same data – currently relied upon for construction of the real world – is now driving the creation of virtual/digital worlds that will form parts of the Metaverse.
For the Metaverse to succeed, however, all digital and physical world information will have to work in concert at scale. We have a collective responsibility to ensure that the shared future is FAIR and Open. OGC has always focused on interoperability and open Standards – both of which are key to ensuring an open Metaverse. Working together, we can have a positive impact on this future.
Join OGC for a conversation about how our standards and innovations contribute to the metaverse – from real-time 3D, to digital twins, to Augmented Reality, GeoPose, and more. OGC’s 125th Member Meeting (February 20-24, 2023, virtual and in-person at ESA’s ESRIN in Frascati, Italy) will include the inaugural meeting of the OGC Geo For Metaverse DWG. Learn more and register at meet.ogc.org.
For a primer on the Metaverse and the role that OGC and the Geospatial Industry can play, see the blog post ‘The Metaverse is Geospatial’ on ogc.org.
OGC members and non-members alike who are actively working within the Metaverse and related communities are invited to participate in the Geo For Metaverse Domain Working Group. OGC Members can join the OGC Geo For Metaverse Domain Working Group using the following Portal Page. OGC DWGs are open to the public; non-OGC Members can join the OGC Geo For Metaverse DWG by signing up to the mailing list here.
Learn more about the Mission, Goals, and Planned Activities of the DWG on the OGC Geo For Metaverse DWG homepage.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of more than 550 experts representing industry, government, research and academia, collaborating to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practices; collaborating on problem solving in agile innovation initiatives; participating in member meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
The post OGC Announces new Geo for Metaverse Domain Working Group appeared first on Open Geospatial Consortium.
-
16:10
OGC Calls For Participation in its next Pilot to improve Disaster Management and Response
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC) is Calling For Participants in the OGC Disaster Pilot 2023. Funding is available. Responses are due by February 17, 2023. A Q&A Session will be held at 10am US Eastern on February 1.
The goal of the Disaster Pilot 2023 is to further improve the ability of key decision makers and responders to discover, manage, access, qualify, share, and exploit location-based information in support of disaster preparedness & response as well as the full cycle of multi-hazard disaster management.
The OGC Disaster Pilot 2021 and 2022 initiatives included more than 50 individuals as participants and sponsors, and more than 150 stakeholder representatives, who collaboratively produced User and Provider Readiness Guides, multiple Engineering Reports, and a large number of workshops, webinars, and other outreach events. These Disaster Readiness Guides focused on flooding and landslide events, and now the 2023 Pilot seeks to update them to cover wildfires and drought.
The Pilot provides a unique opportunity for Participants to work jointly with the full range of disaster stakeholders – from EO data providers and relief organizations to field responders – towards the goal of applying standards-based geospatial IT solutions to real problems of marshaling coordinated, effective responses to complex disaster scenarios. Key technologies that will support this goal include Artificial Intelligence (GeoAI), Analysis Ready Data (ARD), Routing, APIs, Cloud Computing, Decision Ready Indicators (DRI) and more.
The outcomes are expected to shape the future of disaster management ecosystems through user-centric interoperability arrangements, identification of critical data sharing challenges, and the delivery of cloud computing scale and agility to field personnel and relief organizations when and where they need it. Pilot sponsorship supports this vision with cost-sharing funds to largely offset the costs associated with development, engineering, and demonstration of these outcomes. This offers selected Participants a unique opportunity to recoup a significant portion of their initiative expenses.
Disaster management efforts often fall short because collaborative workflows are not put into place until disaster has already struck, by which time it is too late to achieve fully informed decision making. The space-based observations, scientific insight, and cloud computing capabilities are there, but their benefits are not reaching the right people, in the right form, at the right time, and for the places where they are needed.
With the OGC Disaster Pilot 2023, the disaster management stakeholder community, together with the wider public, have the opportunity to bridge the divide between data and decisions. By bringing together the spatial data infrastructure (SDI), data sharing, and collaboration puzzle pieces that connect people and organizations – from the data providers all the way to and from first responders, decision makers, and everyone in between – the Pilot will help form a coordinated information ecosystem that is ready to adapt to any disaster, any region, any combination of data sources and tools, any expert insight, and any dire need.
OGC and its industry, government, and academic members make it easier and faster to fit those puzzle pieces into a functional and agile pattern. OGC Initiatives such as these Disaster Pilots enable complex combinations of technological, architectural, standardization, and operational requirements, along with community perspectives, to be rigorously prototyped, flight-tested, shared, and documented.
The OGC Disaster Pilot 2023 will be conducted under OGC’s Collaborative Solutions and Innovation (COSI) Program (formerly the Innovation Program), a collaborative, agile, and hands-on prototyping and engineering environment where sponsors and OGC members come together to address location interoperability challenges while validating international open standards. To learn about the benefits of sponsoring an OGC COSI Program Initiative such as this, visit the OGC COSI Program webpage, or watch this short video on how OGC’s COSI Program can benefit your organization.
Watch this video for an overview of some of OGC’s work in Disasters and Climate Change.
OGC is grateful for the industry leadership and investment of OGC Strategic Members NRCan, NASA, USGS, and FGDC as displayed by their sponsorship of this OGC COSI Program Initiative.
If you are interested in participating in this critical opportunity to shape collaborative disaster management capabilities, visit the Disaster Pilot 2023 page on ogc.org to download the Call For Participation or learn how to attend the Q&A Session on February 1. Responses are due by February 17, 2023.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of more than 550 experts representing industry, government, research and academia, collaborating to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practices; collaborating on problem solving in agile innovation initiatives; participating in member meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
The post OGC Calls For Participation in its next Pilot to improve Disaster Management and Response appeared first on Open Geospatial Consortium.
-
18:40
Principled and Powerful: Geospatial data for ESG reporting at Location Powers 2022
sur Open Geospatial Consortium (OGC)There was a principled and powerful representation in attendance at OGC’s Location Powers event in London last November. Ordnance Survey hosted the event at the Geovation Hub with 23 speakers over two days. Representatives from across the finance, transport/logistics, and construction/maintenance markets described their organizations’ approach to Environmental, Social, and (corporate) Governance (ESG) reporting and how it benefits from geospatial technologies. The event highlighted the need to share location information easily with many stakeholders.
“This Location Powers event has been eye-opening in understanding how financial experts and geospatial practitioners can look at the same data in very different ways,” said Scott Simmons, OGC’s Chief Standards Officer. “Working from the perspective of each will help us identify the right targets for standardization and development of best practices in the use of geospatial data to support accurate and comparable ESG metrics.”
While nation states are working towards their United Nations (UN) Sustainable Development Goals for 2030, corporations are being incentivised to work across international borders and throughout their supply chain in an effort to reduce carbon emissions and mitigate and adapt to climate change risks. Self-regulation has not been effective and new EU legislation, due in June 2023, could change the economic landscape not just in Europe, but globally. Geospatial’s role will be crucial in this effort.
Allan Jamieson, Data Standards and Governance, Ordnance Survey (OS) opens Location Powers 2022.
A corporation’s voluntary self-assessment is the current source of most ESG reports, published as stand-alone documents or alongside the annual report. The information supplied can be variable, follow differing standards, be overly optimistic, or even take a cynical approach that ‘green-washes’ the corporation’s actions. For international corporations, reassuring stakeholders and investors that their business model can extend into the future requires more diverse data to be included in ESG reporting.
For each of the data topics – environment, social, and (corporate) governance – there is a location aspect. Environmental data can be derived from Earth Observation (EO) and remote sensing sources. Social data can be sourced from census, social media, and on-the-ground news reports. And both the Environmental and Social aspects are directly reflected in governance. What regulatory authorities and other organizations are searching for is authoritative, consistent, and reliable data from which they can verify their own datasets against the validity of corporations and their ESG reports.
Read on to learn what was discussed, or access recordings and slides from the event on the OGC Location Powers 2022 website.
Two Days of Insights
Attendees meet during the networking reception at the Location Powers event
The Location Powers event provided attendees with two days of focused content: Day 1 covered the challenges facing the various industry sectors, and Day 2 highlighted the geospatial sector’s response to those challenges. Each day included expert presentations with Q&A, breakout sessions on various sector topics, and an end-of-day expert panel to share outcomes and insights from the breakouts. There were plenty of opportunities to network with other attendees and speakers during the breaks, over an enjoyable lunch, and at an evening drinks reception.
Day One
After introductions, Richard Peers, Founder, ResponsibleRisk kicked off the event with a detailed overview of ESG and the various standards available in the context of COP27, including the Task Force on Climate-related Financial Disclosures (TCFD), which serves to mitigate, not just deal with, outcomes, and the Taskforce on Nature-related Financial Disclosures (TNFD), which describes governance and strategy.
Yasmin Raza, ESG Market Intelligence & Engagement Team Manager, Financial Conduct Authority (FCA) spoke specifically about how the finance sector in the UK plans to move to become the NetZero world capital of finance. The FCA will introduce three key labels to help consumers distinguish between financial products: Sustainable Focus, Sustainable Improvers, Sustainable Impact.
David J. Patterson, Head of Conservation Intelligence, World Wide Fund for Nature (WWF-UK) highlighted the difficulties in accessing up-to-date data on biodiversity at a suitable resolution for analysis when accuracy of data sets was hard to determine.
Andrew Coote, Director, ConsultingWhere described the implementation of the UN Integrated Geospatial Information Framework (IGIF) aimed at developing nations and how the UN was supporting them to move from country-level guidance to strategic localized action plans.
Location Powers 2022 has begun! OGC Chief Standards Officer Scott Simmons is opening with an overview of OGC and geospatial in #ESG reporting #OGCLP22 pic.twitter.com/bIvUkQY6YX
— Open Geospatial: OGC (@opengeospatial) November 15, 2022
Franca Wolf, Senior Analyst, Verisk Maplecroft expanded on this theme further by talking about the geospatial risk exposure to climate change for investors. Corporations want to be able to forecast ahead and anticipate their ESG rating to mitigate their exposure to future risks.
Scott James, Partner, Ward Williams Associates (WWA) brought us back to the ESG reporting reality for most businesses: how do you find the data you are looking for and how can you document the good that you are doing? He described the WWA journey through the lens of becoming B-Corp Certified and the overall benefit of attracting new talent and customers.
Daniel Barlow, Innovation Policy at British Standards Institution (BSI) presented remotely as he dialed in from COP27 to relay the new ISO NetZero Guidelines for industry.
A panel discussion on “Future trends in ESG business models from banking, logistics, and construction” included: Michael Groves, CEO, Topolytics; Mariam Crichton, CEO, 7 Satya; Jen Dixon, Business Analyst & Ethics Advisory Group, Esri UK; and was hosted by Donna Lyndsay, Strategic Market Lead – Environment and Sustainability, Ordnance Survey.
The panelists raised topics including ethics in data collection, human rights, geospatial data not being instant, and the cost and reduction of energy consumption. The most important discussion concerned the question: “does the C-Suite even care about ESG? And how do we get them to care, especially when this may highlight uncomfortable human rights practices in the supply chain?”
The conversation went on to discuss how and when to learn from ethicists, and even ventured into the role that western and eastern philosophies could play. ESG reporting has been voluntary so far, but international corporations are soon to be brought under regulatory scrutiny, which will require a change in mindset to adapt to new market requirements.
(L-R) Donna Lyndsay hosts the Day 1 panel consisting of Michael Groves, Mariam Crichton and Jen Dixon at OGC Location Powers 2022.
Day Two
The first speaker for the second day was Ed Parsons, Google’s Geographer. Ed talked about how the geospatial sector is responding to the sheer volume of geospatial data being collected that needs to be evaluated and audited. Google Maps now offers route options that release fewer GHG emissions through better fuel efficiency. Google Environmental Insights Explorer offers Machine Learning (ML) capabilities to city and regional authorities.
Simon Casey, Channel Sales Manager, Satellite Vu drew our attention to new developments in Thermal Infrared (TIR) sensors in terms of resolution and high revisit rates (10-20 per day), and how these have enabled the detection of small zones releasing excessive thermal energy, such as vessels or wildfires.
Scott Simmons, Chief Standards Officer, OGC, compared and contrasted the top ESG rating agencies and their ESG indicator definitions. He engaged the participants in a discussion of which data contributing to ESG ratings could or should be standardized, and which lies outside the realm of geospatial standards.
Mattie Yeta, Chief Sustainability Officer, CGI described how her organization assessed its ESG commitments to develop its own system of ESG reporting, covering data centers, travel, and natural capital. CGI is working with the UN on Sustainability Exploration Environmental Data Science (SEEDS).
Ali Nicholl, Founder & Engagement, IOTICS raised the big ESG data challenge that needs to be resolved by collaboration and cooperation through digital ecosystems.
Will Cadell, CEO, SparkGeo and Oliver Morris, Account Manager, Tensing, both led a breakout session on Mixing company fixed asset data with geospatial data at OGC Location Powers yesterday. #Climate #OGCLP22 #ESG #SDG #ESGs #SDGs #GISday @opengeospatial @OrdnanceSurvey pic.twitter.com/ouaQmxOvlO
— Caroline Robinson (@OGC_CarolineR) November 17, 2022
Allan Jamieson, Data Standards and Governance, Ordnance Survey (OS) highlighted the importance of authoritative data, and building trust for the validation of ESG reporting.
Olive Powell, Head of Geography & Geospatial and Charlie Dacke, Head of Geospatial Technology & Standards, Office for National Statistics (ONS) presented the challenges of providing statistics for the public good while maintaining the privacy of individuals by using the Reference Data Management Framework (RDMF).
Andrea Santiago, Subdirector, National Institute of Statistics and Geography of Mexico (INEGI), joined us remotely to present on using the Locus Charter to balance privacy and analytic power.
Group discussion from breakout sessions and closing remarks hosted by Scott Simmons focused on next steps. The panel consisted of: David Philp, Director – Digital Consulting, Strategy & Innovation – Europe, Digital AECOM; Marzia Bolpagni, Head of BIM International – Associate Director, MACE; Oliver Morris, Account Manager, Tensing; and Ian Prentice, Business Development Manager, Carto.
The panelists noted that, when it comes to customers finding and choosing datasets or processing models, there is a competitive advantage for geospatial businesses in making a range of tasks self-service. The geospatial community has the technology and data, but needs to convey the trustworthiness of these data and services by making methodology more transparent. Fortunately, there is an increasing level of trust in publicly released datasets, and the Gemini Principles should be considered as a good way to build trust further.
As Allan Jamieson, Data Standards and Governance, Ordnance Survey (OS), said during the event: “By providing authoritative geospatial data, trust will be built in the validation of ESG reporting.”
(L-R) Scott Simmons hosts the Day 2 panel consisting of David Philp, Marzia Bolpagni, Oliver Morris, and Ian Prentice.
All slides and recordings from OGC Location Powers on ESG Reporting are publicly available on the OGC Location Powers 2022 website. The full results of the OGC Location Powers event will be released as a white paper in Q1 2023.
Upcoming events
IN-PERSON MEETING
April 17 – April 18, 2023
INTERPRAEVENT 2023 International Symposium
The INTERPRAEVENT 2023 International Symposium will be held on 17-18 April 2023 in Taichung, Taiwan. The main topic of the symposium, “Natural Disasters Occurrence, Reduction, and Restoration in Mountain Regions”, focuses on natural hazards from the aspects of investigating the phenomena, monitoring occurrences, analyzing risks, strengthening governance, increasing resilience, mitigating disasters, restoring impacted communities, and enhancing preparedness.
Community event
HYBRID MEETING
April 23 – April 28, 2023
EGU General Assembly 2023
The EGU General Assembly 2023 brings together geoscientists from all over the world to one meeting covering all disciplines of the Earth, planetary, and space sciences. The EGU aims to provide a forum where scientists, especially early career researchers, can present their work and discuss their ideas with experts in all fields of geoscience.
Community event
SPRINT (HYBRID)
April 25 – April 27, 2023
Open Standards and Open Source Software 2023 Joint OGC-ASF-OSGeo Code Sprint
A Code Sprint is an event where dozens of developers from around the world come together to code and share their ideas. The main goals of this code sprint are to support the development of open standards for geospatial information and to support the development of free and open source software which implements those standards, as well as creating awareness about the standards and software projects. This code sprint will take place at Camptocamp’s offices (near Lausanne, Switzerland) and on the OGC’s Discord events server. It is organised by the Open Geospatial Consortium (OGC), the Open Source Geospatial Foundation (OSGeo), and the Apache Software Foundation (ASF), and it is hosted by Camptocamp. You can find more details about the code sprint on the event’s wiki page.
OGC event
The post Principled and Powerful: Geospatial data for ESG reporting at Location Powers 2022 appeared first on Open Geospatial Consortium.
-
17:51
OGC Compliance Certification Available for v1.0 of the OGC API – Environmental Data Retrieval Standard
sur Open Geospatial Consortium (OGC)The Open Geospatial Consortium (OGC) is excited to announce that the Executable Test Suite for version 1.0 of the OGC API – Environmental Data Retrieval Standard has been approved by the OGC Membership. Products that implement the Standard and pass the tests in the ETS can now be certified as OGC Compliant.
Implementers of the Standard are invited to validate their products using the new test suite, available on the OGC validator tool.
The OGC API – EDR Standard provides a family of lightweight query interfaces to access spatio-temporal data resources by requesting data at a Position, within an Area, along a Trajectory, or through a Corridor. The API then returns only the data needed by the user or client, reducing data transfer time and costs.
The OGC API – EDR Standard makes it easier to access a wide range of geospatial data through a uniform, well-defined, simple Web interface that shields the user from the complexities of data storage. An example use case of the Standard could be to retrieve, say, weather forecasts for a local area from a much larger national or global forecast dataset – though many other types of data can be accessed through the API.
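To make the query pattern concrete, here is a minimal sketch of a position query against a hypothetical EDR endpoint. The base URL, collection identifier, parameter name, and output format are illustrative assumptions, not part of the Standard; the /position path and the coords, parameter-name, datetime, and f query parameters follow the Standard's query definitions.

```python
import requests

BASE = "https://example.org/edr"        # hypothetical EDR server, for illustration only
COLLECTION = "weather_forecast"         # hypothetical collection identifier

# Request forecast values at a single point (WGS84 lon/lat expressed as WKT),
# limited to one parameter and a 24-hour window, so only the needed data is returned.
response = requests.get(
    f"{BASE}/collections/{COLLECTION}/position",
    params={
        "coords": "POINT(103.85 1.29)",                         # longitude latitude
        "parameter-name": "air_temperature",                    # assumed parameter name
        "datetime": "2023-01-10T00:00:00Z/2023-01-11T00:00:00Z",
        "f": "CoverageJSON",                                    # supported formats vary by server
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```

The same pattern applies to area, trajectory, and corridor queries, with coords carrying the corresponding WKT geometry.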
Compliance Testing for OGC API – EDR involves submitting the web address of an Application Programming Interface (API) that implements the Standard. These tests typically take only 5-10 minutes to complete. Once a product has passed the test, the implementer can submit an application to OGC for use of the Certified OGC Compliant trademark on the product and associated marketing materials.
To support developers of products that implement the Standard, version 0.9.0 of the pygeoapi product has been designated a reference implementation of the Standard after the software successfully passed the compliance tests.
The OGC Compliance Program is a certification process that ensures organizations’ solutions are compliant with OGC Standards. It is a universal credential that allows agencies, industry, and academia to better integrate their solutions. OGC compliance provides confidence that a product will seamlessly integrate with other compliant solutions regardless of the vendor that created them.
To learn more about how the family of OGC API Standards work together to provide modular “building blocks for location” that address both simple and the most complex use-cases, visit ogcapi.org.
More information about the OGC compliance process is available at ogc.org/compliance. Various developer resources are available on the OGC API website to support implementers of OGC API – EDR Standard. Implementers of the Standard can validate their products using the OGC validator tool.
About OGC
The Open Geospatial Consortium (OGC) is a collective problem-solving community of more than 550 experts representing industry, government, research and academia, collaborating to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable.
The global OGC Community engages in a mix of activities related to location-based technologies: developing consensus-based open standards and best-practices; collaborating on problem solving in agile innovation initiatives; participating in member meetings, events, and workshops; and more.
OGC’s unique standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.
OGC bridges disparate sectors, domains, and technology trends, and encourages the cross-pollination of ideas between different communities of practice to improve decision-making at all levels. OGC is committed to creating an inclusive and sustainable future.
Visit ogc.org for more info on our work.
The post OGC Compliance Certification Available for v1.0 of the OGC API – Environmental Data Retrieval Standard appeared first on Open Geospatial Consortium.
-
18:00
Building the Building Blocks for the Future of Location: The November 2022 OGC Web Mapping Code Sprint
sur Open Geospatial Consortium (OGC)The mechanisms through which maps are delivered across the Internet have evolved significantly over the past two decades. Advancement of such mechanisms has been driven by a combination of factors: new data formats have emerged, the SWaP-C (size, weight, power, and cost) of devices has improved, and the capabilities of web browsers have been enhanced by improvements brought by HTML5. This means that functionality that web mapping applications previously could not implement in a standardized way is now becoming increasingly common.
To support the development of OGC API Standards, the building blocks for location that standardize many of the new capabilities available to web mapping applications, the Open Geospatial Consortium (OGC) and EuroGeographics hosted the 2022 Web Mapping Code Sprint from November 29th to December 1st, 2022. The event was sponsored by OGC Strategic Member, Ordnance Survey, and was held as a hybrid event, consisting of a virtual element hosted on the OGC’s Discord environment alongside an in-person element hosted by EuroGeographics in Brussels, Belgium.
Code Sprints experiment with emerging ideas in the context of geospatial Standards, help improve interoperability of existing Standards by experimenting with new extensions or profiles, and are used for building proofs-of-concept to support standards-development activities and the enhancement of software products. Non-coding activities such as testing, working on documentation, or reporting issues are also conducted during a code sprint. In addition, the code sprints’ mentor stream provides an excellent opportunity to onboard developers new to the Standards.
The 2022 Web Mapping Code Sprint focused on the following:
- OGC API – Tiles Standard: This Standard describes API building blocks that can enable implementations to serve map tiles, vector tiles (tiled feature data) or tiled coverage data.
- OGC API – Maps candidate Standard: This candidate Standard describes API building blocks that can enable implementations to serve spatially referenced and dynamically rendered electronic maps.
- OGC API – Styles candidate Standard: This candidate Standard describes API building blocks that can enable implementations to manage and fetch styles that consist of symbolizing instructions that can be applied by a rendering engine to features and/or coverages.
- Other: Styles & Symbology Encodings (e.g., SLD, SymCore, etc.)
The mentor stream of the code sprint featured two tutorials on understanding and using one server-side and one client-side implementation of OGC API – Tiles. It also included two onboarding sessions focused on collaborating in software projects that implement the Standards.
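By way of illustration, the following minimal client-side sketch retrieves a single tile from a server implementing OGC API – Tiles. The server URL, collection identifier, and tile address are assumptions for this example; the path template and the WebMercatorQuad tile matrix set identifier come from the Standard and the OGC tile matrix set registry.

```python
import requests

BASE = "https://example.org/ogcapi"     # hypothetical server, for illustration only
COLLECTION = "buildings"                # hypothetical collection identifier
TMS = "WebMercatorQuad"                 # commonly supported tile matrix set
z, row, col = 12, 1410, 3295            # arbitrary tile address for the sketch

# Path template defined by OGC API - Tiles:
#   /collections/{collectionId}/tiles/{tileMatrixSetId}/{tileMatrix}/{tileRow}/{tileCol}
url = f"{BASE}/collections/{COLLECTION}/tiles/{TMS}/{z}/{row}/{col}"

# Ask for a vector tile; servers may also offer PNG or JPEG map tiles.
response = requests.get(
    url,
    headers={"Accept": "application/vnd.mapbox-vector-tile"},
    timeout=30,
)
response.raise_for_status()
with open("tile.mvt", "wb") as f:
    f.write(response.content)
```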
The code sprint successfully facilitated the development and testing of prototype implementations of OGC API Standards, including candidate Standards, that relate to web mapping. Further, the code sprint provided a foundation for the development of the next version of the Symbology Core Standard. Participants were able to provide feedback directly to the editors of the Standards, the editors were able to clarify any issues encountered by the sprint participants, and the sprint raised awareness of the Standards. The code sprint therefore met all of its objectives.
OGC is an international consortium of more than 500 businesses, government agencies, research organizations, and universities driven to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable. The consortium consists of Standards Working Groups (SWGs) that have responsibility for designing a candidate Standard prior to approval as an OGC Standard and for making revisions to an existing OGC Standard. The sprint objectives for the SWGs were to:
- Create awareness about OGC Standards;
- Develop prototype implementations of OGC Standards, including implementations of draft OGC Application Programming Interface (API) Standards;
- Test the prototype implementations;
- Provide feedback to the Editor about what worked and what did not; and
- Provide feedback about the Standards and candidate Standards.
EuroGeographics is a not-for-profit organization that represents many of the National Mapping, Cadastral and Land Registration Authorities across Europe. The organization facilitates access to data, services, and expertise, as well as supporting the sharing of knowledge across the continent. The organization also publishes a product called Open Maps for Europe, which provided a useful resource for sprint participants. For example, within the first day of the code sprint, the sprint participants had implemented an OGC API – Maps façade in front of a Web Map Service (WMS) that was serving maps from the Open Maps for Europe product.
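As a sketch of the translation such a façade performs, the hypothetical helper below maps OGC API – Maps style query parameters onto a classic WMS 1.3.0 GetMap request. The upstream WMS endpoint and layer name are invented, and the parameter names on the OGC API side reflect the draft candidate Standard, so they may change before adoption.

```python
from urllib.parse import urlencode

WMS_ENDPOINT = "https://example.org/wms"   # hypothetical upstream WMS, for illustration
LAYER = "OME_background"                   # hypothetical layer name

def maps_request_to_getmap(bbox, width, height, crs="CRS:84", fmt="image/png"):
    """Translate (draft) OGC API - Maps style parameters, e.g. an incoming request like
    /collections/{collectionId}/map?bbox=4.2,50.7,4.5,50.9&width=800&height=600,
    into a WMS 1.3.0 GetMap URL. CRS:84 keeps the bbox in lon/lat order; other CRSs
    may require axis reordering under WMS 1.3.0."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": LAYER,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return f"{WMS_ENDPOINT}?{urlencode(params)}"

print(maps_request_to_getmap(bbox=(4.2, 50.7, 4.5, 50.9), width=800, height=600))
```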
Ordnance Survey (OS) is the National Mapping Agency of Great Britain. OS publishes printed and digital maps, as well as offering access to the maps and data through a variety of APIs. In September 2022, OS launched the OS NGD API suite of products that implement a number of OGC API Standards. The Web Mapping Code Sprint therefore provided an opportunity for OS to directly support the advancement and implementation of the OGC API Standards on which the new OS NGD API products are built. The code sprint also provided an opportunity for OS engineers to directly engage with the editors of the Standards. Such access to editors and SWG members greatly accelerates development of applications.
Several more OGC Code Sprints are planned for the year 2023. To keep up to date with the latest plans, please visit [https:]]
The post Building the Building Blocks for the Future of Location: The November 2022 OGC Web Mapping Code Sprint appeared first on Open Geospatial Consortium.
-
16:42
OGC Seeks Public Comment on adoption of new version of CityJSON as Community Standard
sur Open Geospatial Consortium (OGC)OGC Seeks Public Comment on adoption of new version of CityJSON as Community Standard
CityJSON is a web-friendly encoding of the CityGML data model, which is used in Digital Twins and other applications pertaining to built and natural environments. The public comment period ends 1 February 2023.
21 December 2022: The Open Geospatial Consortium (OGC) seeks public comment on an updated version (v1.1) of the CityJSON Community Standard. Comments are due 1 February, 2023.
The CityJSON v1.1 Community Standard Justification Document outlining the many changes is available for review and comment on the OGC Portal. The complete CityJSON v1.1 specifications are available on the CityJSON website.
CityJSON defines ways to describe, in a JSON encoding, most of the common 3D features and objects found in cities (such as buildings, roads, rivers, bridges, vegetation, and city furniture) and the relationships between them. It also defines how to encode different standard levels of detail (LoDs) for the 3D objects in JSON, which allows users to represent different resolutions of objects for different applications and purposes. A list of applications and use-cases for CityJSON is available on the CityJSON website.
The aim of CityJSON is to offer an alternative to the GML encoding of CityGML, which can be verbose and therefore complex to work with. CityJSON aims at being easy-to-use, both for reading datasets, and for creating them. It was designed with programmers in mind, so that tools and APIs supporting it can be quickly built. It was also designed to be compact, typically compressing publicly available CityGML files by 6x.
A CityJSON file describes both the geometry and the semantics of the city features of a given area. A CityJSON object, representing a city, is as ‘flat’ as possible: the hierarchy of CityGML has been flattened out and only the city objects that are ‘leaves’ of this hierarchy are implemented. This considerably simplifies the storage of a city model without any loss of information.
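To give a feel for this flattened structure, here is a small, purely illustrative CityJSON-like object written as a Python dictionary: one toy building with a single LoD1 geometry. It is not a normative sample, and real files should be validated against the CityJSON v1.1 specification.

```python
import json

city_model = {
    "type": "CityJSON",
    "version": "1.1",
    # Integer vertices are combined with this transform to recover real-world
    # coordinates, which keeps files compact.
    "transform": {"scale": [0.001, 0.001, 0.001], "translate": [103000.0, 25000.0, 0.0]},
    "CityObjects": {
        "building-1": {
            "type": "Building",
            "attributes": {"yearOfConstruction": 1998},
            "geometry": [
                {
                    "type": "MultiSurface",
                    "lod": "1",                      # level of detail stored per geometry
                    "boundaries": [[[0, 1, 2, 3]]],  # indices into the shared vertex list
                }
            ],
        }
    },
    "vertices": [[0, 0, 0], [1000, 0, 0], [1000, 1000, 0], [0, 1000, 0]],
}

print(json.dumps(city_model, indent=2))
```

Note how every city object sits directly under CityObjects and geometries index into one shared vertices array: this is what makes the encoding flat and compact.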
CityJSON v1.0 was accepted as an OGC Community standard in August 2021. CityJSON version 1.0 is a JSON-based encoding for a subset of the OGC CityGML data model version 2.0.0. The candidate version 1.1 encodes the OGC CityGML 3.0 Conceptual Model Standard.
The candidate CityJSON v1.1 standard is available for review and comment on the OGC Portal. Comments are due by 1 February 2023 and should be submitted via the method outlined on the CityJSON v1.1 Community Standard’s public comment request page.
The post OGC Seeks Public Comment on adoption of new version of CityJSON as Community Standard appeared first on Open Geospatial Consortium.
-
16:41
OGC Seeks Public Comment on Creating new GeoDataCube Standards Working Group
sur Open Geospatial Consortium (OGC)The new Standards Working Group will improve data interoperability of GeoDataCubes for Analysis Ready Data. The public comment period for the GeoDataCubes SWG will end 19 January 2023.
December 19, 2022: The Open Geospatial Consortium is seeking public comment on the creation of a GeoDataCube Standards Working Group (SWG). The GeoDataCube SWG will enhance the interoperability between existing datacube solutions, simplify the interaction with different datacubes, and facilitate the integration of data from multiple datacube sources. By following a user-centric approach, the SWG will develop solutions that meet the needs of scientists, application developers, and API integrators.
The goal of the OGC GeoDataCube SWG is to create a new API specifically to serve the core functionalities of GeoDataCubes such as access and processing and to define exchange format recommendations, profiles, and a metadata model. The SWG also aims to analyze usability of already existing Standards and identify use cases.
Similar to other OGC APIs, the GeoDataCube SWG will create this new standard from existing building blocks such as existing geospatial Standards, previous OGC innovation initiatives, and other developer resources in a very use-case driven approach, i.e., with a small core and possible extensions. This will allow for interoperability across future OGC Standards.
With regard to existing and emerging OGC Standards, the working group may look specifically at:
- OGC API – Environmental Data Retrieval: A family of lightweight interfaces to access Environmental Data resources.
- OGC API – Coverages: Defining a Web API for accessing coverages that are modeled according to the Coverage Implementation Schema.
- OGC Analysis Ready Data SWG products: proposed Standards to describe specific product types that are often implemented as GeoDataCubes.
- OGC API – Processes: Supporting the wrapping of computational tasks into executable processes that can be offered by a server through a Web API.
- Zarr: An OGC Community Standard for the storage of multi-dimensional arrays of data (see the sketch after this list).
- GeoTIFF and Cloud Optimized GeoTIFF: A format used to share geographic image data.
- Hierarchical Data Format (HDF5): A set of formats designed to store and organize large amounts of data.
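As a rough illustration of the kind of chunked, multi-dimensional array storage Zarr provides, here is a minimal sketch using the open-source zarr-python package; the array shape, chunking, dimension semantics, and file path are arbitrary choices for this example rather than anything mandated by the Community Standard.

```python
import numpy as np
import zarr

# Create a chunked 3-D array on disk, e.g. time x latitude x longitude.
cube = zarr.open(
    "datacube.zarr", mode="w",
    shape=(365, 1800, 3600), chunks=(1, 600, 600), dtype="f4",
)

# Write one "time slice" without loading the whole cube into memory.
cube[0, :, :] = np.random.rand(1800, 3600).astype("f4")

# Read back a small spatial window from that slice.
window = cube[0, 100:110, 200:210]
print(window.shape)  # (10, 10)
```

Because each chunk is stored and compressed independently, clients can read or write small windows of a very large datacube without transferring the whole array, which is one reason such formats are attractive building blocks for GeoDataCube access.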
The GeoDataCube SWG will follow an agile methodology with the goal of creating a first core Standard within the first year. Subsequent iterations may add additional functionality. The GeoDataCube SWG will start with a use case collection and analysis phase that further informs the selection of additional starting points or other work to be considered. The targeted use cases shall reflect real-world scenarios, though they should allow for a rapid implementation of the GeoDataCube standards without adding unnecessary complexity.
The OGC GeoDataCube SWG plans to meet at the OGC Member Meeting in Frascati, Italy, hosted by the European Space Agency (ESA) during the week of 20 February, 2023. Additional information about the meeting will be provided through OGC mailings.
The public comment period for the GeoDataCubes SWG will end 19 January, 2023, and the page is available at [https:]] .
The post OGC Seeks Public Comment on Creating new GeoDataCube Standards Working Group appeared first on Open Geospatial Consortium.
-
17:45
A recap of the 124th OGC Member Meeting, Singapore
sur Open Geospatial Consortium (OGC)Tags: Member Meeting
From October 3-7 ‘22 (and a little bit beforehand), more than 100 experts from across industry, government, and academia converged on the Lifelong Learning Institute in Singapore (with 150 more joining virtually) to strengthen global collaboration and attend OGC’s 124th Member Meeting. A big “thank you” goes out to our dedicated members that either attended in-person, or juggled lives across multiple timezones to attend virtually.
The meeting was sponsored by OGC Principal Member, Singapore Land Authority (SLA), and the Maritime and Port Authority of Singapore (MPA). The meeting carried the theme “Digital Twins: Land and Sea,” and was held in conjunction with the Singapore Geospatial Festival operated by GeoWorks.
Being our second “return to in-person” meeting really drove home how much more powerful they are compared to virtual, with a palpable energy across the many meetings, sessions, and events. Indeed, in the 8 years I’ve been the OGC Technical Committee Chair, I haven’t seen a meeting that generated more votes to approve Standards than the most recent one in Singapore. It’s great to be back in person!
Alongside the usual assortment of Standards Working Group (SWG) and Domain Working Group (DWG) meetings, the Member Meeting also saw several special sessions, such as: an Analysis Ready Data ad hoc; Digital Twins and the Marine Domain Special Session; the next Metaverse Special Session; an Urban Digital Twins Summit and ad hoc; a Land Administration Workshop; an OGC Start-ups and Scale-Up Special Session; and a meeting of the OGC Asia Forum. Throughout the various meetings and events, there was clear recognition that geospatial and “location” is everywhere – and not only that, but that it’s foundational to Digital Twins and related technologies.
Social events during the week included the usual welcome reception and ice-breaker on Monday evening, a networking dinner held at the delicious Majestic Bay Seafood Restaurant on Wednesday, and a Diversity Luncheon sponsored by OGC Principal Member AWS on Thursday.
Two areas of focus during the Member Meeting included OGC APIs and OGC’s ongoing work in Climate Resilience.
OGC APIs: The pace of release of our new OGC API Standards continues to increase. Multiple parts of OGC API – Features are now in final stages, OGC API – Tiles has its first part approved, and OGC API – Processes and OGC API – Environmental Data Retrieval have both been in use for over a year. All of these Standards are being developed and released with multiple implementations in the marketplace. Several other APIs are soon to be out for public comment and OGC is arriving at a model for documentation and governance of the building blocks upon which these APIs are constructed. The building block approach to developing implementations will be a new paradigm for Standardization – and one that we’re very excited about.
Climate Resilience: The Climate Resilience DWG is now active (and open to the public: join the mailing list here) and the Call For Participation in the Climate Resilience Pilot closes soon (don’t miss the Q&A Session on Nov 8 – join the mailing list for more info). These efforts will consider real-world use cases and investigate the role of location technology in addressing climate-related impacts on society. Numerous other OGC Working Groups also consider elements that contribute to climate science, so a forum in which these groups can collaborate will be very valuable.
A Joint Opening
The opening of the Member Meeting was held jointly with the Singapore Geospatial Festival, and included:
Welcoming remarks from Colin Low, Chief Executive of the Singapore Land Authority; Captain M. Segar, Assistant Chief Executive of the Maritime and Port Authority of Singapore; and Dr. Nadine Alameh, CEO of OGC.
Following this, Dr. Amy Khor, Senior Minister of State for the Ministry of Sustainability and the Environment & Transport of Singapore, provided an Opening Address that highlighted the use of geospatial information and technology as critical components that have enabled Singapore’s economy and quality of life.
Trevor Taylor, Senior Director, Member Success and Development at OGC, then revealed the next Phase of the Federated Marine Spatial Data Infrastructure Pilot to undertake an Innovation Challenge for integration of terrestrial, maritime, and cadastral geospatial information (more informally, “land-sea interfaces”) as demonstrated in the context of Singapore.
Videos of Nadine’s opening remarks on ‘What OGC does’ and my overview of ‘How OGC does it’ are publicly available via OGC’s YouTube channel. OGC Members can access the rest of the presentations and recordings on this page in the OGC Portal.
Special Sessions
Attendees of the 124th OGC Member Meeting, Singapore
The Analysis Ready Data ad hoc session explored the interest among OGC members in developing a multi-part Standard to address the framework and domain-specific parameters for generating analysis-ready data from Earth Observations. The work is intended to be jointly undertaken with ISO/TC 211. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Digital Twins and the Marine Domain Special Session: Per the overall theme of this Member Meeting, the Marine Domain Working Group (DWG) held a half-day session on Digital Twins and the Marine Domain. The session began with a summary of the technological and policy environments in which marine digital twins must be framed, then included more detailed highlights of past and ongoing initiatives being operated under OGC’s Innovation Program. A Digital Twin Challenge focused on Singapore is being proposed. Participants then engaged in a discussion of future activities to be undertaken by the Marine DWG, including some in concert with partner organizations. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Metaverse Special Session: The Metaverse discussions in OGC are converging on a proposal to create a new DWG to address the various OGC activities in the Metaverse and coordinate with affiliated organizations, such as the Metaverse Standards Forum, of which OGC is a founding member. OGC members are already working actively in metaverse services now, such as were demonstrated in the session by Hexagon, Esri, and Cesium. Clearly, the Metaverse must include digital twins, and the OGC CDB Standard has relevant capabilities. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Urban Digital Twins Summit: The past two OGC Member Meetings have been building interest and focusing the landscape for further work on Urban Digital Twins. The Smart Cities DWG is rechartering to address the topic. The DWG held an Urban Digital Twins Summit to highlight perspectives from members’ organizations on what type of work is occurring or is forecast to happen in the domain as well as to identify where OGC can best contribute to common enablement of Urban Digital Twin infrastructure for a variety of use cases. The Summit was linked to further exploration in the 3DIM DWG and MUDDI SWG. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Land Administration Workshop: The Land Administration DWG hosted an Interoperability Workshop to detail real examples of implementations of the ISO Standards for Land Administration and to discuss the encodings and links necessary to enable replicable land administration practices. The two blocks of content provided presentations on implementations of the Standards followed by discussion of successes and remaining work. OGC Members can access the presentations and a recording on this page in the OGC Portal.
OGC Start-ups and Scale-Up Special Session: OGC continues to attract new start-up companies and those scaling to more significant market presence. Five OGC small business members highlighted their capabilities: ALTZ Technologies, i-bitz, Duality Robotics, XYZT, and KorrAI. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Most OGC Member Meetings include a session dedicated to the local regional forum. The OGC Asia Forum met on Friday with a wide variety of presentation topics from throughout the region. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Today’s Innovation, Tomorrow’s Technology, and Future Directions
Due to its broad applicability, the popular Future Directions session runs unopposed on the schedule so that all meeting participants can attend. At this meeting the session focused on Reference Architectures. Dr. Gobe Hobona of OGC introduced the session and was then followed by interactive presentations to gauge member interest in advancing Reference Architectures.
Dr. Sam Meek (Helyx), Dr. Ingo Simonis (OGC), and Rob Atkinson (OGC) jointly presented the considerations made to date on establishing a Reference Architecture or references to architectural elements that can be used in Standards and Standards-based architectures. The topics included the following:
- Modernization of the OGC Reference Model (ORM) in the light of its original purpose and how a new ORM might be better used for assisting in architectural guidance. A discussion with members highlighted that OGC Standards fit within a larger Information Technology (IT) ecosystem and should not necessarily be described fully independent of that ecosystem.
- A brief summary of ongoing discussions in the OGC Architecture Board (OAB) about the future of the OGC Abstract Specification, specifically whether some Topics should be retired and new Topics added that underlie newer OGC Standards and reflect modern IT practice.
- The question was raised whether the ORM should document architectural elements and patterns or whether the ORM should drive architectural decisions in developing Standards and implementations. The ORM is forecast to become more prescriptive.
- Finally, Rob Atkinson discussed the practical approach to describing and relating architectural components via a knowledge graph. Such an approach is prototyped using OGC’s Definitions Server. Some further information about the OGC Definition Server is available here.
OGC Members can access the presentations and a recording on this page in the OGC Portal.
Closing Plenary
For this meeting, and from here on out, the Closing Plenary has been restructured to fall across two sessions – Important Things and the traditional Closing Plenary.
Important Things: this session started with a rapid, 15-minute summary of the entire meeting week by Scott Simmons. Slides and content from a large number of Working Group sessions were included. OGC Members can access the presentation on this page in the OGC Portal.
The Important Things session then featured a discussion around the question of “what is the difference between the metaverse and a digital twin?” There was a lively conversation amongst members to define each term and what was included in the scope of the term. The general consensus is that digital twins represent items that can be found in the real world (imagined or not) and that the metaverse is the environment in which actions might occur on or around digital twins. There were far more subtleties in the conversation than can be summarized in this blog post, but notes from the session are posted publicly on the OGC Member Meeting Topics GitHub Repo.
Closing Plenary: Two new Principal Members of OGC, the General Authority for Survey and Geospatial Information (GASGI), Saudi Arabia, and the National Department of Agriculture, Land Reform and Rural Development, South Africa, each presented their duties and use of Standards. These presentations were followed by reports from the Working Groups that covered the outcomes of the last few days.
Thank you to our community
All in all, our 124th Member Meeting was a big success. It was wonderful seeing members interacting, collaborating, and driving technology and standards development forward. It’s especially exciting as it comes at a time when geospatial is truly everywhere. Once again, thank you to our members for their time and energy, as well as their dedication to making OGC the world’s leading and most comprehensive community of location experts.
Be sure to join us for the 125th Member Meeting, happening late February, 2023. Registration and further info will be available soon on ogcmeet.org. Sponsorship opportunities are also available – contact us for more info. Subscribe to the OGC Update Newsletter to stay up to date on all things OGC including when registration goes live for our Member Meetings.
-
16:01
Has the Edge dissolved itself already? Or is the Edge the new Cloud?
sur Open Geospatial Consortium (OGC)Tags: Edge computing, interoperability, vendor lock-in
Visiting the Edge Computing World conference last week, I observed a number of interesting aspects that I would like to share here. First of all, the reason for my title is my observation that technical terms such as Edge or Cloud are used rather freely these days, almost causing the terms to become meaningless. Almost any aggregation node in a distributed architecture was called a “Micro Cloud” or “Cloud at the Edge.”
In doing so, ignorance of some key characteristics of the cloud, scalability and elasticity first and foremost, apparently mattered little. From the perspective of the edge node, the aggregation node does not offer any characteristic cloud functions (the aggregation node is basically a black box, an aggregation and processing unit without any other properties). From the point of view of nodes located higher up in the hierarchy, the aggregation node also does not offer any characteristic cloud functions. Of course, additional edge nodes cannot simply be created; after all, we are usually talking about hardware here and not about virtual capacities such as additional virtual machines.
It makes little sense to reduce Edge and Cloud to such a small subset of their properties, as both terms then degenerate into empty shells rather than meaningful concepts. If one does so, however, the question arises as to whether Edge is already in the process of dissolution, or whether we need to redefine terms (although there is actually little reason for this in purely factual terms) to reflect the changed view of the various layers within the Edge-Cloud continuum. This raises the suspicion that business-motivated forces are at play.
Edge Interoperability? Or Vendor Lock-In?
This brings me to another aspect: interoperability. With traditional service providers now selling hardware (e.g., Amazon Snow) and traditional hardware providers selling services (e.g., Schneider Electric), the number of players on both sides increases. This bears the risk of reduced interoperability, as the new full spectrum players sell the advantages of “homogeneous solutions” to their customers. As long as one remains within a system, there will certainly be advantages, since hardware and software are coordinated with each other and can fall back on system-specific exchange mechanisms. However, vendor lock-in is inevitable when vendors define their own exchange formats, interfaces, and conceptual models. As soon as different systems have to be integrated, the development of corresponding bridges or transformations will be unavoidable.
With the growing number of systems and corresponding platforms (which are two other terms that are frequently used interchangeably), the number of platform-specific formats and interfaces increases – and interoperability suffers. What has been achieved in other domains, such as Earth Observation – where agreements on standardized interfaces and data models have boosted interoperability – is still somewhat new to the Edge community. The Edge community is in the very early stages of moving toward interoperable (perhaps even open) systems that significantly simplify the generation of complex workflows across system or platform boundaries – or enable them in the first place. Going beyond individual systems is unavoidable: multi-system workflows enable deep insights into domain-specific systems or environments and are necessary to holistically address the grand challenges of our century, such as our changing climate.
Sustainable solutions for the greater good
There is currently still a lot of money to be made with custom platforms. It remains to be seen to what extent these platforms will be suitable for addressing the major challenges of our century. Edge is in a gold-rush mood and I don't begrudge anyone developing business successes from it. However, the world is extremely complex and I doubt that this complexity can be sufficiently taken into account with the current systems trimmed for depreciation. I heard about examples of fish farms running fourteen parallel Edge systems to monitor the status of the farm. This is fourteen parallel dashboards. Other organizations report that they maintain over 100 software solutions to monitor the health status of manufacturing machines. Most of these are not interoperable, which results in additional costs as soon as several machines form a unit that needs to be monitored as a system. So, where do we stand with Edge? With over 270 trillion USD projected revenue over the next 30 years (numbers McKinsey reported at the conference), climate change alone will produce many new unicorns. Let's hope that interoperability doesn't fall by the wayside – or that a sufficient number of these unicorns make their profit with sustainable solutions that contribute to the greater good.
-
14:59
Reflections on the 2022 Joint OGC & ISO Code Sprint - The Metadata Code Sprint
sur Open Geospatial Consortium (OGC)
Tags: OGC API, ogcapi, Sprint, STAC, OGC API - Records, JSON-FG, OGC API - Features, ISO
Over the past two decades, standards such as ISO 19115:2003 and the OGC Catalogue Service for the Web (CSW) have been integrated into several Spatial Data Infrastructure (SDI) initiatives at national and international levels. These standards leveraged the Extensible Markup Language (XML), which, at the time, was the primary encoding for data exchange in IT. In recent times, however, the uptake of JavaScript Object Notation (JSON) and Web Application Programming Interfaces (APIs) has necessitated the modernization of existing metadata and catalog approaches.
In November 2021, the Open Geospatial Consortium (OGC) and Technical Committee 211 (TC 211) of the International Organization for Standardization (ISO) held their first joint code sprint. The success of that first joint code sprint provided the foundation for a second joint code sprint, held September 14-16, 2022. The second joint code sprint, named the 2022 Joint OGC and ISO Code Sprint – The Metadata Code Sprint, served to accelerate the support of open geospatial standards that relate to geospatial metadata and catalogs. The code sprint was sponsored by Ordnance Survey (OS) at the Gold level and Geonovum at the Silver level. Unlike the first code sprint, which was fully virtual, this sprint was held as a hybrid event, with the face-to-face element hosted at the Geovation Hub in London, United Kingdom.
The code sprint focused on the following group of specifications:
- OGC API - Records candidate Standard
- ISO 19115 metadata Standards (i.e., ISO 19115-1, ISO 19115-2, ISO 19115-3)
- OGC Features and Geometries JSON (JSON-FG) candidate Standard
- Spatio-Temporal Asset Catalog (STAC), which leverages the OGC API - Features Standard
The discussions during the code sprint covered topics such as harmonization of STAC and OGC API - Records; harvesting of metadata to populate instances of OGC API - Records; the possibility of a JSON-FG encoding for OGC API Records and STAC; the possibility of a JSON encoding of ISO 19115; and others.
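To make the catalog discussion more concrete, below is a minimal, hedged sketch of the kind of item search that both STAC APIs and OGC API - Records implementations aim to support. The endpoint URL and collection id are placeholders invented for illustration, not services from the sprint.

```python
import requests

STAC_API = "https://example.com/stac"      # hypothetical STAC API root

# Search for items by collection, bounding box, and time window.
body = {
    "collections": ["sentinel-2-l2a"],     # illustrative collection id
    "bbox": [5.5, 52.0, 6.5, 52.5],
    "datetime": "2022-09-01T00:00:00Z/2022-09-16T00:00:00Z",
    "limit": 5,
}
resp = requests.post(f"{STAC_API}/search", json=body, timeout=30)
resp.raise_for_status()

# The response is a GeoJSON FeatureCollection of catalog records (items).
for item in resp.json()["features"]:
    print(item["id"], item["properties"]["datetime"])
```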
The demonstrations showcased at the end of the code sprint included client-side and server-side implementations of OGC API - Records, JSON-FG, STAC, and ISO 19115 metadata. A high-level overview of the sprint architecture is shown below.
A high-level overview of the architecture of the OGC ISO 2022 Metadata Code Sprint
The code sprint successfully facilitated the development and testing of prototype implementations of OGC and ISO Standards that relate to geospatial metadata and catalogs. The code sprint also enabled the participating developers to provide feedback to the editors of candidate standards. The code sprint therefore met all of its objectives and achieved its goal of accelerating the support of open geospatial standards that relate to geospatial metadata and catalogs.
The sprint participants made the following recommendations for future innovation work items:
- Initiatives to facilitate implementation of JSON-FG (e.g. three-dimensional (3D) data, cadastral data, etc.)
- Initiatives to facilitate implementation of catalogs
- Prototyping of tools for creating metadata (e.g. the automated STAC metadata crawler and dataset tagger demonstrated during the sprint)
The sprint participants also made the following recommendations for activities that the Standards Working Groups should consider:
- Outreach for promoting JSON-FG
- Code Sprint for designing profiles of JSON-FG for different communities of interest
- Documentation of the different roles of catalogs and API, as well as guidance on when to use them
- Code Sprint on versioning, possibly involving both OGC API - Records and OGC API - Features
- Exploring how to move GeoDCAT forward within OGC
This was the first hybrid code sprint (consisting of both in-person and remote elements) organized by OGC in more than two years, due to the pandemic. A record number of participants registered to attend the code sprint, exceeding pre-pandemic registration numbers. There were, however, more remote participants than in-person attendees. This suggests that there continues to be significant interest in code sprints, and that the online collaboration environment that OGC uses in code sprints should continue to be used post-pandemic.
From OGC, ISO, and Sprint Sponsors Ordnance Survey and Geonovum, we send a big thank you to everyone who participated. We look forward to seeing you at the next code sprint, the Web Mapping Code Sprint, from November 29 to December 1, 2022, in Brussels, Belgium.
Further information on the Sprint is available on the Metadata Code Sprint wiki.
-
18:44
Bringing the Heat in Madrid: a recap of our 123rd Member Meeting
sur Open Geospatial Consortium (OGC)
Tags: Member Meeting
OGC’s 123rd Member Meeting – our long-awaited return to in-person(!) – was held in Madrid, Spain, from June 13-16, 2022. And even with heatwave temperatures approaching 40°C (104°F), it was truly wonderful to be there among our members and broader community once again. We’re all grateful for the connections that teleconferencing and video calls were able to maintain during the pandemic lockdowns, but I don’t think I’m alone when I say that they’re no substitute for seeing people in real life.
The meeting was sponsored by the EU Satellite Center (SatCen) and recognized the 30th anniversary of SatCen as an organization. More than 150 people attended the OGC Member Meeting in person, with another 200+ virtual. Attendees included key standards leaders and regional experts from industry, academia, and government.
The Member Meeting featured the usual assortment of Standards Working Group (SWG) and Domain Working Group (DWG) meetings, as well as special sessions, social events, and all the impromptu conversations, break-aways, sight-seeing, and general interaction that comes with (finally) being in-person again.
The social events included a lively networking reception and ice-breaker on Monday evening, an inspiring Women in Geospatial Breakfast on Wednesday morning, and the OGC Member and VIP Dinner, held at the delicious Restaurante Amicis La Terraza in central Madrid - with a most welcome large courtyard to stay cool(ish..) as the hot day turned into a warm evening.
Two strong areas of focus during the Member Meeting were the Metaverse/Digital Twins and Marine.
Metaverse & Digital Twins: These topics are intrinsically related, and during the Member Meeting there was a Metaverse Special Session (see below) as well as a Digital Twins Coordination Session on top of the Digital Twins DWG meeting. OGC will continue its strong focus on the Metaverse & Digital Twins – especially with an Urban Digital Twins Summit in Singapore at the 124th OGC Member Meeting in October. Industry alignment around core elements of the Metaverse continues, and OGC is an important part of these coordination activities. Expect to see linkages to other work in OGC and other organizations (such as the newly launched Metaverse Standards Forum, of which we are a founding member) as we identify the next steps to achieve interoperability in the Metaverse.
Marine: OGC’s Federated Marine Spatial Data Infrastructure Pilot continues to be successful and to add new phases of work with additional sponsors and topics. The Marine DWG is coordinating closely with the International Hydrographic Organisation (IHO) and the United Nations Working Group on Marine Geospatial Information to develop partnerships to test against real-world use-cases in this pilot and other projects. The next Member Meeting will include a special session or workshop on marine data integration.
The Kick Off
OGC CEO Dr. Nadine Alameh presenting at the Opening Plenary
To start the week, the Kick-Off Session opened with a welcome from OGC’s CEO, Dr. Nadine Alameh. Nadine provided “OGC by the numbers” to highlight the Consortium’s success over the past few years and mapped out where we are going. She also encouraged engagement from the meeting participants, many of whom were first-time attendees.
Next up, a keynote presentation from Lucio Colaiacomo of SatCen highlighted SatCen’s work and use of OGC Standards – as well as celebrating 30 years of activity.
Following this, our regular “fireside chat” featured Dr. Sofie Haesevoets, Senior Product Manager at OGC Principal Member Hexagon, and Dr. Nadine Alameh, highlighting Hexagon’s use of open Standards and the business value gained from their use.
Trevor Taylor, OGC Senior Director of Member Success and Development, welcomed the newest members in OGC and gave an overview of the success in recruiting and retaining members in a difficult year.
Finally, Dr. Ingo Simonis, OGC Chief Technology Innovation Officer, provided a tour of cloud-native innovation activities that have been – and continue to be – performed by OGC and its members under a number of Innovation Initiatives. Ingo described how OGC is now working on Function as a Service (FaaS) to cap all of the other service-centric cloud businesses. OGC Members can access Ingo’s presentation on the OGC Portal here. If you haven't already, you should also check out OGC Chief Standards Officer Scott Simmons' recent blog that provides an overview of The Latest on Cloud-Native Geospatial Standards in OGC.
Special Sessions
OGC Chief Technology Innovation Officer Dr. Ingo Simonis discusses some of the benefits of Cloud Native Geospatial
The week also saw several Special Sessions, including the Metaverse is Geospatial, Startups, Integrated Technologies for Climate Resilience, a Digital Twins coordination session, a Developer Workshop, and both the Europe Forum and the Iberian and Latin American Forum. Let’s go through each.
The Metaverse is Geospatial: It is impossible to represent a virtual rendition of our (or any) world without considering the importance of location. Those location/geospatial components include digital twins and semantic relationships between modeled objects and other data. The session offered presentations on the digital twin aspect of the linkage offered by geosemantics and continued OGC’s push toward anchoring the real world in the virtual. Some of the session presentations and a recording are available to OGC Members on the OGC Portal here. For more about the relationship between OGC, the Metaverse, and broader geospatial, be sure to check out our recent blog post with the same name: The Metaverse is Geospatial.
OGC Startup members are helping in “Setting the Standards.” Seven OGC small business members highlighted some of their work and were joined by industry veterans from Huawei, SpaceTec Partners, the Location Based Marketing Association, and SparkGeo. OGC Members can access the presentations from the session on the OGC Portal here.
Climate Resilience: OGC is preparing to launch a Climate Resilience Domain Working Group (DWG). Dr. Nils Hempelmann of OGC brought together six speakers to discuss topics such as the workflow chain from the provenance of climate data to creating decision-ready data, and the use of artificial intelligence. OGC Members can access the presentations from the session on the OGC Portal here.
Digital Twins Coordination Session: OGC has a number of working groups discussing digital twins for their specific domains. The general landscape of digital twins was discussed in the Future Directions session (see below), so the Digital Twins Coordination Session focused on urban digital twins and the coordination necessary between working groups to work toward a common set of practices and Standards. OGC’s Dr. Josh Lieberman introduced the participants to the Urban Digital Twins Location Powers event outcomes and then each relevant working group presented their aspects of digital twin conception and modeling. The meeting concluded with plans to hold a more substantive summit in October at the next OGC Member Meeting. OGC Members can access the presentations from the session on the OGC Portal here.
OGC now operates a Developer Workshop at each Member Meeting. In Madrid, the workshop was held on Friday at the offices of OGC Technical Member Carto. The theme of this workshop was “Cloud-Native Geospatial,” with tutorials and discussions on GeoParquet, Earth Observation application packages, STAC, COG, and more. Details on the program are available on OGC’s GitHub.
The Europe Forum held a session with a diverse assembly of speakers addressing topics including the OGC Disaster Pilot, the UK Geospatial Standards Register, imagery-derived digital twins, and space technology. OGC Members can access the presentations from the session on the OGC Portal here.
The Iberian and Latin American Forum met on Friday with a program focused on the interoperability enabled by the OGC API Standards. The majority of the meeting was held in Spanish to accommodate local participants. OGC Members can access the presentations from the session on the OGC Portal here. The session is also available to view on YouTube.
Today’s Innovation, Tomorrow’s Technologies, and Future Directions
OGC's Director of Product Management, Standards, Dr. Gobe Hobona opens the always fascinating Future Directions session
Tuesday morning’s Future Directions session ran unopposed on the schedule so that all meeting participants could attend this always fascinating session. This meeting’s session focused on “Digital Twins of the Environment.” Dr. Gobe Hobona of OGC introduced the session and was followed by several speakers and a panel.
Dr. Ingo Simonis and Rob Atkinson of OGC described reference architecture aspects of digital twins. Understanding that reference architectures quickly lose value if they become stale, the speakers previewed some concepts of an underlying knowledge base from which architectures could be defined and described.
Louis-Martin Losier of OGC Technical Member Bentley Systems highlighted recent success stories around the integration of built and natural environment digital twins in a common platform. These examples included water management, geothermal power optimization, and air pollution modeling.
James Carey of OGC Strategic Member the UK Hydrographic Office presented the use of hydrographic data to represent digital twins of the ocean, with examples showing how data are integrated for offshore wind farm development, and highlighted the need for detailed data to model storm effects on tides more accurately.
A panel of the presenters then discussed questions from meeting participants and elaborated on some of the topics covered by other presenters.
Following the panel were three more presentations:
Piotr Zaborowski of OGC and Arne Berre of SINTEF described the ILIAD project, which is developing a model for digital twins for the ocean, with sector-specific local twins. OGC is a partner in this project, which builds upon other European Initiatives to develop digital twin architectures for the ocean.
Rashmit Singh Sukhmani of OGC Member SatSure highlighted his company’s use of remotely sensed data and artificial intelligence to develop high-resolution soil moisture models for use in agriculture, wildland fire mitigation, and more.
Tien Yen Chou of OGC Principal Member Feng Chia University presented “The Present and Future of Digital Twins in the Lifecycle Management of Disaster Early Warning,” illustrating the benefit of integrating digital twins of the landscape with real-time sensor data as a practical means to mitigate the effects of disasters.
OGC Members can access the presentations from the session on the OGC Portal here.
Closing Plenary
Closing out (most of) the Member Meeting was the Closing Plenary, which began with a Keynote presentation from Philippe Cases, the CEO of OGC Associate Member Topio Networks. Philippe shared Topio’s research on the scope and scale of the Geospatial Landscape, factoring in the work of more than 600 companies and a market that is projected to grow in value from USD 59.5 billion in 2021 to USD 209 billion in 2027. He highlighted four trends of particular interest: the explosion of data, the emergence of digital terrains, the explosion of applications/platforms, and making the geospatial ecosystem FAIR (Findable, Accessible, Interoperable, and Reusable). Don’t forget to check out Topio Networks’ Location Information News Aggregator hosted on ogc.org, which is designed to help decision-makers, industry analysts, and technology developers keep up with the accelerating pace of innovation.
A second Keynote was provided by Asif Khan, the founder of the Location Based Marketing Association and GroundLevel Insights. Asif illustrated the importance of location in business through several examples and videos. He highlighted interesting business opportunities offered by improved real-time location information and artificial intelligence in image recognition.
All in all, our return to an in-person Member Meeting was a wild success, showcasing the state of the location industry, highlighting the latest innovations coming from OGC, conducting important Standardization work, and creating and strengthening connections across the diverse OGC community.
Be sure to join us for the 124th Member Meeting, happening October 3-7, 2022, in Singapore with the theme “Digital Twins: Land & Sea.” Registration is open now.
Sponsorship opportunities are still available.
Attendees of the OGC Member and VIP Dinner enjoying the courtyard of Restaurante Amicis.
-
15:40
The Latest on Cloud-Native Geospatial Standards in OGC
sur Open Geospatial Consortium (OGC)
Tags: cloud-native geospatial, Member Meeting, OGC API, ogcapi
I thought that it would be valuable to share an update of cloud-native geospatial activities in OGC, especially in light of our recent very successful Cloud-Native Outreach Event. This blog follows up on the vision shared by OGC’s CEO, Dr. Nadine Alameh, in April 2022 and two posts by OGC’s Visiting Fellow, Chris Holmes: Towards a Cloud-Native OGC and Towards a Cloud-Native Geospatial Standards Baseline.
For many years, OGC has been working on numerous aspects of the entire ecosystem of location data in cloud environments. Starting with Testbed 10 in 2013, OGC has been publishing engineering guidance on cloud topics, such as the Testbed 10 Performance of OGC® Services in the Cloud: The WMS, WMTS, and WPS cases. From those earliest efforts, OGC members have recognized that our approach to enabling cloud-native geospatial capabilities must be inclusive of this whole ecosystem: formats, services, architectures, and operations. I summarized this perspective at the Outreach Event discussing Advances in OGC Cloud-Native Geospatial Activities and will further elaborate in this blog post.
The cloud ecosystem is more than just the platform in which the data lives and is operated upon, but also includes: the algorithms to process information; interfaces between both humans and machines; formats to store and retrieve information; the security regime for content and access; business operations and revenue models to sustain the environments; regulatory oversight which may impact what enters or leaves the cloud; and much, much more. “Ecosystem” is truly the correct term, as you can imagine an almost 1-for-1 analogue between the cloud and a natural ecosystem.
Building an Ecosystem
The remainder of this blog digs into the elements of the ecosystem that OGC is addressing: interfaces, applications, encodings, and operations.
To start, we really cannot talk about geospatial in the cloud without also talking about the web: it is through web resources that so many users interact with cloud-hosted data and functions. OGC and the World Wide Web Consortium (W3C) collaborated in 2017 to publish the Spatial Data on the Web Best Practices as a means to illustrate how to make geospatial information more web-native. Web-native makes cloud-native more approachable. It is not enough to store data in the cloud in formats that improve access and analysis performance: we also need to develop APIs to discover, process, and extract information from the cloud and guide users to be able to work across cloud instances hosted by multiple providers. The impact of web-centric Standards modernization in OGC on enabling the cloud ecosystem cannot be overstated.
These APIs include OGC API - Features, foundational to accessing feature (vector) data as well as underpinning the STAC API specification, used for rapid discovery of remote sensing and other data. Extending the catalog paradigm, OGC API - Records allows discovery of, and access to, all types of geospatial data down to the record level. The architecture of these APIs allows developers to implement “just enough geo” to get to the data they need without becoming geospatial experts.
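As a rough illustration of that “just enough geo” idea, the sketch below queries a hypothetical OGC API - Features endpoint with nothing more than a generic HTTP client; the base URL, collection id, and bounding box are assumptions made for the example only.

```python
import requests

BASE = "https://example.com/ogcapi"        # hypothetical landing page
COLLECTION = "buildings"                   # hypothetical collection id

# Features are served as GeoJSON, so a bbox-filtered GET is all that is needed.
resp = requests.get(
    f"{BASE}/collections/{COLLECTION}/items",
    params={"bbox": "5.9,45.8,10.5,47.8", "limit": 10},
    timeout=30,
)
resp.raise_for_status()

for feature in resp.json()["features"]:
    print(feature["id"], feature["properties"].get("name"))
```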
Many people identify the key use case for cloud-native capabilities to be the handling of massive data cubes, be those stacks of imagery or multidimensional scientific data sets. But just because you can store all of your data on the cloud does not mean that you want to use all of the data all of the time. OGC API - Environmental Data Retrieval (EDR) allows for complex subsetting of data cubes to return (or point to) just what is needed.
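A hedged sketch of such a subsetting request against a hypothetical EDR endpoint is shown below; the collection name, parameter name, and extents are placeholders, but the cube query pattern (bounding box plus time window plus parameter selection) follows the EDR Standard.

```python
import requests

EDR_BASE = "https://example.com/edr"       # hypothetical EDR endpoint

# Ask only for the spatial/temporal slice and parameter we need.
resp = requests.get(
    f"{EDR_BASE}/collections/air_temperature/cube",   # hypothetical collection
    params={
        "bbox": "-3.0,50.0,1.0,53.0",
        "datetime": "2022-06-01T00:00:00Z/2022-06-02T00:00:00Z",
        "parameter-name": "t2m",                        # illustrative parameter
    },
    timeout=60,
)
resp.raise_for_status()

# A CoverageJSON response lists the returned parameters under "ranges".
print(list(resp.json().get("ranges", {}).keys()))
```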
Do you need to fuse Internet of Things sensors with your massive content holdings? Leverage the OGC SensorThings API Standards. Consider that the combination of disparate data sources and dynamic sensors typically needs some degree of processing to extract useful information, so implement OGC API - Processes to work between and within multiple data sets and feeds.
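For a sense of what fusing live sensor feeds looks like in practice, here is a minimal sketch against a hypothetical SensorThings API service; the service root and filter values are made up, but the OData-style $filter/$expand query syntax is what the Standard defines.

```python
import requests

STA_ROOT = "https://example.com/sta/v1.1"   # hypothetical SensorThings service

# Fetch each matching Thing together with its Datastreams and latest Observation.
query = {
    "$filter": "properties/type eq 'air_quality'",   # illustrative filter
    "$expand": "Datastreams($expand=Observations($top=1;$orderby=phenomenonTime desc))",
}
resp = requests.get(f"{STA_ROOT}/Things", params=query, timeout=30)
resp.raise_for_status()

for thing in resp.json()["value"]:
    for ds in thing.get("Datastreams", []):
        latest = ds.get("Observations", [])
        if latest:
            print(thing["name"], ds["name"], latest[0]["result"])
```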
Processing comes in many models, but highly important these days is the use of Artificial Intelligence to distill vast quantities of data into useful information. OGC’s Artificial Intelligence in Geoinformatics (GeoAI) Domain Working Group is tackling some of the use cases and identifying targets for interoperability and even Standardization for information flow and quality. For example, the characterization of training and validation data used in GeoAI is now being standardized in the Training Data Markup Language for AI Standards Working Group. As part of this ecosystem, highly-automated data processing and analysis brings extraordinary benefits from cloud-native geospatial data.
The formats are also critically important. I referenced a couple of blogs from Chris Holmes at the top of this post where there are excellent descriptions of several cloud-native encodings in wide (or soon to be wide) use. Understand that it is not just the structure of these encodings that makes them “cloud-native,” but also the means by which the data are accessed (usually web-native, i.e., HTTP). Thus, many OGC-Standard encodings, such as GeoPackage, can be cloud-native. Below, I highlight several formats that are currently maturing in OGC.
OGC standardized GeoTIFF in 2019 and, since that time, has been working to standardize Cloud Optimized GeoTIFF (COG) for management of raster data. Starting with the COG library, OGC has been working to document the format as a formal Standard and is nearing completion of this work. A draft specification is available as the OGC Testbed-17: Cloud Optimized GeoTIFF specification Engineering Report; the Standard won’t be too far behind.
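To illustrate what “cloud optimized” buys in practice, here is a hedged sketch of reading just a small window from a large GeoTIFF over HTTP using rasterio (which relies on GDAL’s HTTP range-request support); the URL and window are placeholders.

```python
import rasterio
from rasterio.windows import Window

COG_URL = "https://example.com/scenes/scene_2022.tif"   # hypothetical COG

with rasterio.open(COG_URL) as src:
    # Only the internal tiles overlapping this 512x512 window are fetched,
    # not the whole file.
    window = Window(col_off=2048, row_off=2048, width=512, height=512)
    chunk = src.read(1, window=window)

print(chunk.shape, chunk.dtype)
```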
More complex multidimensional data has proven to be efficiently encoded in the cloud using Zarr. Zarr is also in the final voting for endorsement as an OGC Community Standard. OGC’s most recently completed Testbed evaluated the suitability of Zarr for handling geospatial data cubes in the OGC Testbed 17: COG/Zarr Evaluation Engineering Report and Zarr did just fine… as did COG.
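As a sketch of why Zarr suits geospatial data cubes, the example below lazily opens a hypothetical Zarr store with xarray and computes a statistic over one month of data, so only the chunks touched by the selection need to be read. The store URL and variable name are assumptions for illustration.

```python
import xarray as xr

STORE = "https://example.com/cubes/sst.zarr"   # hypothetical Zarr store

# Consolidated metadata is read up front; array chunks are fetched lazily.
ds = xr.open_zarr(STORE, consolidated=True)

monthly = ds["sst"].sel(time="2022-06")        # illustrative variable and period
print(float(monthly.mean().compute()))
```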
Feature (vector) data is already handled on the cloud in all types of databases that rely upon OGC Simple Features, OGC’s most widely-implemented Standard, to encode the geometry. But is this management really cloud-native, particularly with respect to streaming the data to users? Other encodings are being considered. GeoParquet is currently being incubated in OGC as a prospective cloud-native vector format. Other formats, such as FlatGeobuf, are also being considered as potential Community Standards, to join existing Standards such as Indexed 3D Scene Layers and 3D Tiles, both of which offer cloud-native capabilities, particularly with delivery of data.
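Below is a minimal sketch of the developer experience GeoParquet is aiming for, using GeoPandas to read a hypothetical file and select only the columns of interest; the file name and column names are illustrative.

```python
import geopandas as gpd

# Columnar storage means we can read just the columns we need.
gdf = gpd.read_parquet(
    "admin_boundaries.parquet",                 # hypothetical GeoParquet file
    columns=["name", "iso_code", "geometry"],
)
print(len(gdf), gdf.crs)
```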
Putting it together in the real world
You have read this far and have seen a whole bunch of references to individual Standards and specifications that address specific parts of the cloud-native geospatial ecosystem. Putting it all together requires practical application of these technologies, Standards, and specs in concert. Operation of the cloud ecosystem requires coordination of many disciplines and sometimes new architecture designs relative to our past use of monolithic systems (such as microservices and highly-composable systems). This is where the other half of the OGC is so critical. The OGC Innovation Program operates numerous initiatives each year to experiment with or pilot the capabilities listed above against real-world scenarios and deliver documentation and examples that can be re-used for many use cases.
A search of “cloud” in the Engineering Report repository returns references to 20 documents, each demonstrating practical application of the capabilities described above and more. These documents can be put in the context of the cloud-native ecosystem as illustrated below.
As you can see, the Innovation Program initiatives have touched upon many aspects of the cloud ecosystem, even if only peripherally related to location technology. These Engineering Reports reference even more work of relevance and identify specific practices that are portable across many use cases. I also recommend the recent OGC Best Practice for Earth Observation Application Package, which details packaging and deployment of Earth Observation Exploitation Platforms, generally to a cloud environment.
Development and Maturation
In summary, I’ve touched upon a lot of Standards and resources and there are many more in the OGC and through our partner organizations. Each, literally EACH, of these efforts requires considerable investment in time and resources. The dedication of OGC Members to advance this work is becoming increasingly represented in the cloud ecosystem. The fact that so many major cloud service providers (e.g., AWS, Google, Microsoft, Oracle) are OGC Members highlights the relevance of OGC’s efforts in this domain.
The Standards are being matured and we have expert guidance on deployment and management of the capabilities. Expect to see dedicated developer and implementer resources from the OGC to foster consistent use of geospatial content in cloud ecosystems. We will continue to research best practices, publish guidance, and identify capabilities offered by our members to sustain the entire location industry.
To see this work evolve in real time, be sure to attend the upcoming 123rd OGC Member Meeting in Madrid, Spain. More specifically, join our Developer Workshop on Friday, 17 June, 2022 hosted by Carto in downtown Madrid to code against the encodings, APIs, and more with experts from OGC and our member organizations.
-
13:48
The 2022 Joint OGC OSGeo ASF Code Sprint - How it went!
sur Open Geospatial Consortium (OGC)
Tags: ogcapi, OGC API, Sprint, OSGeo, ASF
Over the past decade, geospatial technologies and data have become more widespread in use and application. A key catalyst for the increased uptake of geospatial technologies is the interoperability achieved through the implementation of Open Standards. Another important catalyst for this increased uptake is the availability of Open Source software products that are able to extract, transform, analyze, and disseminate geospatial data.
Back in February 2021, the Open Geospatial Consortium (OGC), the Apache Software Foundation (ASF), and the Open Source Geospatial Foundation (OSGeo) held their first joint Open Source Software and Open Standards Code Sprint (for full technical outcomes, see the Joint OGC OSGeo ASF Code Sprint 2021 Summary Engineering Report).
The success of that first joint code sprint provided the foundation for a second joint code sprint in March this year. The 2022 Joint OGC OSGeo ASF Code Sprint, conducted between March 8-10, had the goal of accelerating the support of open geospatial standards within the developer community.
Part of the motivation for holding the code sprint in 2022 was the growing uptake of location information across the global developer communities. The code sprint brought together developers of Open Standards, Open Source Software, and Proprietary Software. The code sprint therefore provided a rare opportunity for developers across these communities to focus on common challenges, within a short space of time, and in a shared collaborative environment.
The 2022 Joint Code Sprint introduced several changes not seen during the 2021 Joint Code Sprint:
First, Discord was used to aid in collaboration. Discord allowed both chat and video communications to be offered from within the same environment. Discord also supported the creation of multiple chat channels, thereby making it possible for separate projects to have their own dedicated chat channels. Included in these channels was a dedicated chat channel for the event sponsor, OGC Strategic Member Ordnance Survey, which made it possible for sprint participants to visit the channel and ask about the sponsor’s products.
Second, the code sprint offered Mentor Streams that presented tutorials for developers who were new to using featured standards or software products.
Over a period of 3 days, the sprint participants collaborated on a variety of coding and documentation tasks, and held discussions to facilitate coordination. The sprint participants made the following recommendations for future innovation work items:
- Prototypes of catalogs that can be crawled by an application. While there are currently several searchable catalogs, no catalogs can yet be crawled by applications.
- More specification validation work for OGC API Records.
- More experiments for the Workflows extension of OGC API Processes. This could try out a variety of workflow approaches.
- Experimentation on how a processing server can interact properly with other OGC API implementations that serve data. For example, in this code sprint there was an implementation of OGC API Processes (ZOO Project) that interacted with an OGC API Features implementation (MapServer).
- Experimentation with OGC’s GeoParquet candidate standard and Apache Arrow.
The sprint participants also made the following recommendations for things that the Standards Working Groups should consider:
- To improve examples and documentation related to OGC API Records.
- To advance the development of the Executable Test Suites of OGC API Processes, OGC API Tiles, and OGC API Coverages.
The code sprint facilitated the development and testing of prototype implementations of OGC standards, including implementations of draft OGC API standards. Further, the code sprint also enabled the participating developers to provide feedback to the editors of OGC standards. Furthermore, the code sprint provided a collaborative environment for OSGeo and ASF developers to fix open issues in products, develop new features, improve documentation, improve interoperability with other libraries/products, and develop prototype implementations of OGC standards. The code sprint therefore met all of its objectives and achieved its goal of accelerating the support of open geospatial standards within the developer community.
Keep your eye out for the forthcoming Joint OGC OSGeo ASF Code Sprint 2022 Summary Engineering Report that will document the technical achievements of the code sprint, available July ‘22.
For information on OGC Sprints, including outcomes and Engineering Reports of previous Sprints, as well as info on future Sprints, visit the OGC Sprints webpage.
-
17:02
The Metaverse is Geospatial
sur Open Geospatial Consortium (OGC)
Tags: metaverse
A version of this article originally appeared in the Winter 2021 issue of GeoConnexion International Magazine.
With momentum and interest once again building around the ‘metaverse’, OGC hosted a ‘Metaverse Ad-Hoc Session’ at its virtual 121st Member Meeting in December 2021. The session saw speakers from across industry - from photogrammetry and AI-enhanced semantic remote sensing companies to geospatial, BIM, and gaming software companies - discuss how geospatial tech will inform the metaverse, how the metaverse will transform geospatial, and why open standards will be critical for the metaverse’s success.
What’s a metaverse, anyway?
But before we get too far, what even is the metaverse? I asked Patrick Cozzi, CEO of (OGC Member) Cesium, co-host of the Building The Open Metaverse podcast, and panellist at the Metaverse ad-hoc.
“Ask ten different people, you’ll get ten different answers, but what most folks are agreeing on is that the metaverse is a progression of the internet to go from something that’s 2D to fully immersive 3D,” said Patrick Cozzi. “You’ll also hear definitions around it being a persistent, virtual world that allows collaboration of any sense, from gaming, to enterprise, to DoD [Department of Defense] cases. I think it’s a super exciting time to be in geospatial as this all comes into one place.”
This lines up with the definition put forward by venture capitalist Matthew Ball, who has written extensively on the subject of the metaverse in his Metaverse Primer:
“The Metaverse is a massively scaled and interoperable network of real-time rendered 3D virtual worlds which can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communications, and payments.”
Panelists at the recent OGC Metaverse Ad-Hoc Session
But what will the metaverse look like to the end-user? First of all, virtual/augmented reality hardware won’t be mandatory: just like the internet, it will adapt to the device accessing it, whether it be 2D, 3D, small screen, big screen, headset, etc. Also like the internet, the metaverse will comprise many different interconnected 3D ‘spaces’ (like 3D websites) operated by different entities that together form the much larger metaverse concept.
Metaverse spaces will include those forming completely fabricated virtual worlds as well as those that are modelled after, or augment, the real world. Metaverse spaces will be interconnected, with users being able to cross between them, whether it’s to visit a friend, play a game, go shopping, manage a construction project, train for a new job, model a new warehouse workflow, or something else entirely.
Users may also be able to extend and affect the real world with actions and items being able to move between both. For example, items purchased or earned in a shop on a virtual High Street in the metaverse could be redeemable at its real-world counterpart, or buttons pressed in the metaverse could actuate machines or objects in the real world.
Those metaverse experiences representing the real world are the most obvious place where geospatial technologies, standards, knowledge, and best practices will play a major role. However, every metaverse space will be a massive database of physical and semantic environments that needs to be designed for efficient streaming. A metaverse space, then, can be considered an iteration of the geospatial industry’s city- or state-wide ‘digital twin’ technologies in use today for modelling & simulation, citizen engagement, and more. As such, just about any 3D Geospatial Standard will be useful in building the metaverse.
Also worth noting is that the laws of geography that underpin geospatial technologies will also apply to entirely virtual worlds: users will want maps to navigate and make sense of virtual spaces just as they do the real world. As an industry, geospatial clearly has much expertise to contribute to the creation of the metaverse.
Geospatial will be transformed by the metaverse
Despite the tropes, you won’t need a VR headset to enjoy the metaverse - it will adapt to the device accessing it.
The metaverse is the internet transformed by real-time 3D technologies, but the impact of real-time 3D is also transforming geospatial. The blurring of the lines between ‘real world’ digital twins and virtual metaverse spaces is exemplified by the integration of geospatial data into game engines, which enable the rendering of photo-realistic 3D scenes in real-time using consumer hardware.
“Game engines are really changing the game for GIS,” said Marc Petit, VP, General Manager, Unreal Engine at Epic Games, and co-host of the Building The Open Metaverse podcast, during the OGC Metaverse Ad-Hoc Session. “I think these [real-time 3D] technologies are really enabling for GIS, and the science of ‘knowing where things are’ is going to be hugely important in the metaverse.”
Philip Mielke, 3D Web Experience Product Manager at Esri, shared a similar sentiment: “We have about 4 or 5 years until the practice of GIS is fundamentally transformed by this convergence of technologies, capabilities, and expectations… We at Esri are investing a lot in game engines so that we can transmit services for consumption in [the gaming engines] Unreal and Unity.”
The sentiment that there is now a convergence of geospatial with the immersive 3D experiences of the metaverse was also echoed by Rob Clout, Sales Manager at 3D Photogrammetry company, Aerometrex, during his presentation at the Metaverse Ad-Hoc: “3D photogrammetry has become a staple input for a huge range of industries. Whether it be BIM, AEC, virtual production, or gaming, we’re starting to see 3D data really becoming prevalent pretty much everywhere.
“So, the metaverse was really just the next step for Aerometrex: it’s a culmination of what we’ve all been doing up to this point. What we’re seeing [at Aerometrex] is that the same data that’s being used for construction of the real world is now being used for construction of the virtual world. That integration of the real world and the virtual world is key: the metaverse can’t be two completely separate things.”
The integration of Geospatial data and Game Engines - in this case Cesium’s support of Epic Games’ Unreal Engine - is a crucial stepping stone toward the metaverse.
As an example of the benefits of this convergence, Patrick Cozzi discussed his experience when Cesium enabled a link between their 3D geospatial streaming platform and Epic Games’ immensely popular game engine, Unreal Engine.
“Something magical happened when we built this bridge to Unreal Engine, because I feel that we made ten years’ progress overnight. I feel like suddenly the decades of investment in games technology was unlocked for geospatial, and then likewise, all of this 3D geospatial data became available to the game technology. And that’s just one example of how when we make these open and interoperable ecosystems, we can move the field forward as fast as possible.”
Indeed, if the metaverse is all about diverse 3D experiences interoperating to form a cohesive whole, open standards and knowledge will be absolutely fundamental to its creation - just as there would be no functioning Internet without open standards, there can be no functioning metaverse without them, either.
Open Standards will underpin the metaverse
Innovation surrounding the metaverse, just like in other information technologies, will move quickly. The standards that will gain traction when building the metaverse will be the ones that can move with the pace of its innovation. OGC’s new standards development ethos, as seen in our OGC APIs, builds open standards that are modular, lightweight, and extensible - allowing them to evolve alongside technology without breaking, while providing a stable baseline upon which lasting innovations can be built.
However, being a novel technology, many of the standards that will solve problems in the metaverse won’t exist when the building starts. It is likely, then, that the open technologies and specifications that bubble up as best practice while the metaverse matures will be de facto standards. Recognising the importance of de facto standards, OGC years ago developed a nimble ‘Community Standard’ process that enables snapshots of de facto standards to be adopted by OGC so that they can benefit from the stability that official standardisation brings and can be better harmonised with other OGC Standards.
Community Standards can also form useful bridges that support the convergence of previously siloed industries and domains. 3D Tiles, for example, uses technology and know-how from geospatial and 3D graphics to provide a standard for streaming massive heterogeneous 3D datasets that developers from both industries can follow and build to. Other OGC Community Standards relevant to the metaverse include: Indexed 3D Scene Layer (I3S) for 3D streaming; Indoor Mapping Data Format (IMDF) for mapping and navigating indoor spaces; and, in the process of endorsement, Zarr, for the storage of multi-dimensional arrays of data (also known as data cubes).
OGC Community Standards can leverage the expertise of ‘outside’ industries relevant to geospatial to build a bridge between geospatial technologies and those from their industry of origin. The Community Standards process will prove useful, then, in bringing to the geospatial community the knowledge, experiences, and technologies developed by the many non-geospatial 3D and internet organisations in the early days of the metaverse.
Similarly, the liaisons and partnerships that help bring outside de facto standards in to the OGC Community Standards process will additionally serve to bring OGC Standards out to the communities that can benefit from them, and even bring those communities - and their perspectives - in to help shape Standards development and evolution.
Building the Metaverse Together
3D geospatial technologies, such as digital twins and mod-sim, will provide valuable insight, best practices, and standards for those building the broader metaverse.
It is now clear that the metaverse - the internet in real-time 3D - has never been closer. Like the internet, its creation will result in technological advancements and disruptions. Geospatial is already starting to feel this as it adopts, adapts, innovates, and integrates 3D real-time technologies such as game engines and digital twins. However, the metaverse is not assured: it will only reach its true potential if, like the internet, it is based upon open standards and technologies that are easily available to all.
“I really want to see how far we can take the metaverse,” said Patrick Cozzi, “and I believe that to take it far, fast, we need open interoperability.”
As an organisation and community that’s passionate about Findable, Accessible, Interoperable, and Reusable (FAIR) data standards, OGC will continue to: provide, design, adapt, and adopt a host of standards relevant to the metaverse; offer a neutral forum for experts from across industry to meet and share knowledge; and work as a liaison and bridge-builder between other industries involved in building the metaverse and their standards organisations.
Interested in the location aspects of the Metaverse and want to contribute your expertise and/or meet like-minded individuals? Why not attend the Metaverse DWG Session at the Digital Twins and Metaverse themed 124th OGC Member Meeting in Singapore in October 2022?
If you’re an OGC Member and interested in contributing to creating standards, best-practice, and/or documenting use-cases for Metaverse applications, keep your eye out for the upcoming vote for the creation of an OGC Metaverse Domain Working Group. If approved, OGC Members will be able to join via the OGC Portal here. Non-members will be welcomed to join the DWG, too, by contacting OGC using the form at ogc.org/contact.
OGC Members can download the complete recording of the Metaverse ad-hoc session from the OGC Portal.
Not an OGC Member? OGC Membership will bring your organization a host of benefits. Consider joining today.
-
16:47
Building a cloud-native future at OGC
sur Open Geospatial Consortium (OGC)
Tags: cloud-native geospatial, cloud, OGC's future
March 2022 marked my 3-year anniversary at OGC! As I look back at those 3 years – 2 of which occurred during the tumultuous COVID-19 pandemic – I feel proud of the accomplishments and changes that we have made during this time:
- We rebranded OGC with a new look (we also have a new website in the works – expect to see it in June ‘22).
- We shifted our messaging from ‘we develop standards’ to ‘why we develop standards.’
- We now identify ourselves as a collective problem-solving community of global geospatial experts and users committed to making geospatial/location information Findable Accessible Interoperable Reusable (FAIR) - via standards, innovation activities, and partnership building.
- We invested in strategic collaborative projects connecting people, organizations, systems, and data in areas that heavily impact our society and the future of our planet: disasters, climate change, health, and agriculture (to name but a few examples from the OGC Innovation Program).
- In the process, we invested in the global geospatial community by giving away 30% of OGC’s annual revenues to our member organizations.
- We accelerated the mainstreaming of geospatial via our OGC APIs and targeted investment in developer resources and connections.
- In the process, we also made OGC more accessible to the startup community by creating a startup membership category at a 50% discount to our associate membership rate.
- We championed new topics that are critical to our members and the community’s growth and success – like New Space, Digital Twins, Metaverse/real-time 3D, GeoAI, and Ethics.
As I embark on my fourth year at OGC, our mission continues to be that of making geospatial information FAIR at scale! And, in 2022, I can’t think of a more scalable (or impactful!) way to do that than via cloud-native geospatial. No one can deny that ‘the cloud’ is triggering a fundamental shift in how geospatial data is stored, shared, accessed, integrated, and analyzed. If we succeed in creating a foundation for cloud-native geospatial standards:
- Imagine the radical simplification of the effort and cost needed to share and use geospatial information!
- Imagine the explosion of innovation when we make the power of geospatial accessible to everyone!
As you might have guessed, I’ve spent some time with our inaugural visiting fellow, Chris Holmes, to get to this point. I’m thankful for his efforts in bringing the community together to accelerate our path to cloud-native geospatial. And we sure need every bit of support on this journey as it requires a sustained effort to bring every piece of location information to the cloud in standard formats to empower the development of the next generation of tools that will truly unlock the value of location.
Join me on this ride of a lifetime as we take geospatial and OGC to the next level! We are kicking things off with a Cloud-Native Geospatial Community Outreach Event on April 19-20 that is already breaking registration records. In many ways, the event is the kickoff of a global activity to accelerate and advance cloud native geospatial standards. Come join us!
-
17:46
A Strong Foundation for GeoAI Innovation
sur Open Geospatial Consortium (OGC)
Tags: GeoAI, Testbed-18
A version of this article originally appeared in the Autumn 2021 issue of GeoConnexion International Magazine.
Far from being a “sci-fi” tech, Artificial Intelligence (AI) already plays a crucial role in many domains and is revolutionizing existing technologies. During the last decade, AI techniques such as Machine Learning (ML) and especially Deep Learning (DL), have improved significantly due to an abundance of data coupled with advancements in high-performance computing. As with so many technology domains, these new AI capabilities have reoriented and transformed GIS and Remote Sensing, providing new solutions and greatly increasing efficiency. The application of AI technologies to solve problems experienced by the geospatial community has become known as “GeoAI.”
The geospatial science community, for example, commonly uses AI and related techniques to better harness the otherwise insurmountable volume of Earth Observation (EO) data being created for geospatial analysis across various domains - such as smart cities, environmental management, and disaster management. However, the possible applications for GeoAI are only just beginning to surface.
“I would define GeoAI as a set of methods or automated entities that use geospatial data for perceiving, constructing (automating), and optimizing spaces in which humans - as well as everything else - can safely and efficiently continue their geographically referenced activities,” said Kyoung-Sook Kim, a co-chair of the OGC GeoAI Domain Working Group.
“GeoAI can bring significant benefits to drive the next generation of service innovation in many applications, including autonomous transportation, sustainable smart city planning/implementation, augmented building and energy management, self-optimized manufacturing, epidemic outbreak prediction, personal experience augmentation, and more. However, it also faces a variety of new ethical, legal, social, and technological challenges. I believe that international standards will play a pivotal role in ensuring widespread interoperability and security benefits among the various disciplines dealing with AI.”
GeoAI is most commonly used for feature identification in Earth Observation imagery; however, much more is possible.
Innovations in GeoAI powered by open standards have lower development and implementation costs, reach the market sooner, and enable seamless horizontal and vertical integration/composition of GeoAI systems. Standards will also increase the safety of the computational approaches and algorithmic techniques that power the insights provided by AI engines.
With these benefits in mind, in 2018, OGC formed the Artificial Intelligence in Geoinformatics Domain Working Group (better known as the GeoAI DWG) to identify use-cases and applications related to AI in geospatial applications. The GeoAI DWG provides an open forum for broad discussion and presentation of use-cases for AI and its related technologies in the geospatial domain, with the purpose of bringing geoscientists, computer scientists, engineers, entrepreneurs, and decision makers from academia, industry, and government together to develop, share, and research the latest trends, successes, challenges, and opportunities in the field.
The GeoAI DWG is additionally tasked with investigating the feasibility and interoperability of OGC standards to support the use and re-use of geospatial data in AI applications, as well as describe gaps and issues that could lead to new geospatial standards.
“When we proposed the OGC GeoAI DWG, we found three main issues in applying AI technologies to geospatial domains. First, there are few large-scale benchmark training datasets like ImageNet, which is used for object recognition in computer vision projects. Even though there is a massive amount of satellite imagery and point cloud data available, most of them are not ready to use for geospatial Machine Learning tasks,” said Kyoung-Sook.
“Second, compared to the related fields of image processing and natural language processing, few open tools and workflows using GeoAI are shared in GIS and other business operations,” said Kyoung-Sook. “Organizations still pay for the full development and implementation cost when adapting a new GeoAI technique into a business case, rather than building upon established open tools and solutions.
“The third issue, which I think is the most important, is supporting trustworthy and safe GeoAI technology to support both the Earth’s and humanity’s well-being. The 2018 report from the World Economic Forum (Fourth Industrial Revolution for the Earth Series – Harnessing Artificial Intelligence for the Earth) states that the most important consideration in the development of AI, regardless of the AI stage, is to ensure sustainable benefits for humanity, which include being both ‘human-friendly’ and ‘Earth-friendly.’
“Going back to the first issue, I would like the GeoAI DWG to first solve the lack of training datasets and improve the sharing of knowledge from skilled people. This lack of data and knowledge-sharing causes misuse of AI and creates biased AI models. To address this, the DWG members have started to collect and analyze AI-related applications and use-cases in our communities.
“Recently, OGC has also started the formation of the new Sample Markup Language for Artificial Intelligence Machine Learning Standards Working Group (SampleML-AI/ML SWG) to develop a standard for documenting, storing, and sharing the geospatial sample data.”
A key component of ML techniques and processes is the use of sample data - data with known provenance, consistent metadata, and quality measurements - to consistently tune and train ML applications. The lack of consistent and known sample data is hindering advanced EO science applications, causing reproducibility issues, and making it difficult to compare results across studies.
Sample data should have sufficient metadata in a machine-readable standard format, and include general spatiotemporal information and sample data-specific attributes to facilitate data discovery and query. Due to their utility in Geospatial ML applications, many academic and industrial groups have focused on creating their own benchmark datasets.
However, in order to access and share these training datasets easily and effectively, an international standard for data schema and formats is required. One solution proposed by OGC is to develop a standardized Sample Markup Language for AI/ML - based on commonly used industry standards wherever possible - that allows users to document, store, and share geospatial sample data over the web following the FAIR data management principles of Findability, Accessibility, Interoperability, and Reusability.
The formation of the SampleML-AI/ML Standards Working Group, endorsed by the GeoAI DWG, is nearly complete. The working group will then be tasked with developing an OGC SampleML-AI/ML Standard, with initial geospatial sample data categories for remote sensing imagery, moving features (typically vehicle trajectories), and related spatial content.
Further to this, OGC’s Testbed 18 has a task dedicated to Machine Learning Training Datasets. To quote the Testbed 18 Call For Participation: “The goal of this Testbed-18 task is to develop the foundation for future standardization of TDS for Earth Observation applications. The task shall evaluate the status quo of training data formats, metadata models, and general questions of sharing and re-use. Several initiatives, such as ESA’s AI-Ready EO Training Datasets (AIREO) have developed suggestions that could be used for future standardization. Other initiatives focused on the development of training data repositories, such as the Radiant MLHub, an open-access geospatial training data repository where anyone can discover and download ML-ready training datasets.” To learn more about this task, see section 2.2. Machine Learning Training Datasets in the Testbed-18 Call For Participation.
GeoAI is already doing incredible things for the location industry. However, to truly reach its potential, we need to build a strong foundation of open standards for sharing sample data and creating open tools and workflows, and open best practices for ensuring the safe and ethical use of this powerful technology.
If you’re interested in contributing to the creation of standards and best practices, and/or documenting use-cases for GeoAI applications, consider joining the Artificial Intelligence in Geoinformatics Domain Working Group. OGC Members can join via the OGC Portal here. Non-members are welcome to join by contacting OGC using the form at ogc.org/contact.
Interested parties are also encouraged to attend the next OGC Member Meeting, where the OGC GeoAI DWG will likely meet (TBC), along with many other related OGC Standards and Domain Working Groups.
To learn more about and participate in Testbed-18, see the announcement for the Testbed-18 Call For Participation or visit ogc.org/testbed18. Applications for funded participation close March 31, 2022.
GeoAI shows great potential for use in energy management, self-optimized manufacturing, and many other industrial fields. -
13:59
Lowering the barrier of entry for OGC Web APIs
sur Open Geospatial Consortium (OGC)
Tags: Testbed-17, DAPA, Convenience APIs, ogcapi, EO, cloud, cloud-native geospatial
A version of this article originally appeared in the May/June 2021 issue of GeoConnexion Magazine under the title ‘Lowering The Barrier To Entry.’
For the last few years, OGC has been modernizing our standards to better align with web best-practices and the expectations of developers and consumers alike, resulting in our growing OGC API family of standards. Part of this effort has also been to design our standards to better take advantage of cloud infrastructure, including being able to deploy and share spatial analysis workflows across different cloud providers. This approach will benefit collaboration, transparency, and accessibility of scientific workflows - which are all cornerstones of the “Open Science” movement. I previously touched on these Earth Observation processing packages in the “App Store For Big Data” article in the July/August 2020 issue of Geoconnexion Magazine, which discussed our “Applications-to-the-data” architecture, and the Application Deployment and Execution Service (ADES) APIs.
The DAPA Convenience API
Another part of the effort to simplify Earth Observation data processing and analysis workflows is the development of the OGC Data Access and Processing API (DAPA). Developed as a draft specification during OGC Testbed-16 in 2020, and having been tested in real-world scenarios during our Testbed-17 initiative and EO Apps Pilot, DAPA is a so-called “convenience API” that allows scientists and other geospatial analysts to run several operations on Earth Observation or other data using a single API call, in turn providing the data in a form directly ready for further analysis. This differentiates DAPA from existing APIs such as OGC API - Features or OGC API - Coverages. Where the latter two are data-centric APIs with a focus on data access and subsetting, DAPA is a user-centric API that includes data access with processing. As such, it takes lots of processing burden away from the user.
DAPA does this in a way that is mostly independent of the data location - meaning that the same call can access data stored in a local file, an in-memory structure (such as an xarray), or remotely in the cloud. Ultimately, this means that an end-user can initiate the process on one archive for one set of data, then just change the URL to have the same process(es) run on an entirely different dataset.
For example, with a single DAPA API call, you can say “please give me all the data you have for this specific area and time window, with these fields and as a result of this map algebra.” A data cube is then created on the fly and delivered to you in a form ready for you to work on in your software of choice. If you run that on a Landsat archive, you could then use the same API call to reproduce the result on, say, a PeruSat-1 archive.
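To make that workflow concrete, the snippet below sketches what such a single call could look like from a Python client. The server URL, endpoint path, and parameter names are illustrative assumptions rather than the normative draft DAPA specification, which should be consulted for the actual interface.

```python
# Hedged sketch of a single DAPA-style "access + processing" request.
# The base URL, endpoint path, and parameter names below are hypothetical;
# see the draft OGC DAPA specification for the normative interface.
import requests

BASE = "https://example.org/collections/landsat-8"  # hypothetical archive

params = {
    "bbox": "5.6,50.1,6.2,50.6",          # area of interest
    "datetime": "2021-06-01/2021-08-31",  # time window
    "fields": "B4,B5",                    # bands / fields to retrieve
    "functions": "ndvi,mean",             # server-side map algebra + aggregation
}

resp = requests.get(f"{BASE}/dapa/area:aggregate-time", params=params)
resp.raise_for_status()

# The response is an analysis-ready data cube; to run the same processing on a
# different archive, only the base URL needs to change, e.g.:
# BASE = "https://other-host.example.org/collections/perusat-1"
with open("result_cube.nc", "wb") as f:
    f.write(resp.content)
```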
Reproducibility and Open Science
DAPA and ADES fit within a spectrum of different processing APIs available from OGC
The reproducibility of the DAPA calls, just like the packagability of the ADES ‘apps,’ makes them ideal for use in support of the Open Science movement. The Open Science movement aims to make scientific research and its dissemination more accessible - by professionals and amateurs alike - and to generate transparent and accessible knowledge that is shared and developed through networks of collaboration. The movement is receiving big support from OGC Strategic Members NASA and ESA, who have also been instrumental in their sponsorship of the development of DAPA and ADES. Indeed, open standards in general, due to their enabling of the interoperability required for collaboration across organizations and disciplines, play a critical role in Open Science, too.
The synergy between OGC’s mission for FAIR (Findable, Accessible, Interoperable, and Reusable) data standards and their benefits to the reproducibility of scientific research has led to the topic of ‘Identifiers for Reproducible Science’ being explored in this year’s Testbed-18 Initiative. The task shall develop best practices to describe all steps of a scientific workflow, including: input data from various sources such as files, APIs, data cubes; the workflow itself with the involved application(s) and corresponding parameterizations; and output data. By accurately describing the workflows of scientific studies, the studies can then be better reproduced and scrutinized - both hallmarks of the scientific process.
The Open Science movement’s desire to make scientific data and processes accessible by more than just scientists ties in well with OGC’s recent efforts to design standards with a strong end-user-centric perspective, rather than the data-provider centric view that has come to dominate earlier standardization work. This means simplifying and improving not just their form and function, but also their documentation.
User-friendly standards
Rather than reading Standards Documents - which are, by their nature, meticulously defined to reduce room for interpretation and therefore tedious to read - many developers favor an approach that starts with simple documentation and examples. From there, additional features are explored stepwise, with the actual standard document often being the last resource consulted. As location tech grows outside of the traditional geospatial spheres of expertise, this user-centric view becomes critical if the benefits of widespread standards adoption are to be realized.
With this in mind, work undertaken in Testbed-17 lowered the barrier of entry to implementing and accessing DAPA and other OGC APIs by creating sets of example code for both server- & client-side software, scripts for cloud deployment & installation, and best practice guides. To this end, Testbed-17 delivered the Engineering Report Attracting Developers: Lowering the entry barrier for implementing OGC Web APIs, which provides Web developers with the knowledge necessary to develop, deploy, and execute standards-based Web APIs, following a "How-To" philosophy with lots of hands-on experiments, examples, and instructions. Better yet, by providing scripts that illustrate the deployment and operation of API instances on local machines as well as across different cloud environments, the report makes mapping software components to cloud infrastructure a far smoother experience.
In addition to the documentation, code examples, and implementations meant to make the lives of users easier, OGC has also recruited its first Developer Relations (DevRel) staff member, Joana Simoes. Joana provides an interface between OGC and the developer community, with a specific look at addressing: What do developers need from OGC? Where do they struggle? What materials can we provide to help?
All of these activities have come from OGC’s ambitions to make our standards easier to understand and implement, and to feel more tangible than our earlier work. OGC stands behind its position that standards - through making data Findable, Accessible, Interoperable, and Reusable - unlock tremendous value and power cross-collaborations that offer many benefits to society. By making the standards themselves align with the FAIR principles, we are lowering the barriers to their adoption and spreading their value further.
What do you think OGC can do to help make our standards easier to understand and implement? Let OGC and our DevRel, Joana Simoes, know at ogc.org/contact.
If you would like to help pave the way towards new levels of interoperability in areas as diverse as Open Science, New Space, Machine Learning, and Building Energy, consider participating in OGC Testbed-18. The Call For Participation closes March 17th, 2022. Funding is available for participants to recoup a significant portion of their costs.
-
23:36
7 Key Takeaways from the OGC Climate Change Special Session
sur Open Geospatial Consortium (OGC)
Contributed by: Steve Liang, SensorUp
In the first of a two-part series, Steve Liang, founder and CTO of SensorUp, shares highlights from the OGC Member Meeting Climate Special Session in December 2021, and the many challenges and opportunities it presents from a data-sharing perspective. The original article can be found here.
“If you can’t measure it, you can’t manage it.”
The quote is originally from management consultant Peter Drucker and later used by Al Gore to describe the challenge with climate change. It accurately encapsulates the theme of the Climate Change Special Session at the OGC (Open Geospatial Consortium) Climate Member Meeting for 2022.
SensorUp’s CTO Dr. Steve Liang was on the panel of data experts from NOAA, United Nations’ IPCC, NRCan and ECMWF, each of whom spoke about the current state, the challenges and the opportunities of measuring climate change data. Here, we’re highlighting seven key takeaways from the session.
1. We still have a lot of knowledge gaps when it comes to global climate data
Angelica Gutierrez, Lead Scientist for NOAA (the National Oceanic and Atmospheric Administration), talked about the struggles with obtaining accurate and timely data. “Well developed countries have access to sophisticated software, specialized equipment and skills, computing power and other essential elements to address climate change,” said Gutierrez. “Developing countries are at a disadvantage.”
It’s a known problem, and one that OGC members are already working to address. That’s another theme that emerged a number of times during the session — we are becoming more aware of our blind spots and working on solutions to mitigate them. “The 2021 OGC Disaster Pilot (that drew the largest response to an OGC pilot, historically) is addressing many of the challenges, gaps and barriers that I previously identified,” said Gutierrez.
2. The current priority is getting good data to decision-makers
In 2022, OGC is launching another pilot, the Climate Change Services Initiative, which will run from 2022 through 2026. The pilot will connect several global agencies and focus on sharing priority information. “We are rolling out the first focus area this year,” said Nils Hempelmann, an OGC Project Manager and the moderator of the climate session.
“Setting up the appropriate infrastructures to deliver information on demand to the decision makers, that’s what we are going to focus on in the beginning,” said Hempelmann of the new pilot. “And then afterwards, depending on what’s coming up and where the urgent pain points are, we are defining the next focus areas.”
3. We want to be able to more accurately measure and understand specific climate events
In recent years, several severe weather disaster events have wreaked havoc in different parts of the world. Two sets of presenters addressed this issue, using examples of weather events like atmospheric rivers and “Medicanes” (hurricanes originating in the Mediterranean) that we need to do a better job of measuring. “Recently in British Columbia, throughout the month of November, they received three storm events, each one was larger than their monthly precipitation rate,” said Cameron Wilson from Natural Resources Canada.
Wilson’s co-presenter, Simon Riopel, went on to explain the challenge of measuring and predicting an event like an atmospheric river. The challenge lies in getting an accurate measure of vectors of force, which have both a magnitude and a direction.
One of the current initiatives that can be useful in learning how to solve this is the Arctic SDI (Spatial Data Infrastructure) that creates a “digital arctic” with a combination of sensor data and satellite imagery.
4. (Political) decision making is based on trust
In order to give political decision-makers what they need to make informed decisions, they have to be confident in the validity of the information.
“Decision-making is based on trust,” says Dr. Martina Stockhause, Manager of the IPCC (Intergovernmental Panel on Climate Change) Data Distribution Centre. “Political decision-makers are no experts, so they rely on trust in data and the service providers. In my view trust is built on two aspects. One is the quality of the data that is accessed. That means that the quality is documented, together with the peer review process. And the second is that the result is traceable back to its sources (with data citation and credit).”
One of the ways to achieve that is using the FAIR (Findability, Accessibility, Interoperability and Reusability) Digital Objects framework.
5. We continue to find new ways to use machine learning to make better weather predictions
In 2021, the WMO (World Meteorological Organization) launched a competition to use machine learning and AI (artificial intelligence) to improve temperature and precipitation forecasts up to six weeks into the future.
The team currently leading that competition is from CRIM (the Computer Research Institute of Montreal). CRIM’s David Landry explained the team’s process of downloading, preprocessing, subsetting, and reshaping the data, before they ran their AI models and presented data predictions back to the adjudicators.
Incentivizing these research teams to continue to experiment with new models, as the WMO has, will help us continue to expand our awareness of how to accurately measure and predict climate change events.
6. Estimating greenhouse gas emissions is really complex
Greenhouse gases like methane and CO2 remain difficult to measure. They can’t be seen by the human eye or typical cameras, and capturing data about them remains a challenge. To achieve more detailed and timely monitoring of emissions in support of climate mitigation actions, the countries of the world need access to more (and more accurate) information.
“The big issue is that we can’t measure emissions directly, so these emissions need to be estimated,” said Vincent-Henri Peuch from the European Center for Medium-Range Weather Forecasts and lead of the Copernicus Satellite projects. “The problem is that it is really complex.”
Satellite images are able to show the presence of fugitive greenhouse emissions at a macro scale but “the question is, can we use this information about the concentration in the atmosphere to infer some information about the fluxes of emissions at the surface?” notes Peuch. “For that, we need to combine lots of different observations, so of course interoperability is required.”
To help with these crucial measurements, CO2M, the Copernicus Carbon Dioxide Monitoring mission, is one of Europe’s new high-priority satellite missions and will be the first to measure how much carbon dioxide is released into the atmosphere specifically through human activity.
7. Accurately measuring greenhouse emissions requires multiple data sources
Dr. Steve Liang, CTO of SensorUp and Professor at the University of Calgary, spoke about the ways that disparate data sources can be combined to help craft a clearer picture of the severity and source of fugitive emissions. “Even though we know methane leaks are bad, how can we fix them, if we can’t see them?” asked Liang. “We need methane sensors to find the locations and flow rates of the leaks. However, there’s not one sensor that is the best. Multiple types of sensors have to work together, to complement each other. They all have different temporal and spatial-temporal scales, at different levels of accuracy.”
Liang explained that a combination of data from sources like handheld instruments, fixed in-situ sensors, Terrestrial Mobile Methane Mapping Systems, airborne systems and satellite imagery can be used together, in an integrated methane sensor web, to more accurately measure, understand, and even predict harmful leaks and emissions.
If you would like to read a more complete explanation of how this methane sensor web works, you can read Dr. Liang’s blog recap of his presentation.
-
18:19
How it Went! The November 2021 Geospatial API Virtual Code Sprint
sur Open Geospatial Consortium (OGC)
Tags: ogcapi, OGC API, Sprint
From November 15-17, 2021, OGC and ISO/TC 211 jointly hosted the November 2021 Geospatial API Virtual Code Sprint. The code sprint focused on the refinement of the OGC API - Features Standard and its ISO version, ISO 19168.
OGC API - Features offers the capability to serve, create, modify, and query spatial data on the Web. The Standard specifies requirements and recommendations for creating APIs that follow a standard and consistent way of sharing feature data. The Standard is divided into several parts so that a service only has to use those parts relevant to its offerings, keeping it lightweight and easier to develop and maintain.
OGC API - Features - Part 1: Core (the ISO version being ISO 19168-1:2020 Geospatial API for Features) focuses on delivery of feature content. OGC API - Features - Part 2: Coordinate Reference Systems by Reference (ISO/DIS 19168-2) adds support for coordinate reference systems other than the sole CRS specified in Part 1, WGS84.
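As a rough illustration of how lightweight such an API is for clients, the request below queries features from a hypothetical server. The URL and collection id are placeholders; the /collections/{collectionId}/items path, bbox, and limit parameters come from Part 1, and the crs parameter from Part 2.

```python
# Hedged sketch of an OGC API - Features request; the server URL and collection
# id are placeholders, while the query parameters follow Parts 1 and 2.
import requests

server = "https://example.org/ogcapi"   # hypothetical endpoint
collection = "buildings"                # hypothetical collection id

resp = requests.get(
    f"{server}/collections/{collection}/items",
    params={
        "bbox": "7.0,50.6,7.2,50.8",    # spatial filter (WGS84 by default)
        "limit": 100,
        # Part 2: request the response geometries in another CRS by reference
        "crs": "http://www.opengis.net/def/crs/EPSG/0/25832",
    },
    headers={"Accept": "application/geo+json"},
)
resp.raise_for_status()
print(len(resp.json()["features"]), "features returned")
```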
An OGC Code Sprint is a collaborative and inclusive event driven by innovative and rapid programming with minimal process and organizational constraints to support the development of new applications and candidate standards.
Over the past three years we have been refining the process for organising and hosting OGC code sprints. For this November 2021 Geospatial API Virtual Code Sprint, a new approach was trialled: using Discord to provide video, voice, and chat facilities. Also a first for the code sprints, we ran a Mentor Stream in parallel with Breakout rooms for expert developers. The Mentor Streams were designed to help developers get started with OGC API Features and ISO 19168-1:2020 standards.
Day 1 kicked off with Welcome remarks from Dr Joana Simoes (OGC DevRel) and Peter Parslow (ISO/TC 211 Chair-elect). After the welcome remarks, Clemens Portele (interactive instruments) and Panagiotis “Peter” Vretanos (CubeWerx) presented the Goals of the sprint. On Day 1 we also had discussions on queryables and geometry simplification, and Mentor Stream sessions on Sharing data through OGC API - Features led by Dr Joana Simoes (OGC), as well as another session on Introduction to SpatioTemporal Asset Catalogs (STAC) and its use of OGC API features led by Chris Holmes (Planet), Rob Emanuele (Microsoft) and Matthew Hanson (Element 84). In between the discussions and the mentor streams, there was plenty of coding.
On Day 2, we had Mentor Stream sessions on How to Load feature data into your frontend application led by Antonio Cerciello (EarthPulse) and Testing implementations of OGC API - Features for Compliance to the Standard led by Dr Gobe Hobona (OGC). There were preliminary demos of geometry simplification through OGC API - Features. Similarly, in between the discussions and the mentor streams there was plenty more coding.
On Day 3, there was further coding, as well as a Features and Geometry JSON Lightning Talk led by Clemens Portele (interactive instruments) and Peter Vretanos (CubeWerx), as well as a final demonstration session. Check out the screenshots from the final demo at the end of this article.
Lessons Learnt
- There is a need to offer JSON-FG fallback geometry to support different situations, i.e. when it should be there and when it should not.
- For geometry simplification, the sprint participants started with zoom-level, scale-denominator, and a number of other parameters; by the end of the code sprint there was agreement to use zoom-level.
- The sprint participants wanted to support situations in which, based on the zoom level, the server could return some features and not all of them.
- A use case for clipping was also demonstrated. For example, if you are looking at New York, you should not need to get the whole of the US coastline.
- The sprint participants also made progress on how to handle JSON Schemas.
- The sprint participants will file an issue in the JSON-FG repo to look into an extension for marking segments introduced by a clipbox (artificial segments). MapML has already added this capability. The alternative is to always require an extra border. A related open question is whether a clipbox should be allowed to extend beyond the data - for example, whether an actual geometry in a shapefile can go beyond the -180 to 180 degree boundaries.
- The code sprint has been good for both JSON-FG and OGC API – Features.
- JSON-FG could be considered for a conformance class for OGC API – Features only after JSON-FG has been adopted as an official OGC Standard.
- STAC has a number of deployment patterns. One of the patterns exposes an OGC API – Features interface, used for search.
- The idea is to have an alignment between STAC and OGC API – Features. This alignment will benefit OGC API – Records too.
- Some of the questions are how do we document/describe metadata for the resources offered by OGC API - Features, ISO 19168-1 and their related candidate standards such as STAC and OGC API - Records.
- STAC will be a profile of OGC API Records. The STAC community is working on a definition of a Dataset Record for STAC that would be aligned with the Record concept from OGC API Records.
- The November 2021 Geospatial API Virtual Code Sprint also demonstrated the Compatibility Mode. Example scenario: If you have a 3D building then you could use JSON-FG, but if you wanted to show a simpler geometry then the server would provide GeoJSON.
The participants made the following recommendations for future work.
Innovation Program
- About delivering MUDDI data using OGC API – Features and JSON-FG
- Development of draft specifications for new capabilities being considered for future versions.
- Implementations of the new capabilities being considered for future extensions: Common Query Language (CQL), CRUD (Create Replace Update Delete), property selection, OpenAPI 3.1, conditional requests, web caching.
- Security for OGC API Standards Pilot (this could involve the different levels of security e.g. DCS, OpenAPI). This could be a good combination with the CRUD extension.
- Further code generation tasks in future code sprints.
Standards Program
- Completing CQL
- Further alignment between STAC and OGC API - Records
- There’s an ongoing vote in ISO for Part 2, so there may be an opportunity to hold an event in the Standards Program once ISO 19168-2 has been approved.
The code sprint successfully met its objectives. The sprint participants were able to discuss and prototype new capabilities. The sprint participants also found that the tutorials and Lightning Talk provided in the Mentor Stream were helpful.
Regarding the new approach for OGC Code Sprints, the sprint participants offered the following recommendations:
- Record the tutorials, so that if a participant misses one they can catch up later
- Arrange a Beginner-to-Expert Mentor Stream that takes a developer all the way through from Getting Started to more Advanced topics. This would require a 3-day programme.
- The Discord idea was really cool!
- In the future we could use the other text channels. Perhaps the first message should explain that “we are going to use this channel in a particular way…”
To learn more about the Sprint, visit the November 2021 Geospatial API Code Sprint GitHub repository.
Screenshots from the Demonstrations:
- Ecere GNOSIS demonstration screenshots
- interactive instruments ldproxy demonstration screenshots
- CubeWerx cubeserv demonstration screenshots
-
12:00
Towards a Cloud-Native Geospatial standards baseline
sur Open Geospatial Consortium (OGC)
Tags: Visiting Fellow, cloud, cloud-native geospatial, STAC, COG, ogcapi, OGC API
Contributed by: Chris Holmes, OGC Visiting Fellow
In my previous post I laid out the vision for Cloud-Native Geospatial, but with this post, I want to get into the details of what is needed. I’ll lay out the key areas where foundational standards are needed, and then survey the current status of each area. They range from quite well-established to quite speculative, but all are eminently achievable. And then I’ll dive deep into the area I ended up focusing on the most in these last few months as an OGC Visiting Fellow.
Components Needed
There are a few key components needed to represent diverse location information on the cloud. These sit ‘below’ an API - they are simply resources and formats. Together these components provide a solid foundation to represent most any geospatial information on the cloud. They should be compatible with APIs; they may serve to be responses to requests, as JSON resources or streaming formats. But it should also be completely possible to simply store these on a cloud storage object store (S3, GCP, etc). Those in turn will often be read by more capable APIs to do cool operations, but they don’t need to.
The core that I see is:
- Core Raster format: A solid cloud-native format to handle satellite imagery, DEMs, data products derived primarily from satellite imagery, etc.
- Multi-dimensional Raster format: A cloud format able to handle massive data cubes, like the results of weather forecasts, temperature over time and elevation, climate modeling, etc. This is the traditional space of NetCDF / HDF.
- Core vector formats: A vector data equivalent to Cloud Optimized GeoTIFF would be ideal, but the diverse requirements of fast display and on-the-fly deep analysis may not be easily combinable, so we may end up with more than one format here.
- Point cloud format: A cloud format that works like COG, but enables streaming display and on-the-fly analysis of point clouds.
- Collection & Dataset Metadata: The title, description, license, spatial and temporal bounds, keywords, etc. that enable search. For the cloud-native geospatial baseline, this should focus on being ‘crawlable’, and link to actual formats. It should support diverse data types? - ?vector data, raster data, point clouds, multi-dimensional data cubes, geo-located video, 3d portrayals, etc. - and should be flexible enough to work with any data. It should be fundamentally geospatial-focused, and not try to generically describe any data.
- Granule / Scene level / ‘asset’ Metadata: A flexible metadata object with common fields for describing particular data capture domains and linking to the actual data files.
Most of these have at least the start of an answer in our worldwide geospatial community, if not a robust solution:
- Core Raster format: Today this is Cloud Optimized GeoTIFF (COG) - see the range-read sketch after this list. It is in the process of becoming an official OGC standard and has already seen incredible adoption in a wide variety of places. It’s really the foundational cloud-native geo format that has proven what is possible. It is worth noting that it may not be the end-all for cloud raster formats, as one could see a more optimized image format that is smaller and faster. But it would likely be some more general image format that our community adds ‘geo’ to like we did with TIFF. COGs will rule for a while, since the backward compatibility with legacy tools is hard to beat while we’re still early in the transition to cloud-first geospatial infrastructure.
- Multi-dimensional Raster format: There’s also already a great answer here with zarr. It is in the process of being adopted as an OGC Community Standard, with the adoption vote starting soon. It’s also being embraced by NetCDF, and has seen significant uptake in the climate community.
- Core vector formats: There is as of yet no great answer here. I’ll discuss the landscape and various possibilities in a future blog post.
- Point cloud format: Howard Butler’s new COPC format is a ‘Range-readable, compressed, organized LASzip specification’ that hits all the same notes as Cloud-Optimized GeoTIFF and likely will see rapid adoption.
- Collection & Dataset Metadata: has a solid core with the OGC API? - ?Features ‘Collection’ construct. The STAC Collection then extends that, and the OGC API? - ?Record provides a GeoJSON equivalent that can be used as a return in search queries. But these parts haven’t quite all connected in a coherent way, and the full ‘static’ (just upload to S3) usage hadn’t been fully fleshed out. This was the main focus of my work the last few months, so I’ll dive in deeper below.
- Granule / Scene level / ‘asset’ Metadata: is where the SpatioTemporal Asset Catalog (STAC) specification that’s been my main focus the last few years has played, and it’s seeing really great adoption after recently reaching version 1.0.0.
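The range-read sketch referenced in the Core Raster format item above: a minimal example, assuming the rasterio library and a hypothetical COG URL, of how a windowed read over HTTP fetches only the internal tiles it needs rather than the whole file.

```python
# Why COG is cloud-native: a windowed read over HTTP only pulls the byte
# ranges (internal tiles / overviews) that intersect the requested window.
# The URL is a placeholder; rasterio/GDAL issue the HTTP range requests.
import rasterio
from rasterio.windows import from_bounds

url = "https://example-bucket.s3.amazonaws.com/scene.tif"  # hypothetical COG

with rasterio.open(url) as src:
    # Bounds expressed in the dataset's own CRS (values here are illustrative).
    window = from_bounds(399960, 4090200, 409960, 4100200, transform=src.transform)
    data = src.read(1, window=window)  # only overlapping tiles are downloaded
    print(data.shape, src.crs)
```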
For me, the jury is still out if a web tiles specification really belongs in a true cloud-native geospatial baseline. I believe for raster tiles (png, jpeg, etc) they don’t make sense, as a Cloud-Optimized GeoTIFF can easily be transformed on the fly into web tiles, using serverless tilers like Titiler. So the pattern is to use a good cloud-native format that enables on-the-fly rendering and processing in the form clients need. Tiles are essential for browser-based clients, but other tools are better served accessing the data directly. Once the OGC API - Tiles standard is finalized it will likely make good sense to create a ‘Tile metadata building block’ that can serve as a cloud-native format to point clients at tiles.
For vector tiles I would consider both MVTs and PBFs as cloud-native geospatial formats, in that they can sit at rest on a cloud storage bucket and be used by various applications. But I do think there is potential for a good cloud-native vector format to work like COGs, with a serverless tile server that can render vector tiles on the fly. I’ll explore this idea more deeply in a future post on vector formats.
How do OGC APIs fit in?
The OGC API initiative is a reinvention of the OGC W*S baseline into more modern JSON/REST APIs. Generally, it sits one level ‘above’ the cloud-native geospatial constructs discussed here, defining API interfaces for services that would use the cloud-native formats (but could also use more traditional formats and spatial databases). They enable a lot more, like dynamic search or on-the-fly processing of data, but also require more. Most of the cloud-native metadata constructs have been extracted from the APIs, so the cloud-native variants should be compatible with the OGC APIs, just a lot less capable (though also far easier to implement).
An ideal ecosystem would see most data stored in cloud-native geospatial formats, and then a wide array of services on top of those, with most of them implementing OGC API interfaces. In the future, it will hopefully be trivial to install a server or even a serverless function that provides the richer OGC API querying on top of the cloud-native metadata and formats.
Towards Cloud-Native Geospatial Collection Metadata
As mentioned above, a majority of my OGC Visiting Fellow time the last few months has gone into sorting out a ‘cloud-native geospatial collection’. There are a few different aspects to this.
A static OGC Collection
One of the most powerful constructs that has emerged in STAC’s evolution is the ‘static STAC’. See the Static Spatiotemporal Asset Catalogs in Depth post for a great summary of what they are and how they work. To quote the ‘best practices’ of the 1.0.0 version of the spec:
A static catalog is an implementation of the STAC specification that does not respond dynamically to requests. It is simply a set of files on a web server that link to one another in a way that can be crawled, often stored in a cloud storage service like Amazon S3, Azure Storage and Google Cloud Storage…A static catalog can only really be crawled by search engines and active catalogs; it can not respond to queries. But it is incredibly reliable, as there are no moving parts, no clusters or databases to maintain.
It has proven to be a very popular way to publish STAC data, making good on the vision in my previous blog post of being able to upload data to the cloud and have it ‘just work’.
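As a small illustration of how little machinery a static STAC catalog needs, here is a sketch that writes one out as plain JSON files, assuming the pystac library; the asset URL is a placeholder. The resulting directory can simply be copied to S3, Azure Storage, or Google Cloud Storage and crawled.

```python
# Minimal sketch of a "static STAC": plain JSON files, no server required.
# Assumes the pystac library; the asset href is a hypothetical COG URL.
from datetime import datetime, timezone
import pystac

catalog = pystac.Catalog(id="demo-catalog",
                         description="A tiny static STAC catalog")

item = pystac.Item(
    id="scene-001",
    geometry={"type": "Polygon",
              "coordinates": [[[5.0, 50.0], [6.0, 50.0], [6.0, 51.0],
                               [5.0, 51.0], [5.0, 50.0]]]},
    bbox=[5.0, 50.0, 6.0, 51.0],
    datetime=datetime(2021, 6, 15, tzinfo=timezone.utc),
    properties={},
)
item.add_asset("visual", pystac.Asset(
    href="https://example-bucket.s3.amazonaws.com/scene-001.tif",
    media_type=pystac.MediaType.COG,
))
catalog.add_item(item)

# Write catalog.json plus the item JSON with relative links, ready to upload
# to any object store and be crawled.
catalog.normalize_hrefs("./static-catalog")
catalog.save(catalog_type=pystac.CatalogType.SELF_CONTAINED)
```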
But while STAC pioneered clear static options for both individual items of imagery and other spatiotemporal assets, as well as collections of that type of data, an equivalent ‘static collection’ for vector data had been missing. The OGC API - Features Collection (that STAC extends for its Collection) as specified is only part of an API response, not an independent JSON resource that can be used on its own. But it was a well-designed modular part, and Clemens Portele and Peter Vretanos, the editors of the Features specification, were always supportive of pulling it out.
The OGC Collection building block
I made a rough attempt in an experimental github repository. But then Clemens ran with another idea we had been kicking around of making true small granular building blocks from the OGC API baseline (I’ll try to make a full post on that in the future). This resulted in a very clean ‘Collection’, extracted from OGC API - Features, but written as an independent JSON resource that could be re-used in any context. And thus we have a real ‘static OGC Collection’, capable of living statically on cloud storage. This can point at a GeoJSON, GeoPackage, Shapefile or any new more cloud-native format. You can see an example of this in a repository I made to experiment with examples of static collections. That one has multiple representations of the same data as different formats, but it could easily just have one.
Records + STAC alignment
Another large chunk of my time went into work that isn’t a direct cloud-native task: fully aligning STAC with OGC API - Records. Many have been unsure of the exact relationship between the two specs, though I always had clear (but not well communicated) thoughts. So the last few months have enabled the time to fully sync with the core Records team and get agreement on the path forward. The quick version is that Records API has a real role to play in STAC, as we’ve been holding off on ‘collection-level search’, since we wanted to fully align that with Records. But the confusing part is that a Records API can also be used to search STAC-like items, and indeed is designed for search of almost anything.
So the ‘aha’ moment was realizing that the Records spec authors have always had in mind a ‘Data Record’, which is really what STAC needs, which is a bit more specific than a totally general, flexible Record. STAC’s Collection construct is really only focused on what OGC considers ‘datasets’, it’s just that there hasn’t been a clear specification of that construct in the emerging OGC API baseline. I’ve started a pull request in the Records repo to add it and then will also propose a ‘Data Collection’ which extends the core OGC Collection with additional fields. A STAC Collection in turn should hopefully align with that Dataset Collection construct. And in the future we’ll work together to have a ‘STAC Record’ that fully aligns a STAC Item with the more general record requirements.
The other cool effect of this sync has been a really nice refactoring of the core Records API specification by Peter Vretanos. The vision has always been that Records API is a Features API, but with some additional functionality (ie sort and richer querying) and a more controlled data model. The new version makes that much more clear, emphasizing the parts that are different from the core OGC APIs, and it should be much easier to align with STAC.
Static Records
This work all nicely laid the foundation for the next cloud-native geospatial component, a ‘crawlable catalog’ which consists of web-accessible static records! Peter put up an example crawlable catalog (and I have a PR that expands the example with a ‘static OGC collection’ with vector data), which needs a bit more work to align with STAC, but fits with all our cloud-native geospatial principles. So we’ve now got pretty much all the pieces needed for all the right metadata we need for a cloud-native geospatial baseline. Records and Collections are basically two alternate instantiations of the same core data models, one is GeoJSON, making it easy to visualize lots of them together, and the other matches the core OGC API Collection construct that is used extensively. In the short term, the best practice will likely be to make use of both of them, but in time there will likely be tools that easily translate from one to the other, especially if we get the core data models to be completely compatible.
Bringing it all together
So we are tantalizingly close to the full suite of static metadata needed to handle most any cloud-native geospatial data. The main task ahead is to fully align the work in OGC API - Records and OGC API - Features to be compatible with STAC, and to better describe all the necessary metadata fields. There are a few ways things are a bit different right now, so it’d be nice to simplify things between the two approaches a bit.
To help show how everything could work together I’ve put up a ‘static ogc examples’ repository to demonstrate how you could have a number of diverse datasets and formats all available from a completely static structure. I’ll keep expanding it and evolving the examples, and flesh out the readmes to show what is happening. And in the future I’ll try to do a blog post going deep into the details.
Future posts will go deeper into more of the state of actual cloud-native geospatial formats. Vector data is where I’ve spent most of my time lately, as there is not a super clear answer. I hope to also spend more time highlighting zarr and copc, as those are two really great efforts that fit in well and really round out a complete ecosystem of formats.
-
14:50
Hexagon on the value of collaboration, Standards, and OGC Membership
sur Open Geospatial Consortium (OGC)
Tags: ogcapi, OGC API, Hexagon, impact, Principal Member
Hexagon’s long-time support of OGC and our Standards, including our family of OGC APIs, has enabled the Company to learn from, collaborate with, and support the broader geospatial community, while also improving their product offering and being one of the first to market with support for the latest generation of geospatial standards.
“Hexagon has had a storied history in OGC, beginning with Intergraph, the first commercial member. As other members like Leica Geosystems, ERDAS, Ionic, and Luciad have come together into what is now the Hexagon membership, we continue to see a great benefit to our involvement in OGC,” said Stan Tillman, Executive Manager, Hexagon’s Safety, Infrastructure & Geospatial Division. “In particular, we are Principal members in order to provide our insight to innovation and help OGC remain relevant in the geospatial world. But our membership in general allows us to learn from others in a truly collaborative environment involving development and management.
“The work with the OGC API - Processes group is a prime example: as co-chair of the group, Hexagon has helped drive the new RESTful APIs knowing this is the direction of the developer community. However, our involvement in this group has helped us learn from others in the group, involve our developer community sooner in the process, and help in planning the next phase. This give-and-take environment provides a safe place to collaborate, which is often missing in the external communities.”
As a fruit borne of this involvement, Hexagon recently launched a new suite of products - the Power Portfolio 2022 - which is one of the first offerings on the market to support OGC API Standards. The suite includes Geoprocessing Server 2022 (as part of the ERDAS APOLLO product), which exposes its APIs using an early version of the OGC API - Processes Standard. ERDAS APOLLO also includes a web-based map client, called Catalog Explorer. Catalog Explorer already supported OGC standards like WMS, WFS, WMTS, and OGC 3DTiles, but new to this release is an OGC API - Processes dynamic client interface and support for OGC API - Features.
“Hexagon is excited to see the interest in its new Geoprocessing Server from both our customers and partners. The new Geoprocessing Server empowers many more end users at the organization to create value-added data products leveraging Spatial Models or processes from other processing engines. The aim is to leverage those experts but enable any user to execute them with nothing more than a web browser and data sourced from the catalog. Not only does this increase accessibility, but it will also, in many cases, mean the outputs are created faster by utilizing more powerful server hardware, deployed closer to the data sources.
“It has been said before, but data is one of the few assets an organization has that becomes more valuable the more it is used. This is why we see geoprocessing as an important tool: it gives value to your data."
Hexagon's Spatial Modeler provides a visual tool for creating geospatial processing models that can be executed in Geoprocessing Server by anyone in the organization
“For several years, Hexagon has maintained a visual tool for building geoprocesses, so geoprocessing is certainly not foreign to the Hexagon community,” said Stan. “This capability has been exposed through internal interfaces and even OGC Web Processing Service (WPS) to a limited extent. Over the last year, we have been developing a standalone, highly scalable service to be used in executing these processes, but we were not thrilled to expose this service through the xml-based WPS. The stars started to align when OGC changed its focus to a more RESTful based approach with standards defined with REST interfaces and using geojson. We felt this fit much better in our roadmap as it pertained to geoprocessing.”
OGC API - Processes is just one Standard from the new family of OGC API Standards being developed by the OGC Community. OGC API Standards define modular APIs that spatially enable Web APIs in a consistent way - making them “building blocks for location.” OGC API Standards make use of the popular OpenAPI specification, so are easy to implement and access.
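For a sense of what a RESTful, OpenAPI-described process execution looks like in practice, the sketch below calls a hypothetical deployment via OGC API - Processes Part 1: Core. The server URL, process id, and input names are assumptions; the /processes and /processes/{processId}/execution paths and the Prefer: respond-async header follow the Standard.

```python
# Hedged sketch of discovering and executing a process with OGC API - Processes.
# Server URL, process id, and input names are placeholders for a real deployment.
import requests

server = "https://example.org/ogcapi"   # hypothetical endpoint
process_id = "ndvi-model"               # hypothetical process

# List available processes and their descriptions.
processes = requests.get(f"{server}/processes").json()
print([p["id"] for p in processes.get("processes", [])])

# Execute: inputs are passed as a single JSON document.
resp = requests.post(
    f"{server}/processes/{process_id}/execution",
    json={"inputs": {"source": f"{server}/collections/imagery"}},
    headers={"Prefer": "respond-async"},  # request asynchronous execution
)
resp.raise_for_status()
job_status_url = resp.headers.get("Location")  # poll this URL until the job completes
print(resp.status_code, job_status_url)
```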
“Regarding the benefits of developing interfaces based on OGC API - Processes, we see positive gains on both the backend and frontend,” continued Stan. “First, the development of the APIs has been very easy to pick up by developers that may not have had a lot of exposure to OGC in the past. Lots of tools are available to help with auto-generated code and the use of OpenAPI Specification 3.0 has been a valuable way to provide an abstracted access to our service.
“Secondly, and maybe even more important, is the benefit of easy integration. We were able to build the Geoprocessing Server as a standalone component so other groups within Hexagon could take advantage of its use. Exposing our interface using OGC API - Processes helps us to share within our own division, but we have also found it makes it easier to convince other divisions to implement based on an international standard rather than a home grown approach.”
Hexagon also recently participated in the July 2021 OGC API Virtual Code Sprint (Engineering Report here). To ensure that all of the new OGC API standards are as developer-friendly, usable, and mature as possible before release, each draft specification is being put through one or more code sprints to test and improve their ‘readiness’ before starting the OGC standards approval process. At the Sprint, Hexagon’s Steven McDaniel demo’ed the integration of OGC API - Processes into Geoprocessing Server and Catalog Explorer and how they enable anyone in an organization to easily run geospatial processing models built in the visual workflow builder, Spatial Modeler.
Hexagon's Geoprocessing Server uses OGC API - Processes to enable anyone in an organization to create value-added data products leveraging Spatial Models or processes from other processing engines
“By participating in the Sprint, we were able to quickly get answers to questions pertaining to the specification and test/compare our implementation real-time with other implementations,” said Stan. “One-on-one discussions with the specification creators and other implementers helped us better understand the specification. Hopefully, our input helped to smooth out the rough points in the specification and its documentation. Our participation also led to a few new capabilities needed in the specification that Hexagon felt were minimum requirements within our product's implementation.”
With such a long history at OGC, it’s great to see that Hexagon still gains so much from, and contributes so much to, membership in the OGC Community. From their collaboration with geospatial experts, to providing and gaining insight into early technology trends and standards development, OGC is proud to count Hexagon among its Principal Members.
To learn more about how the family of OGC API Standards work together to provide modular “building blocks for location” that address both simple and the most complex use-cases, visit ogcapi.org.
To learn more about Hexagon’s Power Portfolio 2022, including its support of OGC APIs, visit hexagongeospatial.com.
To learn more about the benefits of OGC Membership, visit ogc.org/ogc/membership-value.
-
12:48
Towards a Cloud-Native OGC
sur Open Geospatial Consortium (OGC)
Tags: Visiting Fellow, cloud, cloud-native geospatial, STAC, COG, ogcapi, OGC API
Contributed by: Chris Holmes, OGC Visiting Fellow
This is Part 1 of a two-part post. See Part 2 of this blog post, 'Towards a Cloud-Native Geospatial standards baseline', here.
About six months ago I started as the first ‘Visiting Fellow’ of the Open Geospatial Consortium. It’s been a true pleasure to explore various aspects of OGC more deeply, working with staff and members. The time has flown by, and so I wanted to share my progress and some thoughts on what comes next.
The open-ended scope of the fellowship was amazing, but I realized that I’d quickly have to focus if I was to actually make an impact while working a half day a week for six months. The theme that emerged is what I call ‘Cloud-Native OGC’: exploring the fundamental components that enable geospatial standards on the cloud, at a level ‘below’ APIs.
This is an evolution of the ideas I presented four years ago in a blog series called ‘Cloud-Native Geospatial’, which opened with the question ‘What would the geospatial world look like if we built everything from the ground up on the cloud?’. I’ve spent much of my time since then focused on two core parts of that transformation - Cloud Optimized GeoTIFFs and SpatioTemporal Asset Catalogs. We’ve seen some incredible early adoption of both of those formats, but it’s been mostly centered on multi-spectral satellite imagery, which is only a small corner of the overall geospatial world. So my time as an OGC Visiting Fellow has been spent on a riff on that original question: ‘What would geospatial standards look like if they were built for the cloud?’ I was able to take the time to look at the entire geospatial landscape, not just imagery, and the potential for OGC to play the key leadership role in making the Cloud-Native Geospatial vision a reality.
The Cloud Native Geospatial Vision
In digging in more, I found that OGC’s existing standards work could easily evolve to align the industry on Cloud-Native Geospatial architectures. There is no organization better situated to make it a reality than OGC: it is already trusted by every government as the steward of geospatial standards and has the largest community of geospatial experts working together, across commercial, non-profit, government, and academia.
Before I go deep into details of the standards necessary to support this, it’s worth a full articulation of the future state enabled by this vision.
The OGC mission is to ‘make location information Findable, Accessible, Interoperable, and Reusable (FAIR)’. Cloud-Native Geospatial shares the exact same goal, but leverages the cloud to radically simplify the effort needed to make geospatial data FAIR. Instead of forcing data providers to stand up, maintain, and scale their own APIs, the requirement should be as simple as using the right cloud-native geospatial format and metadata, and uploading it to any cloud. All the APIs and scalability come from the cloud itself, enabling geospatial to ride the continuous waves of innovation in the broader IT world instead of continually playing ‘catch up’.
A core aim of cloud-native geospatial is to decrease the burden on data providers, and in turn enable far more geospatial data to be FAIR. The only cost that providers should need to pay is for the cloud storage, which currently is between US$1 and US$5 a month for 100 gigabytes of data. If that core data is hosted on the cloud, then general cloud-native technologies enable the cost equation to be flipped on its head, as the users of the data pay for any computation they do, and with ‘requestor pays’ the users even pay for the egress costs.
Once the data is in the right cloud-native geospatial formats then it’s easy for anyone to stand up a traditional geospatial server, ideally one making the data available as OGC APIs. But the data itself becomes FAIR, even if it’s not in an advanced API, as the cloud plus key standards provides all that is needed to provide the data.
But things get really exciting when thinking about a whole new class of cloud-native geospatial tools that can layer on top of the core FAIR data, sitting alongside traditional geospatial services. Google Earth Engine has been operating in this future for years, enabling global scale computations that run across tens of thousands of compute nodes simultaneously to deliver answers in seconds. They have done an incredible job of curating a vast amount of data, but GEE has traditionally been a walled garden where only data ingested into GEE could tap into its capabilities. In the Cloud-Native Geospatial vision, any data on the cloud could be used by GEE (and indeed they have started to embrace the CNG vision with COG registration).
More importantly, any new cloud-scale compute tool like GEE wouldn’t need to build up its own data catalog as it could just access the same cloud-native geo formats that other tools use. Having a suite of cloud-native geospatial tools with cheap data hosting then opens up the potential for a much longer tail of geospatial data to be FAIR, as smaller organizations who have valuable information but not the wherewithal to run servers will embrace cloud-native geospatial, as putting their data on the cloud will enable many awesome tools and analysis. Access to all the world’s information in one place combined with infinite scale computation, in turn, should usher in a whole new wave of innovative tools that move beyond traditional geospatial analysis to finding broader patterns. Then the line between geospatial and non-geospatial information will blur once it is cloud native, greatly magnifying the potential impact of geospatial insight - but that’s worth a blog post of its own.
Getting to a critical mass of data that is actually usable by advanced tools of the cloud then opens up the possibility of real ‘geospatial search’. The main paradigm today is that you must know or find a particular geospatial server and then you can perform geospatial searches to find the information you need. There is no ‘google for location information’ because there is no standard format to ‘crawl’ like there is html for the web. Simple metadata and data formats that live on the cloud provide the core ‘crawlability’, particularly when they have an html equivalent that is crawlable by traditional web search engines, as described by the Spatial Data on the Web Best Practices.
The key thing cloud-native geospatial enables for search is access to the actual data - you can stream it directly into diverse tools that provide real value. Previous attempts at geospatial search engines would at best show a preview image, and often only a text description, and often the actual data wouldn’t even be available for direct download: it was just a search of the metadata. With cloud-native geospatial, the search tool can stream full resolution data directly in the browser, or link to more powerful tools that enable cloud-based analysis of the search results. The cloud-native geospatial vision focuses first on getting a critical mass of data to the cloud, but once there are sufficiently valuable masses of information it opens the possibility of a whole new class of technologies and companies focused on more innovative geospatial search tools.
Towards a Cloud-Native Geospatial Standards Baseline
So how do we actually make progress towards this vision? The core standards are much closer than one might expect. But it must be emphasized that achieving this vision will take far more work from the entire geospatial industry than just releasing some standards. We need a sustained effort to bring every piece of location information to the cloud in standard formats, to update every tool to be able to work with it, and to build together a whole new class of next-generation tools that show the power of having petabytes of information about the world in one place.
This means a very solid foundation to build on, enabling layers and layers of innovation on top. But this standard baseline must also be adaptable to the overall technology landscape, to be able to ride larger tech trends (like the shift from XML to JSON and to whatever will come next). The key to this is to build small pieces that are loosely coupled, with few moving parts, and really focusing on the truly geospatial components. This approach is one of radical simplicity, getting the core atomic units right to enable unimagined innovation on top.
In Part 2, I’ll dive deep into a practical plan to get to a minimum viable standards baseline for cloud-native geospatial, leveraging all the great work the OGC and broader geospatial community has done. My work under the fellowship over the last few months suggests that we are close to that baseline, and if we work together to realize it and build out the interoperable ecosystem of tools around it, then the power of geospatial information will be available to all. Those of us who work in the field know that power, and if we can build simple cloud-native interoperability that makes it easy for anyone to access our data, the impact on the world will be immeasurable.
Interested in this vision? Be sure to read Part 2 of this blog post: ‘Towards a Cloud-Native Geospatial Standards Baseline’.
-
16:31
INSPIRE and OGC APIs - Part 1: Modernising INSPIRE
sur Open Geospatial Consortium (OGC)
Tags: INSPIRE, ogcapi, OGC API
Within the broader context of the European Strategy for Data, the Joint Research Centre (the European Commission’s science and knowledge service) is collaborating with the EU Member States on modernising the technological stack for INSPIRE. As part of this modernisation, there is now an effort to use data standards “as is” rather than developing INSPIRE-specific extensions to them, thus facilitating the use of “off the shelf” software for the delivery of INSPIRE data. This desire for modern, simple, understandable standards has resulted in OGC API - Features this year joining OGC’s SensorThings API as another INSPIRE Good Practice - and other OGC APIs are likely to follow.
INSPIRE is a European Union initiative that came into force in 2007 to establish an infrastructure for spatial information in Europe, aimed at making spatial information more accessible and interoperable for a wide range of purposes supporting sustainable development.
“2021 has been really important for INSPIRE, as the community is currently in the process of evaluating the Directive, thus ensuring that it would remain fit for purpose within the new technological and policy context in Europe,” said Alexander Kotsev, Team Leader, Joint Research Centre, European Commission. “From a technical and organisational perspective, we are actively working on an updated technical framework and further simplifying the technical requirements (see, for example, the open access journal article From Spatial Data Infrastructures to Data Spaces—A Technological Perspective on the Evolution of European SDIs).”
Simpler standards
Obviously, the Internet and its associated technologies have changed substantially since 2007 - a time when “the cloud” was only just being understood to mean something more than impending rain, and it was still considered strange that Apple was trying to compete with the likes of Nokia in the phone market. Indeed, the accumulation of technological changes over this time has also affected how data standards are defined, developed, implemented, and used.
“If we look into how standards and their implementation are done now, compared to 14 years ago when the INSPIRE Directive entered into force, we would immediately notice several subtle differences,” said Alexander Kotsev.
“First, in the past, standards were excessively complex and tried to capture all possible use-cases - including even the most specific niche ones. This led to a substantial overhead and made standards difficult to implement and utilise. INSPIRE, fully reliant on such complex standards, inherited and further extended the requirements, assuming that hardcoding requirements in legislation was enough for clients, servers, providers, and users to follow.
“Second, a linear process was followed with a very long implementation cycle (14 years in the case of INSPIRE). Even if successful in many aspects - like establishing a community and substantially increasing the volume of available public sector geospatial data - INSPIRE fell short in certain aspects because of the high complexity and sometimes limited support by tools.
“Now, having learned from those experiences, we want to ensure that the tools are able to handle the technical solutions well, and that stakeholders can easily access the data without having to go through hundreds of pages of specific technical documentation.
“I use the opportunity to thank the OGC for facilitating this process through the many hackathons, which successfully conceptualised these web-friendly standards,” said Alexander Kotsev.
Indeed, JRC recently published “A vision for the technological evolution of Europe’s Spatial Data Infrastructures for 2030” in the form of the report INSPIRE - A Public Sector Contribution to the European Green Deal Data Space, which mentions the benefits of lightweight, agile standards - including OGC APIs - in section 6.4, ‘Agile standards.’
Bringing Benefits
The shift to simpler, modular, web-based standards brings with it many benefits to EU Member States, data providers, data users, and software developers alike.
“The benefits are manifold,” said Alexander Kotsev. “The good practices that we have endorsed provide a pragmatic approach for ensuring the public sector contribution to the setting up of the European Green Deal data space, as it is defined in the ambitious European Strategy for Data. Within that context, through the OGC APIs, INSPIRE data will easily be reusable together with other data such as those generated by citizens and private companies.
“First and foremost, users can now easily consume the data without having to read hundreds of pages of technical documentation, but instead are empowered to immediately start interacting with the data and build working prototypes in an agile manner.
“Second, data providers, which in our case are public authorities, can use the opportunity to modernise their technological stack and do a better job of serving their stakeholders. The API4INSPIRE study assessed the pros and cons, costs, and benefits for the Member States’ public authorities that provide INSPIRE data of making use of OGC APIs, and gave recommendations on how to start using them.
“Third, ensuring that the OGC APIs are recognised as INSPIRE download services would open the market and act as a catalyst for the many open source geospatial projects and software vendors.”
Into the Future
The endorsement of more OGC Standards as INSPIRE Good Practice is likely to continue as more OGC API Standards are finalised, and more community members propose their - and other OGC Standards’ - endorsement.
“The endorsement of new standards in INSPIRE is entirely based on the demand of the community, which is empowered to submit proposals for good practices based on the specific needs of the different stakeholders,” said Alexander Kotsev. “In addition to OGC API - Features, the OGC SensorThings API standard is already endorsed as an INSPIRE good practice, which gives us a powerful opportunity to share not only feature data, but also spatio-temporal observation data. I am quite confident that other standards will follow soon, such as OGC API - Records. Similarly, regarding data encoding, together with the community, we are working very actively on different data encodings such as GeoJSON and, more recently, GeoPackage.”
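To give a flavour of what consuming such spatio-temporal observation data looks like, here is a minimal sketch of querying an OGC SensorThings API endpoint for recent observations. The base URL is a hypothetical placeholder, while the entity names (Datastreams, Observations) and the $filter/$expand/$top query options come from the SensorThings standard.

```python
# A minimal sketch of pulling spatio-temporal observations from an
# OGC SensorThings API service. The base URL is hypothetical.
import requests

BASE = "https://example.org/sensorthings/v1.1"

# Find Datastreams measuring air temperature, with their newest observations.
resp = requests.get(
    f"{BASE}/Datastreams",
    params={
        "$filter": "ObservedProperty/name eq 'air_temperature'",
        "$expand": "Observations($orderby=phenomenonTime desc;$top=5)",
        "$top": 3,
    },
)
resp.raise_for_status()

for ds in resp.json()["value"]:
    print(ds["name"])
    for obs in ds.get("Observations", []):
        print("  ", obs["phenomenonTime"], obs["result"])
```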
So, 14 years after its creation - and at a time when the need for sustainable development is greater than ever - INSPIRE has evolved to continue to provide accessible and interoperable geospatial data to the European community and beyond. And, thanks to INSPIRE’s growing support of the OGC API family of standards, stakeholders can access and publish that data in a manner that has grown simpler and more useful over time.
Alexander Kotsev offers his thanks to his “many colleagues for their enthusiasm and hard work. In particular, big thanks to Heidi Vanparys, Jari Reni, Clemens Portele, Thijs Brentjens, Sylvain Grellet, all members of the sub-group who worked on the good practice, Michael Lutz, Marco Minghini, Jordi Escriu and the whole JRC INSPIRE team, and of course all our colleagues and friends at the OGC.”
-
17:26
Esri’s ArcGIS enables thousands of datasets, maps, and apps for location
sur Open Geospatial Consortium (OGC)
Tags: OGC API, ogcapi, Esri, Principal Member, impact
Contributed by: Adam Martin, Esri; Jonathan Fath, OGC
Open standards aren’t just about efficiency. They allow organizations across the globe to share information effectively and securely, and can provide much-needed security for data. Standards give governments and industry alike the ability to use vast amounts of data for a range of use cases, from citizen science to Defense and Intelligence to disaster relief.
Esri’s ArcGIS implementation of OGC API - Features is a strong example of how standards can be used effectively. ArcGIS is an open, interoperable system that drives efficiency and innovation: like many platforms of its kind, it addresses a wide range of use cases, while also embracing interoperability through the use of open standards.
“ArcGIS products enable and amplify FAIR data principles - making our customers’ location data Findable, Accessible, Interoperable, and Reusable,” said Adam Martin, Esri product manager. “Supporting standards-based interoperability, including the new OGC API suite, is a key pillar of our product strategy and offering.”
To demonstrate ArcGIS’s recent support for OGC APIs, Esri’s Living Atlas program published a collection of U.S. National Geospatial Data Assets (NGDAs) as OGC API - Features. These NGDAs represent foundational data for the US, such as physical infrastructure, rivers and administrative boundaries, and are designated by the US Federal Geographic Data Committee. This collection is among the 8,000 datasets, maps and apps curated by the ArcGIS Living Atlas program that are of critical importance to Esri’s millions of global users.
“We are excited about this new generation of geospatial APIs and look forward to our customers using these new national foundational data services,” said Adam. “We also look forward to seeing our customers with authoritative data publish their own OGC API - Feature services using ArcGIS Online and providing feedback on both experiences.”
OGC API - Features is a multi-part standard that offers the capability to create, modify, and query spatial data on the Web, and specifies requirements and recommendations for APIs that want to follow a standard way of sharing feature data. Since a feature is essentially an object with a location and other geographic properties, this covers an enormous range of information. By implementing OGC API - Features, ArcGIS becomes far more versatile in the information that it can create and easily share.
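As a rough illustration of how a generic client consumes such a service, the sketch below lists the collections exposed by an OGC API - Features endpoint, requests a page of items from one of them as GeoJSON, and follows the ‘next’ link for paging. The service URL and collection id are hypothetical placeholders; the /collections and /items paths, the bbox and limit parameters, and the link relations are defined in OGC API - Features - Part 1: Core.

```python
# A rough sketch of a generic OGC API - Features client. The service URL
# and collection id are placeholders; the paths and parameters follow
# OGC API - Features - Part 1: Core.
import requests

SERVICE = "https://example.com/ogcapi"          # hypothetical landing page
COLLECTION = "hydrography"                      # hypothetical collection id

# 1. Discover available collections.
collections = requests.get(f"{SERVICE}/collections").json()
print([c["id"] for c in collections["collections"]])

# 2. Request features within a bounding box, as GeoJSON, 100 at a time.
url = f"{SERVICE}/collections/{COLLECTION}/items"
params = {"bbox": "-80.0,35.0,-75.0,40.0", "limit": 100}

while url:
    page = requests.get(url, params=params).json()
    for feature in page.get("features", []):
        print(feature.get("id"), feature["geometry"]["type"])
    # 3. Follow the rel="next" link if the server provides one.
    next_links = [l["href"] for l in page.get("links", []) if l.get("rel") == "next"]
    url, params = (next_links[0], None) if next_links else (None, None)
```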
OGC APIs span well beyond features. Ranging from maps, tiles, and styles to routes and other crucial forms of geospatial data, they are the building blocks for location and the next generation of the consortium’s standards. The building blocks are defined not only by the requirements of the specific standards, but also through interoperability prototyping and testing in OGC's Innovation Program, a forum for OGC members to solve difficult geospatial challenges via a collaborative and agile process that is tackling R&D in initiatives such as climate, disasters, defense, and serious gaming.
“Esri has worked within OGC for decades and will continue to support OGC standards important to our users,” said Adam of Esri’s commitment to the OGC Innovation program. “User feedback is critical for our ongoing development efforts to support this suite of OGC APIs as they mature through the consensus process.”
If you’re interested in using ArcGIS products for a project, see this short GitHub tutorial that shows how to connect ArcGIS Pro to an API that implements OGC API - Features - Part 1: Core. Developers can also use OGC API - Features with the ArcGIS JS API and Runtime SDKs. You can also find the landing page for an example OGC API - Features service here, along with the curated US NGDA collection here. If you’d like to participate in an OGC API code sprint, developers are always welcome to attend, and the sprints are free to the public. Sprints happen quarterly and are essential to helping finalize these much-needed standards.
Users can contact Esri’s product management team responsible for implementing these open standards at open [at] esri.com.
-
19:16
Paving the way forward for Building Energy Mapping and Analytics
sur Open Geospatial Consortium (OGC)
Tags: Natural Resources Canada (NRCAN), Building Energy Mapping, Climate, SDI
Contributed by: Eddie Oldfield, Senior Lead, Projects & Advisory Services, QUEST; Jessica Webster, Energy Planning Analyst, Natural Resources Canada; and Ryan Ahola, Environmental Scientist, Natural Resources Canada
Introduction - The Challenge
Building energy mapping and analysis are critical for geo-targeting energy policies and programs to accelerate the transition to a low-carbon built environment and economy. Efforts to map energy use and greenhouse gas emissions from buildings are undertaken by Canadian municipalities for energy and emissions planning purposes, supported by consulting firms, universities, and sometimes non-profit organizations. Energy mapping projects are conducted independently at different times, across different scales, and using different methods and assumptions. Yet fundamentally, the data are the same: what’s required is an understanding of the building stock and its energy-related attributes including the number of buildings and units, their respective floor areas, as well as measured historical energy use and modelled predicted energy use based on different housing or building types (known as archetypes). Despite this commonality and everyone’s best efforts, there is little coordination across initiatives and no best practices or standards widely in use. This results in duplication of effort, lost energy savings, and lost opportunities for decarbonization, climate change mitigation, and climate resilience.
The Building Energy Mapping and Analytics Concept Development Study (BEMA-CDS) addressed the challenge posed by this situation by:
- Characterizing the state of development of energy mapping and analytics for the building stock broadly; and
- Informing IT architectural practices and standards to enable mapping and analytics specifically of residential energy use and efficiency.
Initiated in December 2019, with support from Natural Resources Canada (NRCan), the study drew from a number of information sources, including past research and public consultations, relevant legislation, and ongoing related initiatives. It then developed and, in February 2020, publicly released a Request for Information (RFI) that solicited responses from a wide audience of stakeholders and organizations. Questions were posed in eight subject categories concerning the building energy mapping and analytics domain.
It targeted three principal scenarios for development and application of building energy analytics and mapping:
- Community Energy and Emissions Planning
- Utility Conservation Potential Review & Demand-Side Management Program Planning
- Federal/Provincial/Territorial Building Energy - Policies, Programs, Standards, Building Codes
A series of webinars was held in mid-2020 with RFI respondents and OGC Energy and Utility Domain Working Group members to review and workshop the responses, arrive at a refined understanding of current practice, and provide input to the notional architecture.
Who Can Benefit from this Study?
Oak Ridge National Laboratory’s online software suite, AutoBEM, is a digital twin of the US’s 129 million buildings that provides an energy model for utilities and owners to make informed decisions on how best to improve energy efficiency. Credit: ORNL, U.S. Dept. of Energy
The report documenting the BEMA-CDS focuses on issues surrounding data sharing and spatial data interoperability that currently stand in the way of more fully achieving the goals and value of building energy analysis. This valuable perspective benefits many stakeholders and programs, including:
- Building scientists and energy researchers
  - Suggests paths to improved data interoperability, better models, and increased coordination
  - Identifies potential approaches for reducing duplication, time, and costs across organizations
  - Supports better quality control and comparable data for planning and program evaluation
- Government policy analysts, regulatory authorities, and building codes and standards committees
  - Identifies new approaches to inform national and provincial housing retrofit incentive programs
  - Anticipates data interoperability challenges and opportunities around Alterations Codes for existing buildings
- Community energy planners
  - Municipal energy planning, including design and delivery of housing efficiency programs
  - A geospatial view offers the possibility of improved coordination with utilities through a common operating picture
- Utility demand-side management program managers
  - Anticipates need for more geospatial analysis as more renewables come online; capital cost offsetting
  - Points to “behind the meter” methods that could improve uptake of conservation and demand management (energy efficiency) programs
- The OGC Energy and Utilities Domain Working Group (DWG)
  - Supports identification of potential further R&D and standards development activities beyond the timeframe of the BEMA-CDS study, for example those that address the evaluation of decarbonization strategies.
This emerging discipline sits at the convergence of many domains and areas of professional knowledge, including building science, geospatial science, data science, urban planning, and energy planning. Consequently, beyond the priority usage scenarios and specific stakeholders identified above, the BEMA-CDS will also be of interest to anyone working to advance smart cities, urban digital twins, building stock energy modelling, and/or the smart grid (sometimes referred to as the digital grid). An emerging cleantech segment known as climate tech - cleantech companies tackling climate change specifically - will also be interested if their solutions relate to energy and buildings, as will venture capitalists seeking to invest in climate tech firms. Banks, which increasingly view climate risk as lending risk, should be interested in geospatial approaches to quantify the carbon-intensity of their mortgage portfolios and support assessment of lending products for energy efficiency and renewable energy technology deployment.
Some Key Findings
A critical challenge identified in this study is the availability of the right spatial information elements to perform building energy analysis at the various levels of generalization and specificity needed to improve lives and advance community goals. Across building energy mapping efforts, repetitive and non-standardized methods are used to collect, exchange, and integrate datasets. Some notable examples of persistent mapping undertaken at regional or national scales include the CityGML work in Berlin, ORNL’s digital twin AutoBEM, and the UCLA Energy Atlas. The current ad hoc approach to data collection, integration, and re-use is terribly inefficient in the face of the current climate crisis.
The idea of supporting the reusability and sharing of spatial data by treating information as infrastructure has been around for many years under the concept of Spatial Data Infrastructures (SDI). Canada has a highly developed SDI, the Canadian Geospatial Data Infrastructure (CGDI), which uses a distributed model to support access, sharing, and use of diverse spatial information. CGDI provides critical infrastructure that Canadians rely on every day, such as weather forecasts produced by Environment and Climate Change Canada. CGDI also supports future-oriented research by allowing scientists to integrate many different forms of information through location, such as within climatedata.ca.
More recently, the concept, capabilities, and design of such infrastructure have been expanding in the age of cloud computing. It makes sense in this context to consider what an “energy Spatial Data Infrastructure” (eSDI) might look like that can support diverse building energy data needs, opportunities, stakeholders, and goals identified in the report.
Challenges Identified
Other common challenges enumerated by RFI respondents - and later reaffirmed by workshop participants - related to data availability, privacy, and confidentiality, as well as considerations concerning proprietary formats. Data source methods and confidence were found to be wide-ranging and poorly documented: variously measured, modeled, inferred, estimated, and assumed. A lack of access to cost estimates for retrofits was identified by respondents as a barrier to deriving benefits from energy mapping and modeling data. From a data infrastructure and reusability perspective, the lack of an overarching data framework prevents connecting the scale and resolution of spatial data to particular use scenarios. Similarly, there are no accepted schemas for applying different archetyping approaches (clustering/classification) to different use-case scenarios and levels of organizational technical and financial capacity.
Opportunities Identified
Despite these and other challenges, numerous opportunities were identified in the study, including data access technologies that account for privacy, confidentiality, and anonymity. Promising techniques include enclave processing, anonymization by aggregation, and noise injection, sometimes referred to as differential privacy. Adaptive classification and archetyping based on sample modeling is another potential approach to fit archetyping needs to use-cases using available data. The development of national systems for consistent energy data at multiple spatio-temporal scales is identified as an opportunity that could serve a range of use-cases. National building data for comprehensive analysis of building types, energy performance, retrofit/upgrade technologies, costs, and benefits was another data-related opportunity identified. In support of greater data interoperability, opportunities around data sharing policies and standards can be organized to support critical use-cases and stakeholders, for example mandated reporting and federated contracts. Community/utility cooperation facilitated by regional or national authorities may allow stakeholders to better understand opportunities, costs, and benefits of new technologies and energy sources, including renewables.
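As a concrete illustration of the ‘anonymization by aggregation and noise injection’ technique mentioned above, here is a minimal sketch that adds Laplace noise to aggregated energy-consumption totals before release - the basic mechanism behind differential privacy. The neighbourhood names, figures, and sensitivity value are invented for illustration; the noise scale follows the usual sensitivity-over-epsilon rule.

```python
# A minimal sketch of anonymization by aggregation plus noise injection
# (the Laplace mechanism used in differential privacy). All names and
# numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Aggregated annual electricity use per neighbourhood (kWh), already summed
# over individual buildings so no single record is released directly.
aggregates = {"Riverside": 1_240_000.0, "Hillcrest": 980_500.0, "Old Town": 2_310_750.0}

SENSITIVITY = 30_000.0   # assumed maximum contribution of one building (kWh)
EPSILON = 1.0            # privacy budget: smaller means stronger privacy, more noise

def privatize(value: float) -> float:
    """Add Laplace noise scaled to sensitivity / epsilon before release."""
    return value + rng.laplace(loc=0.0, scale=SENSITIVITY / EPSILON)

for name, total in aggregates.items():
    # The true total is printed here only to show the effect of the noise.
    print(f"{name}: reported {privatize(total):,.0f} kWh (true {total:,.0f})")
```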
Notional Architecture of an Energy SDI
One main output of the BEMA-CDS is what’s referred to as a “notional architecture for an eSDI.” Reflecting the architecture of all spatial data infrastructures, it is organized into broad categories, also known as tiers. In the diagram below, these can be seen on the left-hand side, starting at the bottom with data and moving up through computing and services to applications. This architecture follows the evolution of information, from data through processing to decision support. Each line to the right of the four tiers contains generic packages that apply to the energy and buildings domain. At the top of the architecture, applications synthesizing and presenting the resulting information are shown to potentially fulfil a range of auditing, program, policy, educational, and commercial decision-support functions.
A notional architecture for an energy Spatial Data Infrastructure (eSDI)
Further to the notional architecture, the diagram below illustrates the current context in Canada. Existing data sources, standards, and applications are not fully interoperable with the energy modelling, benchmarking, and labelling platforms in common use. These platforms use and collect data that is spatially implicit - that is, data having spatial attributes such as address, city, or weather region. However, the full power of spatial data interoperability, mapping, and spatial data analytics is not fully architected, operational, or available to Canadian energy decision makers.
The study and notional architecture help to inform conversations about the potential value and opportunities to undertake further eSDI architecture development - in Canada and abroad. For example, some elements such as hierarchical, relational, and semantic schemas have yet to be developed.
Next Steps
Components of a Canadian eSDI exist but are not fully architected or interoperable
Among the many issues raised, the following learning opportunities and potential next steps stand out:
- Design of an extensible and standardized national building layer, leading to both national application and improved comparability of promising building energy analysis methods.
- Sandbox activities such as interoperability pilots, modeling the mutual benefits of information sharing and data interoperability.
- Prototypes for an eSDI, demonstrating common availability of such technologies as cloud-based energy modeling, model-driven building archetypes, and enclave protocols for addressing data privacy and proprietary constraints.
- Development of energy poverty indices that take into account fine-scale socio-economic, climate, and geographic factors in assessing the impacts and mitigation of building energy costs.
These activities can take the form of data development initiatives and interoperability experiments, which in turn can contribute to standards development. In any of these activities, cross-cutting themes can be explored and elaborated, including matching the spatio-temporal resolution of input and output data, and matching archetyping methods to use-cases.
Conclusion
The BEMA-CDS study enumerates the current state of practice and identifies challenges and opportunities in building energy mapping and analytics. It also sketches out for the first time a notional architecture for an energy Spatial Data Infrastructure. Shifting effort and resources from a mindset of “let’s just get it done for this project” to one of “build it, maintain it, and continuously improve it” would produce efficiencies, improve the quality and timeliness of decision support, and accelerate innovation and job creation in the domain – all on top of cost savings and a reduction in GHG emissions. Urban digital twins, such as AutoBEM, developed and maintained by Oak Ridge National Laboratory, or Virtual Singapore, developed by the National Research Foundation, are leading examples of what can be accomplished with this mindset and current technology.
The Building Energy Mapping and Analytics: Concept Development Study Report was recently published and is freely available on the Building Energy Mapping and Analytics Concept Development Study (BEMA-CDS) page.
The authors welcome any questions concerning the study or report:
- Eddie Oldfield (Senior Lead, Projects & Advisory Services, QUEST) eoldfield [at] questcanada.org
- Jessica Webster (Energy Planning Analyst, Natural Resources Canada) jessica.webster [at] nrcan-rncan.gc.ca
- Ryan Ahola (Environmental Scientist, Natural Resources Canada) ryan.ahola [at] nrcan-rncan.gc.ca
-
11:32
A User-centric Approach to Data Cubes
sur Open Geospatial Consortium (OGC)
Tags: geoconnexion, data cube
A version of this article originally appeared in the July/August 2021 edition of GeoConnexion International Magazine.
Geospatial data cubes are frequently used these days because they enable performant, cloud-compatible geospatial data access and analysis. But differences in their design, interfaces, and handling of temporal characteristics cause interoperability challenges for anyone interacting with more than one solution. Such challenges unnecessarily waste time and money and - from a science perspective - affect reproducibility.
To address these challenges, the Open Geospatial Consortium (OGC) and the Group on Earth Observation (GEO) invited global data cube experts to discuss the “state of the art” and find a way forward at the Towards Data Cube Interoperability workshop. The two-day workshop, conducted in late April 2021, started with a series of pre-recorded position statements by data cube providers and data cube users. These videos served as the entry point for intense discussions that not only produced a new definition of the term ‘data cube’, but also underscored the need for a ‘user centric’ API-based approach that exposes not only the data available to the user, but also the processing algorithms that can be run on it - and allow the user to add their own. The outcomes of the Workshop have been published on the OGC & GEO Towards Data Cube Interoperability Workshop webpage.
Data cubes from the users’ perspective
Data cubes are ideally suited to cloud-based workflows, but a lack of standards makes integration of different data cubes a challenge.
Existing definitions of data cubes often focus on the data structure aspect as used in computer science. In contrast to this, the Towards Data Cube Interoperability workshop emphasized the need to leave these definitions behind and focus on the user’s perspective. Users don’t care if the data is stored in a relational database, in a cloud-based object store, or on a file server. What users are interested in is how they can access the data and the processing algorithms that they can apply to it. Any standard for access should reflect this.
This led to an interesting rethinking of just what a data cube is and can be. Although it wasn’t formally agreed by consensus, the workshop participants generally took a user-centric definition of a geo data cube to be:
“A geo data cube is a discretized model of the earth that offers the estimated values of certain variables for each cell. Ideally, a data cube is dense (i.e., does not include empty cells) with constant cell distance for its spatial and temporal dimensions. A data cube describes its basic structure, i.e., its spatial and temporal characteristics and its supported variables (aka properties), as metadata. It is further defined by a set of functions. These functions describe the available discovery, access, view, analysis, and processing methods by which the user can interact with the data cube.”
As we see, the data cube is described for the user, not the data. It does not matter if the data cube contains one, two, or three spatial dimensions, or if time is given its own dimension(s) or is just part of the metadata of an observation - or isn’t relevant to the data at all. Similarly, it doesn’t matter how the data is stored. What will unify these heterogeneous data cubes is their use of a standardised [HTTP-based] API as their method of access and interaction.
The main concern of the user is what functions the data cube instance offers to apply to the data. These functions are what primarily differentiate the user-centric data cube definition over other definitions. A user needs to understand what questions can be asked to access data that fulfills specific filter criteria, how to visualize specific (sub-) sets of data, or how to execute analytical functions and other processes on the data cube. If supported, the user also needs to understand how to add their own processes to the data cube so that they can be executed directly on the data cube without the need to transfer vast amounts of data out of the cloud.
This isn’t to say that all other characteristics - such as spatial and temporal details (e.g., being dense or sparse, overlapping or perfectly aligned, constant or inconstant distances), and property details (scales of measurements, incomplete data, interpolation methods, error values, etc.) - are of no concern to the user: they still need to be known. As such, they will be provided via the data cube API as metadata, so that the user can take them into account when assessing how best to process the data.
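Purely as an illustration of what such a user-centric, API-based interaction could look like, the sketch below reads a cube’s self-description and then asks the cube to run a server-side process on a spatio-temporal subset, so that only the small result leaves the cloud. The endpoints, parameter names, and process id are invented for this example; they do not represent an existing OGC standard.

```python
# A purely illustrative sketch of a user-centric data cube interaction over
# HTTP. The endpoints, parameter names, and process id below are invented to
# show the idea; they are not an existing OGC standard.
import requests

CUBE = "https://example.org/geodatacube"

# 1. Read the cube's self-description: dimensions, variables, and the
#    discovery/access/analysis functions it offers.
meta = requests.get(f"{CUBE}/metadata").json()
print(meta["dimensions"], meta["variables"], [f["id"] for f in meta["functions"]])

# 2. Ask the cube to run a server-side process on a spatio-temporal subset,
#    so only the (small) result is transferred out of the cloud.
job = requests.post(
    f"{CUBE}/processes/monthly-mean/execution",
    json={
        "variable": "air_temperature",
        "bbox": [5.5, 47.0, 15.5, 55.0],          # roughly Germany
        "datetime": "2018-06-01/2018-08-31",
    },
).json()

print(job["status"], job.get("resultUrl"))
```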
Interoperability through a Data Cube API
Integrating different data cubes isn’t an unsolvable puzzle.
Where does this leave OGC? We think an API-based, flexible approach to standards will provide end users, software developers, and data cube operators with the best experience.
For end users: a single, simple, standardised HTTP API to learn and code against, no matter where the data resides, means that a wider selection of available software (including low- or no-code platforms) will support a wider choice of data cube providers and a growing number of processing algorithms. From a scientific perspective, this means that an atmospheric scientist doesn’t also have to be a Python expert: they could use a low- or no-code platform GUI to create an algorithm that processes the data for their heatwave study across Germany. Another atmospheric scientist could then take that same processing algorithm and apply it to the UK with minimal changes - even if the required data is held by a different standards-compliant data provider. This approach greatly increases the transparency and repeatability of scientific studies and other valuable analysis tasks.
For software developers: a single, simple, standardised HTTP API means that software developers don’t have to design their own vendor-specific methods for providing access to data cubes in their software. Instead, they interact with data cubes via HTTP calls, thus benefiting from simple standard Web communication, rather than interactions on the programmatic level. By coding to an agreed-upon standard, developers can work with any compliant data cube while minimizing cube-specific adaptations. This increases the usability of the software, while decreasing the development and maintenance costs.
For data cube operators: using a single, simple, standardised HTTP API reduces development and maintenance costs while broadening the customer base. Being standards-compliant allows providers to access customers that are using any compliant software package, rather than just those using a select list of software coded to work with your specific instance. This means that more people will be coding for your data cube, even if they don’t know your service exists.
What’s next for OGC?
Data cubes come in many different shapes and sizes - a standard API would simplify their use.
It’s early days yet, but you can expect to see a data cube-related API become part of our family of OGC API standards. Work towards such a data cube API builds upon the work of our Earth Observation Exploitation Platform (see An App Store For Big Data, in GeoConnexion International, July/August 2020), and is currently underway as part of OGC Testbed-17.
If you’re interested in learning about OGC’s approach to standardising access to data cubes, OGC Members can follow their early development as Observers in OGC’s Testbed-17. Alternatively, OGC Members can join the Earth Observation Exploitation Platform Domain Working Group. Detailed outcomes from the Workshop are available on the OGC & GEO Towards Data Cube Interoperability Workshop webpage.
-
18:58
Major revision of the Geospatial Information Management Standards Guide endorsed by United Nations member nations
sur Open Geospatial Consortium (OGC)
Contributed by: Mark Reichardt
At the Eleventh Session of the United Nations Global Geospatial Information Management (UN-GGIM) Committee of Experts, held in late August 2021, member nations endorsed a key revision of the UN-GGIM Guide to the Role of Standards in Geospatial Information Management. The goal of the Guide is to “provide detailed insights on the standards and good practices necessary to establish and maintain geospatial information management systems that are compatible and interoperable with other systems within and across organizations. The Guide also underscores the importance of standards in facilitating the application of the FAIR (Findable, Accessible, Interoperable, and Reusable) data principles - promoting improved policymaking, decision making and government effectiveness in addressing key social, economic, and environmental topics, including attainment of Sustainable Development Goals”.
This endorsement represents the culmination of the work of a team of over 35 members and staff of the three Standards Development Organizations (SDOs): the Open Geospatial Consortium (OGC), the International Organization for Standardization (ISO) Technical Committee 211 on Geographic Information/Geomatics (ISO/TC 211), and the International Hydrographic Organization (IHO). The team began its six-month revision effort in January 2021.
The revision of the UN-GGIM Guide to the Role of Standards in Geospatial Information Management had several key goals:
- Update the Guide to reflect recent advancements in geospatial standards, reinforcing learning resources and community implementation examples;
- Align the Guide with the UN-GGIM Integrated Geospatial Information Framework (IGIF) – the overarching strategy and guide for implementing geospatial information management in nations worldwide; and
- Transition the Guide from a traditional static publication to an easy-to-maintain web presence, while providing the ability for users to create a static, printed version of the document for offline use.
Committee of Experts representatives from member nations and observer organizations committed to review and comment on the revised Guide, including the identification of additional resources and community implementation examples to help implementers better understand the context and value of standards as an underpinning component of geospatial information management programs. Such resources will further help implementing organizations to establish solutions that “interoperate” to support geospatial data sharing, maintenance, and decision-making across organizations, jurisdictions, and systems. The SDO Guide team expects to have the Guide available as an online resource by January 2022.
OGC, ISO/TC 211, and IHO member representatives and staff have dedicated their time and energy to this effort, and OGC is proud of this long-standing collaboration and our commitment to supporting UN-GGIM to enable FAIR geospatial information management worldwide.
For more information about the Guide including sponsorship opportunities to help defray the costs of implementation and maintenance of the on-line Guide, contact the SDO team via email at: UNStdsGuideComments [at] lists.ogc.org.
-
22:13
Three reasons why New Space is valuable to the location community - and vice-versa
sur Open Geospatial Consortium (OGC)
“Everybody has to have an interest in solving global problems. Unless one has entirely lost touch with reality. - And such people do exist.” Dennis Snower, “Who We Were”
Powering solutions that address global problems is one of the drivers behind OGC’s efforts to simplify data integration, and New Space, as an emerging domain, is simultaneously offering exciting solutions while creating integration challenges. The topic remains a point of discussion across the location community, as well as at OGC Member Meetings, often revealing many questions.
So, to help those that are not too familiar with the concept of New Space, I shall answer three common questions that help illustrate what New Space is, why it matters, and how it is evolving technologies and standards alike.
If you’re interested in anything New Space, you’re encouraged to attend our next (virtual) Member Meeting, the week of September 13, 2021. In particular, come along to our first New Space Summit - an event that will highlight the importance of the domain and why technology and standards matter. Registration is available at meet.ogc.org.
What is New Space and why is it so important to the Geospatial Community?
‘New Space’ is a paradigm driven by a combination of technology and market advances such as rocket launches, small satellites, orbital planes, evolving sensors, and ground infrastructures. While the ‘commercialization’ of space isn’t a new occurrence, it is only fairly recently that space technologies have become accessible enough that the market has really ‘taken off’ (if you’ll excuse the pun), resulting in a proliferation of players, services, methodologies, and technologies.
This leads to a core challenge: the FAIRness of data, information, and other derivatives. FAIRness in this case is the foundation and the enabler for the full, efficient, and sustainable exploitation of New Space: how can one Find relevant data and information? Is it easily Accessible? Can it Interoperate with existing datasets and systems? And is it, and any derived products, Re-usable by others?
These concerns are not trivial: recent years have shown that our species as a whole needs to address critical global challenges, such as our impact on the climate and environment, more frequent natural disasters (including pandemics), compromised food security, and more. New Space technologies, with their inherently global perspective, will play a valuable role by providing much of the data needed to address these challenges.
Similarly, the New Space community needs to address smaller-scale problems of space debris as well as judge the benefits and costs of space exploration - to name just a few. There are many questions related to New Space, and as the technologies evolve, so too does the list of related challenges.
With this in mind, the location community needs an accessible, informed forum to discuss and better understand the impacts - positive and negative - of New Space on standards, data integration, and therefore effective decision making.
OGC, with its membership containing the full spectrum of experts - from designers, to providers, to end-users of New Space and related technologies - is ideally suited for just such a forum. This full spectrum perspective provides invaluable viewpoints on the practical considerations required to design useful standards, refine best practices, create valuable partnerships, and research & develop new technologies during Innovation Initiatives.
These sorts of discussions will occur at the New Space summit, part of OGC’s 120th Member Meeting, on Wednesday September 15. Register now at meet.ogc.org.
How are problems in New Space being addressed by OGC and the greater location community?
OGC members lead the exploitation of New Space technologies, data, and solutions by:
- Running R&D initiatives under the OGC Innovation Program.
- Developing standards, such as OGC APIs, as well as best practices that help make the data generated by New Space technologies align with the FAIR data principles.
- Bridging, building, linking, and involving a global community of space data providers, users, and integrators.
Leadership through the OGC Innovation Program
The OGC Innovation Program enables OGC members to solve the latest and hardest geospatial challenges via an agile, collaborative process. OGC members (sponsors and technology implementers) from across the location community come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards.
Recently, OGC Innovation has developed, demonstrated, and documented a large number of open standards-based technologies that address some of the challenges faced by organizations across the New Space domain, including:
- Software architectures that allow the execution of data processing applications on the same infrastructure hosting the data (the ‘application to the data’ principle), minimizing data transport costs.
- Discovery and access interfaces to optimize data handling, through our OGC APIs.
- Data cubes to store, transport, and access multi-dimensional data efficiently.
- Linked data approaches that help to achieve a higher level of interoperability by providing additional machine-readable information about the data.
Currently, the Innovation Program is exploring the use of New Space data in the context of natural disasters. In the OGC Disaster Pilot 2021, more than 25 participating organisations are exploring the use of hybrid, scalable, cloud-based systems that apply advanced AI processing, machine learning algorithms, and simulation models where Earth observation and other data is already uplinked, prepared, and curated. The aim is to generate analysis-ready situational data with the characteristics, scale, and speed required in the wake of a natural disaster, such as a landslide, flood, or pandemic.
Impacting International Standards and Best Practice Setting
Organisations across the globe use OGC APIs and other Standards to power their applications and solutions. By engaging with the OGC Standards Program, organisations can stay on top of current technology trends and better understand the interoperability needs and requirements to unlock the full potential of New Space and other Earth Observation data.
Examples of OGC standards and working groups relevant to, and used in, the Earth Observation domain include:
- GeoAI DWG - a forward-looking group bringing order to the chaos of a disruptive technology.
- Coverages SWG - all about EO data cube management and analysis, connecting better data management approaches to produce analysis-ready data.
- Discrete Global Grid Systems (DGGS) DWG - these data repositories on national, continental, and global scales advance the management of, and linkages to, very large multi-resolution and multi-domain datasets. They enable the next generation of analytic processes to be applied to any sensor, data type, or coordinate reference system.
- OGC API - Maps, - Processes, - Records, - Tiles, and the SensorThings API - covering virtually all types of work with Earth Observation imagery and data. These are the next generation of standards - data-centric rather than web-service-centric - intended to simplify implementation, discovery, and use.
- EO Product Metadata and OpenSearch SWG - where we work to improve the findability and accessibility of Earth Observation data.
- Sensor Web Enablement DWG - covering all levels of sensor sophistication, including those used on EO platforms. This group works to identify best practices that improve Space/Sky/Surface sensor data interoperability.
OGC working groups meet during most quarterly OGC Member Meetings to discuss developments since the previous meeting and actions for the next. Many sessions are open to non-members, so attendance is encouraged for OGC Members and non-members alike.
How can the location community best innovate using New Space technologies?
As communities of experts and technologies focused on information gathered from space continue to grow, so do the opportunities and use cases for collaborating, scaling, and sharing across the New Space domain. With this in mind, OGC and our members are committed to sharing and learning from existing knowledge, discovering new, shared interests and initiatives, and creating meaningful impacts.
Conversations concerning New Space seem to grow at every OGC Member Meeting. As such, we’re very excited to host, at our 120th Member Meeting the week of September 13, 2021, a dedicated New Space Summit. We are looking for organizations to get involved, so please register, or get in touch to learn more.
OGC is also always looking for individuals to join our many Domain and Standards Working groups to help continue to drive the conversation forward. Reach out to a teammate now to learn more about becoming a member of our global community of experts.
Registration for OGC’s 120th Member Meeting, co-located with Singapore Geospatial Festival 2021, including the New Space Summit, is available at meet.ogc.org. OGC Members and non-members are encouraged to attend.
-