You can read the post on the La Minute blog for more information about RSS feeds!
Feeds
9871 items (2 unread) in 52 feeds
- Décryptagéo, l'information géographique
- Cybergeo
- Revue Internationale de Géomatique (RIG)
- SIGMAG & SIGTV.FR - Un autre regard sur la géomatique
- Mappemonde
- Imagerie Géospatiale
- Toute l'actualité des Geoservices de l'IGN
- arcOrama, un blog sur les SIG, ceux d'ESRI en particulier
- arcOpole - Actualités du Programme
- Géoclip, le générateur d'observatoires cartographiques
- Blog GEOCONCEPT FR
- Géoblogs (GeoRezo.net)
- Geotribu (1 unread)
- Les cafés géographiques (1 unread)
- UrbaLine (le blog d'Aline sur l'urba, la géomatique, et l'habitat)
- Séries temporelles (CESBIO)
- Datafoncier, données pour les territoires (Cerema)
- Cartes et figures du monde
- SIGEA: actualités des SIG pour l'enseignement agricole
- Data and GIS tips
- Neogeo Technologies
- Oslandia
- ReLucBlog
- L'Atelier de Cartographie
- My Geomatic
- archeomatic (le blog d'un archéologue à l'INRAP)
- Makina Corpus
- Camptocamp
- Veille cartographie
- Carnet (neo)cartographique
- Le blog de Geomatys
- GEOMATIQUE
- Geomatick
- CartONG (actualités)
Open Geospatial Consortium (OGC)
-
17:23
Principled and Powerful: Geospatial data for ESG reporting at Location Powers 2022
on Open Geospatial Consortium (OGC). Tags: location powers, ESG.
There was a principled and powerful representation in attendance at OGC's Location Powers event in London last November. Ordnance Survey hosted the event at the Geovation Hub, with 23 speakers over two days. Representatives from across the finance, transport/logistics, and construction/maintenance markets described their organizations' approach to Environmental, Social, and (corporate) Governance (ESG) reporting and how it benefits from geospatial technologies. The event highlighted the need to share location information easily with many stakeholders.
“This Location Powers event has been eye-opening in understanding how financial experts and geospatial practitioners can look at the same data in very different ways,” said Scott Simmons, OGC’s Chief Standards Officer. “Working from the perspective of each will help us identify the right targets for standardization and development of best practices in the use of geospatial data to support accurate and comparable ESG metrics.”
While nation states are working towards their United Nations (UN) Sustainable Development Goals for 2030, corporations are being incentivised to work across international borders and throughout their supply chain in an effort to reduce carbon emissions and mitigate and adapt to climate change risks. Self-regulation has not been effective and new EU legislation, due in June 2023, could change the economic landscape not just in Europe, but globally. Geospatial’s role will be crucial in this effort.
Allan Jamieson, Data Standards and Governance, Ordnance Survey (OS), opens Location Powers 2022.
A corporation's voluntary self-assessment is the current source of most ESG reports, published as stand-alone documents or alongside the annual report. The information supplied can be variable, held to differing standards, overly optimistic, or even take a cynical approach that 'green-washes' a company's actions. For international corporations, reassuring stakeholders and investors that their business model can extend into the future requires more diverse data to be included in ESG reporting.
For each of the data topics – environment, social, and (corporate) governance – there is a location aspect. Environmental data can be derived from Earth Observation (EO) and remote sensing sources. Social data can be sourced from censuses, social media, and on-the-ground news reports. And both the environmental and social aspects are directly reflected in governance. What regulatory authorities and other organizations are searching for is authoritative, consistent, and reliable data against which they can verify corporations and their ESG reports.
Read on to learn what was discussed, or access recordings and slides from the event on the OGC Location Powers 2022 website.
Two Days of Insights
The Location Powers event provided attendees with two days of focused content: Day 1 covered the challenges of the various industry sectors, and Day 2 highlighted the geospatial sector's response to those challenges. Each day included expert presentations with Q&A, breakout sessions on various sector topics, and an end-of-day expert panel to share outcomes and insights from the breakouts. There were plenty of opportunities to network with other attendees and speakers during the breaks, over an enjoyable lunch, and at an evening drinks reception.
Day One
After introductions, Richard Peers, Founder, ResponsibleRisk, kicked off the event with a detailed overview of ESG and the various standards available in the context of COP27, including the Task Force on Climate-related Financial Disclosures (TCFD), which serves to mitigate, not just deal with, outcomes, and the Taskforce on Nature-related Financial Disclosures (TNFD), which describes governance and strategy.
Yasmin Raza, ESG Market Intelligence & Engagement Team Manager, Financial Conduct Authority (FCA) spoke specifically about how the finance sector in the UK plans to move to become the NetZero world capital of finance. The FCA will introduce three key labels to help consumers distinguish between financial products: Sustainable Focus, Sustainable Improvers, Sustainable Impact.
David J. Patterson, Head of Conservation Intelligence, World Wide Fund for Nature (WWF-UK) highlighted the difficulties in accessing up-to-date data on biodiversity at a suitable resolution for analysis when accuracy of data sets was hard to determine.
Andrew Coote, Director, ConsultingWhere described the implementation of the UN Integrated Geospatial Information Framework (IGIF) aimed at developing nations and how the UN was supporting them to move from country-level guidance to strategic localized action plans.
Franca Wolf, Senior Analyst, Verisk Maplecroft expanded on this theme further by talking about the geospatial risk exposure to climate change for investors. Corporations want to be able to forecast ahead and anticipate their ESG rating to mitigate their exposure to future risks.
Scott James, Partner, Ward Williams Associates (WWA) brought us back to the ESG reporting reality for most businesses: how do you find the data you are looking for and how can you document the good that you are doing? He described the WWA journey through the lens of becoming B-Corp Certified and the overall benefit of attracting new talent and customers.
Daniel Barlow, Innovation Policy at British Standards Institution (BSI) presented remotely as he dialed in from COP27 to relay the new ISO NetZero Guidelines for industry.
A panel discussion on “Future trends in ESG business models from banking, logistics, and construction” included: Michael Groves, CEO, Topolytics; Mariam Crichton, CEO, 7 Satya; Jen Dixon, Business Analyst & Ethics Advisory Group, Esri UK; and was hosted by Donna Lyndsay, Strategic Market Lead – Environment and Sustainability, Ordnance Survey.
The panelists raised topics on ethics in data collection, human rights, the fact that geospatial data is not instant, and the cost and reduction of energy consumption. The most important discussion centered on the question: "Does the C-suite even care about ESG? And how do we get them to care, especially when this may highlight uncomfortable human rights practices in the supply chain?"
Conversation went on to discuss how and when to learn from ethicists, and even ventured into the role that western and eastern philosophies could play. ESG reporting has been voluntary so far, but international corporations will soon be brought under regulatory scrutiny, which will require a change in mindset to adapt to new market requirements.
Day Two
(L-R) Donna Lyndsay hosts the Day 1 panel consisting of Michael Groves, Mariam Crichton, and Jen Dixon at OGC Location Powers 2022.
The first speaker for the second day was Ed Parsons, Google's Geographer. Ed talked about how the geospatial sector is responding to the sheer volume of geospatial data being collected that needs to be evaluated and audited. Google Maps now offers route options based on releasing fewer GHG emissions and better fuel efficiency. Google Environmental Insights Explorer offers Machine Learning (ML) capabilities to city and regional authorities.
Simon Casey, Channel Sales Manager, Satellite Vu, brought to our attention new developments in Thermal Infrared (TIR) sensors in terms of resolution and high revisit rates (10-20 per day), and how these have enabled the detection of small zones releasing excessive thermal energy, such as vessels or wildfires.
Scott Simmons, Chief Standards Officer, OGC, compared and contrasted the top ESG rating agencies and their ESG indicator definitions. He engaged the participants in a discussion of which data contributing to ESG ratings could or should be standardized, and which lies outside the realm of geospatial standards.
Mattie Yeta, Chief Sustainability Officer, CGI, assessed the ESG commitments of her organization to develop its own system of ESG reporting, covering data centers, travel, and natural capital. CGI is working on Sustainability Exploration Environmental Data Science (SEEDS) with the UN.
Ali Nicholl, Founder & Engagement, IOTICS raised the big ESG data challenge that needs to be resolved by collaboration and cooperation through digital ecosystems.
Allan Jamieson, Data Standards and Governance, Ordnance Survey (OS) highlighted the importance of authoritative data, and building trust for the validation of ESG reporting.
Olive Powell, Head of Geography & Geospatial and Charlie Dacke, Head of Geospatial Technology & Standards, Office for National Statistics (ONS) presented the challenges of providing statistics for the public good while maintaining the privacy of individuals by using the Reference Data Management Framework (RDMF).
Andrea Santiago, Subdirector, National Institute of Statistics and Geography of Mexico (INEGI), joined us remotely to present on using the Locus Charter to balance privacy and analytic power.
Group discussion from breakout sessions and closing remarks hosted by Scott Simmons focused on next steps. The panel consisted of: David Philp, Director - Digital Consulting, Strategy & Innovation - Europe, Digital AECOM; Marzia Bolpagni, Head of BIM International - Associate Director, MACE; Oliver Morris, Account Manager, Tensing; and Ian Prentice, Business Development Manager, Carto.
The panelists noted that, when it comes to customers finding and choosing datasets or processing models, there is a competitive advantage for geospatial businesses in making a range of tasks self-service. The Geospatial community has the technology and data, but needs to convey the trustworthiness of these data and services by making methodology more transparent. Fortunately, there is an increasing level of trust in publicly-released datasets and consideration should be made about use of the Gemini Principles as a good way to build trust.
As Allan Jamieson, Data Standards and Governance, Ordnance Survey (OS), said during the event: “By providing authoritative geospatial data, trust will be built in the validation of ESG reporting.”
(L-R) Scott Simmons hosts the Day 2 panel consisting of David Philp, Marzia Bolpagni, Oliver Morris, and Ian Prentice.
All slides and recordings from OGC Location Powers on ESG Reporting are publicly available on the OGC Location Powers 2022 website. The full results of the OGC Location Powers event will be released as a white paper in Q1 2023.
-
14:49
Building the Building Blocks for the Future of Location: The November 2022 OGC Web Mapping Code Sprint
on Open Geospatial Consortium (OGC).
The mechanisms through which maps are delivered across the Internet have evolved significantly over the past two decades. Advancement of such mechanisms has been driven by a combination of factors: new data formats have emerged, the SWaP-C (size, weight, power, and cost) of devices has improved, and the capabilities of web browsers have been enhanced by improvements brought by HTML5. This means that some functionality that web mapping applications could not previously implement in a standardized way is now becoming increasingly common.
To support the development of OGC API Standards, the building blocks for location that standardize many of the new capabilities available to web mapping applications, the Open Geospatial Consortium (OGC) and EuroGeographics hosted the 2022 Web Mapping Code Sprint from November 29th to December 1st, 2022. The event was sponsored by OGC Strategic Member, Ordnance Survey, and was held as a hybrid event, consisting of a virtual element hosted on the OGC’s Discord environment alongside an in-person element hosted by EuroGeographics in Brussels, Belgium.
Code Sprints experiment with emerging ideas in the context of geospatial Standards, help improve interoperability of existing Standards by experimenting with new extensions or profiles, and are used for building proofs-of-concept to support standards-development activities and the enhancement of software products. Non-coding activities such as testing, working on documentation, or reporting issues are also conducted during a code sprint. In addition, the code sprints’ mentor stream provides an excellent opportunity to onboard developers new to the Standards.
The 2022 Web Mapping Code Sprint focused on the following:
- OGC API – Tiles Standard: This Standard describes API building blocks that can enable implementations to serve map tiles, vector tiles (tiled feature data), or tiled coverage data.
- OGC API – Maps candidate Standard: This candidate Standard describes API building blocks that can enable implementations to serve spatially referenced and dynamically rendered electronic maps.
- OGC API – Styles candidate Standard: This candidate Standard describes API building blocks that can enable implementations to manage and fetch styles, which consist of symbolizing instructions that can be applied by a rendering engine to features and/or coverages.
- Other Styles & Symbology Encodings (e.g., SLD, SymCore, etc.)
The mentor stream of the code sprint featured two tutorials on understanding and using a server-side and a client-side implementation of OGC API - Tiles. It also included two onboarding sessions focused on collaborating on software projects that implement the standards.
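As a rough illustration of what such a server-side implementation exposes, the sketch below requests a single vector tile from a hypothetical OGC API - Tiles endpoint in Python. The base URL, collection id, and tile indices are placeholders, and the path follows the standard's common URL template.

    # Minimal sketch: fetch one vector tile from a hypothetical OGC API - Tiles endpoint.
    # The server URL and collection id are placeholders, not a real service.
    import requests

    BASE = "https://example.org/ogcapi"           # hypothetical server
    COLLECTION = "buildings"                      # hypothetical collection id

    # Common path template: /collections/{collectionId}/tiles/{tileMatrixSetId}/{z}/{row}/{col}
    url = f"{BASE}/collections/{COLLECTION}/tiles/WebMercatorQuad/12/1412/2106"

    resp = requests.get(url, headers={"Accept": "application/vnd.mapbox-vector-tile"})
    resp.raise_for_status()

    with open("tile_12_1412_2106.mvt", "wb") as f:
        f.write(resp.content)                     # raw Mapbox Vector Tile payload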
The code sprint successfully facilitated the development and testing of prototype implementations of OGC API Standards, including candidate Standards, that relate to web mapping. Further, the code sprint provided a foundation for the development of the next version of the Symbology Core Standard. Participants were able to provide feedback directly to the editors of the Standards, the editors were able to clarify any issues encountered by the sprint participants, and the sprint also raised awareness of the Standards. The code sprint therefore met all of its objectives.
OGC is an international consortium of more than 500 businesses, government agencies, research organizations, and universities driven to make geospatial (location) information and services FAIR – Findable, Accessible, Interoperable, and Reusable. The consortium consists of Standards Working Groups (SWGs) that have responsibility for designing a candidate Standard prior to approval as an OGC Standard and for making revisions to an existing OGC Standard. The sprint objectives for the SWGs were to:
- Create awareness about OGC Standards;
- Develop prototype implementations of OGC Standards, including implementations of draft OGC Application Programming Interface (API) Standards;
- Test the prototype implementations;
- Provide feedback to the Editor about what worked and what did not; and
- Provide feedback about the Standards and candidate Standards.
EuroGeographics is a not-for-profit organization that represents many of the National Mapping, Cadastral and Land Registration Authorities across Europe. The organization facilitates access to data, services, and expertise, as well as supporting the sharing of knowledge across the continent. The organization also publishes a product called Open Maps for Europe, which provided a useful resource for sprint participants. For example, within the first day of the code sprint, the sprint participants had implemented an OGC API - Maps façade in front of a Web Map Service (WMS) that was serving maps from the Open Maps for Europe product.
Ordnance Survey (OS) is the National Mapping Agency of Great Britain. OS publishes printed and digital maps, as well as offering access to the maps and data through a variety of APIs. In September 2022, OS launched the OS NGD API suite of products that implement a number of OGC API Standards. The Web Mapping Code Sprint therefore provided an opportunity for OS to directly support the advancement and implementation of the OGC API Standards on which the new OS NGD API products are built. The code sprint also provided an opportunity for OS engineers to directly engage with the editors of the Standards. Such access to editors and SWG members greatly accelerates development of applications.
Several more OGC Code Sprints are planned for the year 2023. To keep up to date with the latest plans, please visit [https:]]
-
17:45
A recap of the 124th OGC Member Meeting, Singapore
on Open Geospatial Consortium (OGC). Tags: Member Meeting.
From October 3-7, 2022 (and a little bit beforehand), more than 100 experts from across industry, government, and academia converged on the Lifelong Learning Institute in Singapore (with 150 more joining virtually) to strengthen global collaboration and attend OGC's 124th Member Meeting. A big "thank you" goes out to our dedicated members who either attended in person or juggled lives across multiple timezones to attend virtually.
The meeting was sponsored by OGC Principal Member, Singapore Land Authority (SLA), and the Maritime and Port Authority of Singapore (MPA). The meeting carried the theme “Digital Twins: Land and Sea,” and was held in conjunction with the Singapore Geospatial Festival operated by GeoWorks.
Our second "return to in-person" meeting really drove home how much more powerful in-person meetings are compared to virtual ones, with a palpable energy across the many meetings, sessions, and events. Indeed, in the 8 years I've been the OGC Technical Committee Chair, I haven't seen a meeting that generated more votes to approve Standards than this most recent one in Singapore. It's great to be back in person!
Alongside the usual assortment of Standards Working Group (SWG) and Domain Working Group (DWG) meetings, the Member Meeting also saw several special sessions, such as: an Analysis Ready Data ad hoc; Digital Twins and the Marine Domain Special Session; the next Metaverse Special Session; an Urban Digital Twins Summit and ad hoc; a Land Administration Workshop; an OGC Start-ups and Scale-Up Special Session; and a meeting of the OGC Asia Forum. Throughout the various meetings and events, there was clear recognition that geospatial and “location” is everywhere – and not only that, but that it’s foundational to Digital Twins and related technologies.
Social events during the week included the usual welcome reception and ice-breaker on Monday evening, a networking dinner held at the delicious Majestic Bay Seafood Restaurant on Wednesday, and a Diversity Luncheon sponsored by OGC Principal Member AWS on Thursday.
Two areas of focus during the Member Meeting included OGC APIs and OGC’s ongoing work in Climate Resilience.
OGC APIs: The pace of release of our new OGC API Standards continues to increase. Multiple parts of OGC API – Features are now in final stages, OGC API – Tiles has its first part approved, and OGC API – Processes and OGC API – Environmental Data Retrieval have both been in use for over a year. All of these Standards are being developed and released with multiple implementations in the marketplace. Several other APIs are soon to be out for public comment and OGC is arriving at a model for documentation and governance of the building blocks upon which these APIs are constructed. The building block approach to developing implementations will be a new paradigm for Standardization – and one that we’re very excited about.
Climate Resilience: The Climate Resilience DWG is now active (and open to the public: join the mailing list here) and the Call For Participation in the Climate Resilience Pilot closes soon (don’t miss the Q&A Session on Nov 8 – join the mailing list for more info). These efforts will consider real-world use cases and investigate the role of location technology in addressing climate-related impacts on society. Numerous other OGC Working Groups also consider elements that contribute to climate science, so a forum in which these groups can collaborate will be very valuable.
A Joint Opening
The opening of the Member Meeting was held jointly with the Singapore Geospatial Festival, and included:
Welcoming remarks from Colin Low, Chief Executive of the Singapore Land Authority; Captain M. Segar, Assistant Chief Executive of the Maritime and Port Authority of Singapore; and Dr. Nadine Alameh, CEO of OGC.
Following this, Dr. Amy Khor, Senior Minister of State for the Ministry of Sustainability and the Environment & Transport of Singapore, provided an Opening Address that highlighted the use of geospatial information and technology as critical components that have enabled Singapore’s economy and quality of life.
Trevor Taylor, Senior Director, Member Success and Development at OGC, then revealed the next Phase of the Federated Marine Spatial Data Infrastructure Pilot to undertake an Innovation Challenge for integration of terrestrial, maritime, and cadastral geospatial information (more informally, “land-sea interfaces”) as demonstrated in the context of Singapore.
Videos of Nadine’s opening remarks on ‘What OGC does’ and my overview of ‘How OGC does it’ are publicly available via OGC’s YouTube channel. OGC Members can access the rest of the presentations and recordings on this page in the OGC Portal.
Special Sessions
Attendees of the 124th OGC Member Meeting, Singapore.
The Analysis Ready Data ad hoc session explored the interest among OGC members in developing a multi-part Standard to address the framework and domain-specific parameters for generating analysis-ready data from Earth Observations. The work is intended to be jointly undertaken with ISO/TC 211. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Digital Twins and the Marine Domain Special Session: Per the overall theme of this Member Meeting, the Marine Domain Working Group (DWG) held a half-day session on Digital Twins and the Marine Domain. The session began with a summary of the technological and policy environments in which marine digital twins must be framed, then included more detailed highlights of past and ongoing initiatives being operated under OGC’s Innovation Program. A Digital Twin Challenge is being proposed to be focused in Singapore. Participants then engaged in a discussion of future activities to be undertaken by the Marine DWG, including some in concert with partner organizations. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Metaverse Special Session: The Metaverse discussions in OGC are converging on a proposal to create a new DWG to address the various OGC activities in the Metaverse and coordinate with affiliated organizations, such as the Metaverse Standards Forum, of which OGC is a founding member. OGC members are already working actively in metaverse services now, such as were demonstrated in the session by Hexagon, Esri, and Cesium. Clearly, the Metaverse must include digital twins, and the OGC CDB Standard has relevant capabilities. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Urban Digital Twins Summit: The past two OGC Member Meetings have been building interest and focusing the landscape for further work on Urban Digital Twins. The Smart Cities DWG is rechartering to address the topic. The DWG held an Urban Digital Twins Summit to highlight perspectives from members’ organizations on what type of work is occurring or is forecast to happen in the domain as well as to identify where OGC can best contribute to common enablement of Urban Digital Twin infrastructure for a variety of use cases. The Summit was linked to further exploration in the 3DIM DWG and MUDDI SWG. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Land Administration Workshop: The Land Administration DWG hosted an Interoperability Workshop to detail real examples of implementations of the ISO Standards for Land Administration and to discuss the encodings and links necessary to enable replicable land administration practices. The two blocks of content provided presentations on implementations of the Standards followed by discussion of successes and remaining work. OGC Members can access the presentations and a recording on this page in the OGC Portal.
OGC Start-ups and Scale-Up Special Session: OGC continues to attract new start-up companies and those scaling to more significant market presence. Five OGC small business members highlighted their capabilities: ALTZ Technologies, i-bitz, Duality Robotics, XYZT, and KorrAI. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Most OGC Member Meetings include a session dedicated to the local regional forum. The OGC Asia Forum met on Friday with a wide variety of presentation topics from throughout the region. OGC Members can access the presentations and a recording on this page in the OGC Portal.
Today's Innovation, Tomorrow's Technology, and Future Directions
Due to its broad applicability, the popular Future Directions session runs unopposed on the schedule so that all meeting participants can attend. At this meeting, the session focused on Reference Architectures. Dr. Gobe Hobona of OGC introduced the session and was followed by interactive presentations to gauge member interest in advancing Reference Architectures.
Dr. Sam Meek (Helyx), Dr. Ingo Simonis (OGC), and Rob Atkinson (OGC) jointly presented the considerations made to date on establishing a Reference Architecture or references to architectural elements that can be used in Standards and Standards-based architectures. The topics included the following:
- Modernization of the OGC Reference Model (ORM) in the light of its original purpose and how a new ORM might be better used for assisting in architectural guidance. A discussion with members highlighted that OGC Standards fit within a larger Information Technology (IT) ecosystem and should not necessarily be described fully independent of that ecosystem.
- A brief summary of ongoing discussions in the OGC Architecture Board (OAB) about the future of the OGC Abstract Specification, specifically whether some Topics should be retired and new Topics added that underlie newer OGC Standards and reflect modern IT practice.
- The question was raised whether the ORM should document architectural elements and patterns or whether the ORM should drive architectural decisions in developing Standards and implementations. The ORM is forecast to become more prescriptive.
- Finally, Rob Atkinson discussed the practical approach to describing and relating architectural components via a knowledge graph. Such an approach is prototyped using OGC’s Definitions Server. Some further information about the OGC Definition Server is available here.
OGC Members can access the presentations and a recording on this page in the OGC Portal.
Closing Plenary
For this meeting, and from here on out, the Closing Plenary has been restructured to fall across two sessions: Important Things and the traditional Closing Plenary.
Important Things: this session started with a rapid, 15-minute summary of the entire meeting week by Scott Simmons. Slides and content from a large number of Working Group sessions were included. OGC Members can access the presentation on this page in the OGC Portal.
The Important Things session then featured a discussion around the question of “what is the difference between the metaverse and a digital twin?” There was a lively conversation amongst members to define each term and what was included in the scope of the term. The general consensus is that digital twins represent items that can be found in the real world (imagined or not) and that the metaverse is the environment in which actions might occur on or around digital twins. There were far more subtleties in the conversation than can be summarized in this blog post, but notes from the session are posted publicly on the OGC Member Meeting Topics GitHub Repo.
Closing Plenary: Two new Principal Members of OGC, the General Authority for Survey and Geospatial Information (GASGI) of Saudi Arabia and the National Department of Agriculture, Land Reform and Rural Development of South Africa, each presented their duties and use of Standards. These presentations were followed by reports from the Working Groups covering the outcomes of the last few days.
Thank you to our communityAll in all, our 124th Member Meeting was a big success. It was wonderful seeing members interacting, collaborating, and driving technology and standards development forward. It’s especially exciting as it comes at a time when geospatial is truly everywhere. Once again, thank you to our members for their time and energy, as well as their dedication to making OGC the world’s leading and most comprehensive community of location experts.
Be sure to join us for the 125th Member Meeting, happening late February, 2023. Registration and further info will be available soon on ogcmeet.org. Sponsorship opportunities are also available – contact us for more info. Subscribe to the OGC Update Newsletter to stay up to date on all things OGC including when registration goes live for our Member Meetings.
-
16:01
Has the Edge dissolved itself already? Or is the Edge the new Cloud?
on Open Geospatial Consortium (OGC). Tags: Edge computing, interoperability, vendor lock-in.
Visiting the Edge Computing World conference last week, I observed a number of interesting aspects that I would like to share here. First of all, the reason for my title is my observation that technical terms such as Edge or Cloud are used rather freely these days, almost to the point of becoming meaningless. Almost any aggregation node in a distributed architecture was called a "Micro Cloud" or "Cloud at the Edge."
In doing so, ignoring some key characteristics of the cloud apparently mattered little; scalability and elasticity first and foremost. From the perspective of the edge node, the aggregation node does not offer any characteristic cloud functions (the aggregation node is basically a black box, an aggregation and processing unit without any other properties). From the point of view of nodes located higher up in the hierarchy, the aggregation node also does not offer any characteristic cloud functions. Of course, additional edge nodes cannot simply be created; after all, we are usually talking about hardware here and not about virtual capacities such as additional virtual machines.
It makes little sense to reduce Edge and Cloud to such a small subset of their properties, as both terms then degenerate into empty shells rather than meaningful concepts. If one does so, however, the question arises as to whether Edge is already in the process of dissolution, or whether we need to redefine terms (although there is actually little reason for this in purely factual terms) to reflect the changed view of the various layers within the Edge-Cloud continuum. This raises the suspicion that business-motivated forces are at play.
Edge Interoperability? Or Vendor Lock-In?
This brings me to another aspect: interoperability. With traditional service providers now selling hardware (e.g., Amazon Snow) and traditional hardware providers selling services (e.g., Schneider Electric), the number of players on both sides increases. This bears the risk of reduced interoperability, as the new full-spectrum players sell the advantages of "homogeneous solutions" to their customers. As long as one remains within a single system, there will certainly be advantages, since hardware and software are coordinated with each other and can fall back on system-specific exchange mechanisms. However, vendor lock-in is inevitable when vendors define their own exchange formats, interfaces, and conceptual models. As soon as different systems have to be integrated, the development of corresponding bridges or transformations will be unavoidable.
With the growing number of systems and corresponding platforms (which are two other terms that are frequently used interchangeably), the number of platform-specific formats and interfaces increases – and interoperability suffers. What has been achieved in other domains, such as Earth Observation – where agreements on standardized interfaces and data models have boosted interoperability – is still somewhat new to the Edge community. The Edge community is in the very early stages of moving toward interoperable (perhaps even open) systems that significantly simplify the generation of complex workflows across system or platform boundaries – or enable them in the first place. Going beyond individual systems is unavoidable: multi-system workflows enable deep insights into domain-specific systems or environments and are necessary to holistically address the grand challenges of our century, such as our changing climate.
Sustainable solutions for the greater good
There is currently still a lot of money to be made with custom platforms. It remains to be seen to what extent these platforms will be suitable for addressing the major challenges of our century. Edge is in a gold-rush mood, and I don't begrudge anyone developing business successes from it. However, the world is extremely complex, and I doubt that this complexity can be sufficiently taken into account with current systems built for short depreciation cycles. I heard about examples of fish farms running fourteen parallel Edge systems to monitor the status of the farm. That is fourteen parallel dashboards. Other organizations report that they maintain over 100 software solutions to monitor the health status of manufacturing machines. Most of these are not interoperable, which results in additional costs as soon as several machines form a unit that needs to be monitored as a system. So, where do we stand with Edge? With over 270 trillion USD in projected revenue over the next 30 years (numbers McKinsey reported at the conference), climate change alone will produce many new unicorns. Let's hope that interoperability doesn't fall by the wayside – or that a sufficient number of these unicorns make their profit with sustainable solutions that contribute to the greater good.
-
14:59
Reflections on the 2022 Joint OGC & ISO Code Sprint - The Metadata Code Sprint
on Open Geospatial Consortium (OGC). Tags: OGC API, ogcapi, Sprint, STAC, OGC API - Records, JSON-FG, OGC API - Features, ISO.
Over the past two decades, standards such as ISO 19115:2003 and the OGC Catalog Services for the Web (CSW) have been integrated into several Spatial Data Infrastructure (SDI) initiatives at national and international levels. These standards leveraged the Extensible Markup Language (XML) which, at the time, was the primary encoding for data exchange in IT. In recent times, however, the uptake of JavaScript Object Notation (JSON) and Web Application Programming Interfaces (APIs) has necessitated the modernization of existing metadata and catalog approaches.
In November 2021, the Open Geospatial Consortium (OGC) and Technical Committee 211 (TC 211) of the International Organization for Standardization (ISO) held their first joint code sprint. The success of that first joint code sprint provided the foundation for a second joint code sprint, held September 14-16, 2022. The second joint code sprint, named the 2022 Joint OGC and ISO Code Sprint – The Metadata Code Sprint, served to accelerate the support of open geospatial standards that relate to geospatial metadata and catalogs. The code sprint was sponsored by Ordnance Survey (OS) at the Gold level and Geonovum at the Silver level. Unlike the first, virtual, code sprint, this sprint was held as a hybrid event, with the face-to-face element hosted at the Geovation Hub in London, United Kingdom.
The code sprint focused on the following group of specifications:
- OGC API - Records candidate Standard
- ISO 19115 metadata Standards (i.e., ISO 19115-1, ISO 19115-2, ISO 19115-3)
- OGC Features and Geometries JSON (JSON-FG) candidate Standard
- Spatio-Temporal Asset Catalog (STAC), which leverages the OGC API - Features Standard
The discussions during the code sprint covered topics such as harmonization of STAC and OGC API - Records; harvesting of metadata to populate instances of OGC API - Records; the possibility of a JSON-FG encoding for OGC API - Records and STAC; the possibility of a JSON encoding of ISO 19115; and others.
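To make the harmonization discussion concrete, the sketch below assembles a minimal STAC-style item as a plain Python dictionary. Both STAC Items and OGC API - Records records are GeoJSON features at heart, which is what makes alignment between the two plausible; all identifiers and URLs here are invented for illustration.

    # A minimal STAC-style item expressed as a GeoJSON Feature (illustrative values only).
    import json

    item = {
        "type": "Feature",
        "stac_version": "1.0.0",
        "id": "example-scene-001",                     # hypothetical identifier
        "geometry": {
            "type": "Polygon",
            "coordinates": [[[5.0, 52.0], [5.5, 52.0], [5.5, 52.4], [5.0, 52.4], [5.0, 52.0]]],
        },
        "bbox": [5.0, 52.0, 5.5, 52.4],
        "properties": {"datetime": "2022-09-14T10:30:00Z"},
        "assets": {
            "visual": {
                "href": "https://example.org/data/example-scene-001.tif",   # placeholder asset
                "type": "image/tiff; application=geotiff; profile=cloud-optimized",
            }
        },
        "links": [],
    }

    print(json.dumps(item, indent=2))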
The demonstrations showcased at the end of the code sprint included client-side and server-side implementations of OGC API - Records, JSON-FG, STAC, and ISO 19115 metadata. A high-level overview of the sprint architecture is shown below.
A high-level overview of the architecture of the OGC ISO 2022 Metadata Code Sprint.
The code sprint successfully facilitated the development and testing of prototype implementations of OGC and ISO Standards that relate to geospatial metadata and catalogs. The code sprint also enabled the participating developers to provide feedback to the editors of candidate standards. The code sprint therefore met all of its objectives and achieved its goal of accelerating the support of open geospatial standards that relate to geospatial metadata and catalogs.
The sprint participants made the following recommendations for future innovation work items:
- Initiatives to facilitate implementation of JSON-FG (e.g. three-dimensional (3D) data, cadastral data, etc.)
- Initiatives to facilitate implementation of catalogs
- Prototyping of tools for creating metadata (e.g. the automated STAC metadata crawler and dataset tagger demonstrated during the sprint)
The sprint participants also made the following recommendations for activities that the Standards Working Groups should consider:
- Outreach for promoting JSON-FG
- Code Sprint for designing profiles of JSON-FG for different communities of interest
- Documentation of the different roles of catalogs and API, as well as guidance on when to use them
- Code Sprint on versioning, possibly involving both OGC API - Records and OGC API - Features
- Exploring how to move GeoDCAT forward within OGC
This was the first hybrid code sprint (consisting of both in-person and remote elements) organized by OGC in more than two years, due to the pandemic. A record number of participants registered to attend the code sprint, exceeding pre-pandemic registration numbers. There were, however, more remote participants than in-person attendees. This suggests that there continues to be significant interest in code sprints, and that the online collaboration environment that OGC uses in code sprints should continue to be used post-pandemic.
From OGC, ISO, and Sprint Sponsors Ordnance Survey and Geonovum, we send a big thank you to everyone that participated. We look forward to seeing you at the next code sprint, the Web Mapping Code Sprint, on November 29th to December 1st, 2022, in Brussels, Belgium.
Further information on the Sprint is available on the Metadata Code Sprint wiki.
-
18:44
Bringing the Heat in Madrid: a recap of our 123rd Member Meeting
on Open Geospatial Consortium (OGC). Tags: Member Meeting.
OGC's 123rd Member Meeting – our long-awaited return to in-person(!) – was held in Madrid, Spain, from June 13-16, 2022. And even with heatwave temperatures approaching 40°C (104°F), it was truly wonderful to be there among our members and broader community once again. We're all grateful for the connections that teleconferencing and video calls were able to maintain during the pandemic lockdowns, but I don't think I'm alone when I say that they're no substitute for seeing people in real life.
The meeting was sponsored by the EU Satellite Center (SatCen) and recognized the 30th anniversary of SatCen as an organization. More than 150 people attended the OGC Member Meeting in person, with another 200+ virtual. Attendees included key standards leaders and regional experts from industry, academia, and government.
The Member Meeting featured the usual assortment of Standards Working Group (SWG) and Domain Working Group (DWG) meetings, as well as special sessions, social events, and all the impromptu conversations, break-aways, sight-seeing, and general interaction that comes with (finally) being in-person again.
The social events included a lively networking reception and ice-breaker on Monday evening, an inspiring Women in Geospatial Breakfast on Wednesday morning, and the OGC Member and VIP Dinner, held at the delicious Restaurante Amicis La Terraza in central Madrid - with a most welcome large courtyard to stay cool(ish..) as the hot day turned into a warm evening.
Two strong areas of focus during the Member Meeting included The Metaverse/Digital Twins and Marine.
Metaverse & Digital Twins: These topics are intrinsically related, and during the Member Meeting there was a Metaverse Special Session (see below) as well as a Digital Twins Coordination Session on top of the Digital Twins DWG meeting. OGC will continue its strong focus on the Metaverse & Digital Twins – especially with an Urban Digital Twins Summit in Singapore at the 124th OGC Member Meeting in October. Industry alignment around core elements of the Metaverse continues, and OGC is an important part of these coordination activities. Expect to see linkages to other work in OGC and other organizations (such as the newly launched Metaverse Standards Forum, of which we are a founding member) as we identify the next steps to achieve interoperability in the Metaverse.
Marine: OGC’s Federated Marine Spatial Data Infrastructure Pilot continues to be successful and to add new phases of work with additional sponsors and topics. The Marine DWG is coordinating closely with the International Hydrographic Organisation (IHO) and the United Nations Working Group on Marine Geospatial Information to develop partnerships to test against real-world use-cases in this pilot and other projects. The next Member Meeting will include a special session or workshop on marine data integration.
The Kick Off
OGC CEO Dr. Nadine Alameh presenting at the Opening Plenary.
To start the week, the Kick-Off Session opened with a welcome from OGC's CEO, Dr. Nadine Alameh. Nadine provided "OGC by the numbers" to highlight the Consortium's success over the past few years and mapped out where we will be going. She also encouraged engagement from the meeting participants, a large number of whom were first-time attendees.
Next up, a keynote presentation from Lucio Colaiacomo of SatCen highlighted SatCen’s work and use of OGC Standards – as well as celebrating 30 years of activity.
Following this, our regular “fireside chat” featured Dr. Sofie Haesevoets, Senior Product Manager at OGC Principal Member, Hexagon, and Dr. Nadine Alameh highlighting Hexagon’s use of open Standards and the business value gained from their use.
Trevor Taylor, OGC Senior Director of Member Success and Development, welcomed the newest members in OGC and gave an overview of the success in recruiting and retaining members in a difficult year.
Finally, Dr. Ingo Simonis, OGC Chief Technology Innovation Officer, provided a tour of cloud-native innovation activities that have been – and continue to be – performed by OGC and its members under a number of Innovation Initiatives. Ingo described how OGC is now working on Function as a Service (FaaS) to cap all of the other service-centric cloud businesses. OGC Members can access Ingo’s presentation on the OGC Portal here. If you haven't already, you should also check out OGC Chief Standards Officer Scott Simmons' recent blog that provides an overview of The Latest on Cloud-Native Geospatial Standards in OGC.
Special Sessions
OGC Chief Technology Innovation Officer Dr. Ingo Simonis discusses some of the benefits of Cloud-Native Geospatial.
The week also saw several Special Sessions, including The Metaverse is Geospatial, Startups, Integrated Technologies for Climate Resilience, a Digital Twins coordination session, a Developer Workshop, and both the Europe Forum and the Iberian and Latin American Forum. Let's go through each.
The Metaverse is Geospatial: It is impossible to represent a virtual rendition of our (or any) world without considering the importance of location. Those location/geospatial components include digital twins and semantic relationships between modeled objects and other data. The session offered presentations on the digital twin aspect of the linkage offered by geosemantics and continued OGC’s push toward anchoring the real world in the virtual. Some of the session presentations and a recording are available to OGC Members on the OGC Portal here. For more about the relationship between OGC, the Metaverse, and broader geospatial, be sure to check out our recent blog post with the same name: The Metaverse is Geospatial.
OGC Startup members are helping in “Setting the Standards.” Seven OGC small business members highlighted some of their work and were joined by industry veterans from HuaWei, SpaceTec Partners, the Location Based Marketing Association, and SparkGeo. OGC Members can access the presentations from the session on the OGC Portal here.
Climate Resilience: OGC is preparing to launch a Climate Resilience Domain Working Group (DWG). Dr. Nils Hempelmann of OGC brought together six speakers to discuss topics such as the workflow chain from the provenance of climate data to creating decision-ready data, and the use of artificial intelligence. OGC Members can access the presentations from the session on the OGC Portal here.
Digital Twins Coordination Session: OGC has a number of working groups discussing digital twins for their specific domains. The general landscape of digital twins was discussed in the Future Directions session (see below), so the Digital Twins Coordination Session focused on urban digital twins and the coordination necessary between working groups to work toward a common set of practices and Standards. OGC’s Dr. Josh Lieberman introduced the participants to the Urban Digital Twins Location Powers event outcomes and then each relevant working group presented their aspects of digital twin conception and modeling. The meeting concluded with plans to hold a more substantive summit in October at the next OGC Member Meeting. OGC Members can access the presentations from the session on the OGC Portal here.
OGC now operates a Developer Workshop at each Member Meeting. In Madrid, the workshop was held on Friday at the offices of OGC Technical Member Carto. The theme of this workshop was "Cloud-Native Geospatial," with tutorials and discussions on GeoParquet, Earth Observation application packages, STAC, COG, and more. Details on the program are available on OGC's GitHub.
The Europe Forum held a session with a diverse assembly of speakers addressing topics including the OGC Disaster Pilot, the UK Geospatial Standards Register, imagery-derived digital twins, and space technology. OGC Members can access the presentations from the session on the OGC Portal here.
The Iberian and Latin American Forum met on Friday with a program focused on the interoperability enabled by the OGC API Standards. The majority of the meeting was held in Spanish to accommodate local participants. OGC Members can access the presentations from the session on the OGC Portal here. The session is also available to view on YouTube.
Today’s Innovation, Tomorrow’s Technologies, and Future Directions
OGC's Director of Product Management, Standards, Dr. Gobe Hobona opens the always fascinating Future Directions session.
Tuesday morning's Future Directions session ran unopposed on the schedule so that all meeting participants could attend. This meeting's session focused on "Digital Twins of the Environment." Dr. Gobe Hobona of OGC introduced the session and was followed by several speakers and a panel.
Dr. Ingo Simonis and Rob Atkinson of OGC described reference architecture aspects of digital twins. Understanding that reference architectures quickly lose value if they become stale, the speakers previewed some concepts of an underlying knowledge base from which architectures could be defined and described.
Louis-Martin Losier of OGC Technical Member Bentley Systems highlighted recent success stories around the integration of built and natural environment digital twins in a common platform. These examples included water management, geothermal power optimization, and air pollution modeling.
James Carey of OGC Strategic Member the UK Hydrographic Office presented the use of hydrographic data to represent digital twins of the ocean with examples showing how data are integrated for offshore wind farm development and the need for detailed data to more accurately model storm effects on tides.
A panel of the presenters then discussed questions from meeting participants and elaborated on some of the topics covered by other presenters.
Following the panel were three more presentations:
Piotr Zaborowski of OGC and Arne Berre of SINTEF described the ILIAD project, which is developing a model for digital twins for the ocean, with sector-specific local twins. OGC is a partner in this project, which builds upon other European Initiatives to develop digital twin architectures for the ocean.
Rashmit Singh Sukhmani of OGC Member SatSure highlighted his company’s use of remote sensed data and artificial intelligence in processing data to develop soil moisture models at high-resolution for use in agriculture, wildland fire mitigation, and more.
Tien Yen Chou of OGC Principal Member Feng Chia University presented “The Present and Future of Digital Twins in the Lifecycle Management of Disaster Early Warning” illustrating the benefit of integrating digital twins of the landscape with real-time sensor data as a practical means to mitigate the effects of disasters.
OGC Members can access the presentations from the session on the OGC Portal here.
Closing Plenary
Closing out (most of) the Member Meeting was the Closing Plenary, which began with a Keynote presentation from Philippe Cases, the CEO of OGC Associate Member Topio Networks. Philippe shared Topio's research on the scope and scale of the geospatial landscape, factoring in the work of more than 600 companies and a market that is projected to grow in value from USD 59.5 billion in 2021 to USD 209 billion in 2027. He highlighted four trends of particular interest: the explosion of data, the emergence of digital terrains, the explosion of applications/platforms, and making the geospatial ecosystem FAIR (Findable, Accessible, Interoperable, and Reusable). Don't forget to check out Topio Networks' Location Information News Aggregator hosted on ogc.org, which is designed to help decision-makers, industry analysts, and technology developers keep up with the accelerating pace of innovation.
A second Keynote was provided by Asif Khan, the founder of the Location Based Marketing Association and GroundLevel Insights. Asif illustrated the importance of location in business through several examples and videos. He highlighted interesting business opportunities offered by improved real-time location information and artificial intelligence in image recognition.
All in all, our return to an in-person Member Meeting was a wild success, showcasing the state of the location industry, highlighting the latest innovations coming from OGC, conducting important Standardization work, and creating and strengthening connections across the diverse OGC community.
Be sure to join us for the 124th Member Meeting, happening October 3-7, 2022, in Singapore with the theme “Digital Twins: Land & Sea.” Registration is open now.
Sponsorship opportunities are still available.
Attendees of the OGC Member and VIP Dinner enjoying the courtyard of Restaurante Amicis.
-
15:40
The Latest on Cloud-Native Geospatial Standards in OGC
on Open Geospatial Consortium (OGC). Tags: cloud-native geospatial, Member Meeting, OGC API, ogcapi.
I thought it would be valuable to share an update on cloud-native geospatial activities in OGC, especially in light of our recent, very successful Cloud-Native Outreach Event. This blog follows up on the vision shared by OGC's CEO, Dr. Nadine Alameh, in April 2022 and two posts by OGC's Visiting Fellow, Chris Holmes: Towards a Cloud-Native OGC and Towards a Cloud-Native Geospatial Standards Baseline.
For many years, OGC has been working on numerous aspects of the entire ecosystem of location data in cloud environments. Starting with Testbed 10 in 2013, OGC has been publishing engineering guidance on cloud topics, such as the Testbed 10 Performance of OGC® Services in the Cloud: The WMS, WMTS, and WPS cases. From those earliest efforts, OGC members have recognized that our approach to enabling cloud-native geospatial capabilities must be inclusive of this whole ecosystem: formats, services, architectures, and operations. I summarized this perspective at the Outreach Event discussing Advances in OGC Cloud-Native Geospatial Activities and will further elaborate in this blog post.
The cloud ecosystem is more than just the platform in which the data lives and is operated upon. It also includes: the algorithms to process information; interfaces for both humans and machines; formats to store and retrieve information; the security regime for content and access; business operations and revenue models to sustain the environments; regulatory oversight, which may impact what enters or leaves the cloud; and much, much more. "Ecosystem" is truly the correct term, as you can imagine an almost 1-for-1 analogue from the cloud to a natural ecosystem.
Building an Ecosystem
The remainder of this blog digs into the elements of the ecosystem that OGC is addressing: interfaces, applications, encodings, and operations.
To start, we really cannot talk about geospatial in the cloud without also talking about the web: it is through web resources that so many users interact with cloud-hosted data and functions. OGC and the World Wide Web Consortium (W3C) collaborated in 2017 to publish the Spatial Data on the Web Best Practices as a means to illustrate how to make geospatial information more web-native. Web-native makes cloud-native more approachable. It is not enough to store data in the cloud in formats that improve access and analysis performance: we also need to develop APIs to discover, process, and extract information from the cloud and guide users to be able to work across cloud instances hosted by multiple providers. The impact of web-centric Standards modernization in OGC on enabling the cloud ecosystem cannot be overstated.
These APIs include OGC API - Features, foundational to accessing feature (vector) data as well as underpinning the STAC API specification, used for rapid discovery of remote sensing and other data. Extending the catalog paradigm, OGC API - Records allows discovery and access to all types of geospatial data as detailed as the record level. The architecture of these APIs allows developers to implement “just enough geo” to get to the data they need without becoming geospatial experts.
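To illustrate the "just enough geo" idea, the sketch below pages through features from a hypothetical OGC API - Features endpoint using nothing more than HTTP and JSON; the server URL, collection id, and bounding box are placeholders.

    # Fetch a page of GeoJSON features from a hypothetical OGC API - Features endpoint.
    import requests

    BASE = "https://example.org/ogcapi"               # placeholder server
    url = f"{BASE}/collections/observations/items"    # hypothetical collection

    resp = requests.get(
        url,
        params={"limit": 100, "bbox": "5.0,52.0,5.5,52.4"},
        headers={"Accept": "application/geo+json"},
    )
    resp.raise_for_status()

    for feature in resp.json().get("features", []):
        print(feature.get("id"), feature.get("properties"))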
Many people identify the key use case for cloud-native capabilities to be the handling of massive data cubes, be those stacks of imagery or multidimensional scientific data sets. But just because you can store all of your data on the cloud does not mean that you want to use all of the data all of the time. OGC API - Environmental Data Retrieval (EDR) allows for complex subsetting of data cubes to return (or point to) just what is needed.
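A typical subsetting call against a hypothetical OGC API - EDR endpoint might look like the sketch below: a "cube" query that trims a data cube by bounding box, vertical level, time window, and parameter name, so only the needed slice leaves the cloud. All names and values are placeholders.

    # Request a spatio-temporal subset from a hypothetical OGC API - EDR "cube" endpoint.
    import requests

    BASE = "https://example.org/edr"                          # placeholder server
    url = f"{BASE}/collections/global-temperature/cube"       # hypothetical collection

    params = {
        "bbox": "-10,35,5,45",                                # longitude/latitude subset
        "z": "850",                                           # single vertical level
        "datetime": "2022-06-13T00:00:00Z/2022-06-16T00:00:00Z",
        "parameter-name": "air_temperature",
        "f": "CoverageJSON",
    }

    resp = requests.get(url, params=params)
    resp.raise_for_status()
    subset = resp.json()                                      # CoverageJSON document
    print(subset.get("type"))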
Do you need to fuse Internet of Things sensors with your massive content holdings? Leverage the OGC SensorThings API Standards. Consider that the combination of disparate data sources and dynamic sensors typically needs some degree of processing to extract useful information, so implement OGC API - Processes to work between and within multiple data sets and feeds.
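The SensorThings API exposes sensors and their observations through an OData-style query syntax, so fusing live feeds with archived holdings starts with queries like the hedged sketch below; the service URL and filter values are illustrative only.

    # Query recent observations from a hypothetical OGC SensorThings API service.
    import requests

    BASE = "https://example.org/sta/v1.1"        # placeholder SensorThings endpoint

    params = {
        "$filter": "phenomenonTime gt 2022-10-01T00:00:00Z",
        "$orderby": "phenomenonTime desc",
        "$top": 10,
    }

    resp = requests.get(f"{BASE}/Observations", params=params)
    resp.raise_for_status()

    for obs in resp.json().get("value", []):
        print(obs["phenomenonTime"], obs["result"])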
Processing comes in many models, but highly important these days is the use of Artificial Intelligence to distill vast quantities of data into useful information. OGC’s Artificial Intelligence in Geoinformatics (GeoAI) Domain Working Group is tackling some of the use cases and identifying targets for interoperability and even Standardization for information flow and quality. For example, the characterization of training and validation data used in GeoAI is now being standardized in the Training Data Markup Language for AI Standards Working Group. As part of this ecosystem, highly-automated data processing and analysis brings extraordinary benefits from cloud-native geospatial data.
The formats are also critically important. I referenced a couple of blogs from Chris Holmes at the top of this post that provide excellent descriptions of several cloud-native encodings in wide (or soon to be wide) use. Understand that it is not just the structure of these encodings that makes them “cloud-native,” but also the means by which the data are accessed (usually web-native, i.e., HTTP). Thus, many OGC-Standard encodings, such as GeoPackage, can be cloud-native. Below, I highlight several formats that are currently maturing in OGC.
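At the byte level, that access pattern is simply an HTTP range request: the client asks for only the portion of the file it needs. A minimal sketch (with an illustrative URL and byte range) follows.

```python
import requests

# Fetch only the first 16 KiB of a (hypothetical) remote file - enough, for a
# well-organized format, to read the header and decide which bytes to ask for next.
resp = requests.get(
    "https://example.org/data/scene.tif",
    headers={"Range": "bytes=0-16383"},
)
print(resp.status_code, len(resp.content))  # 206 Partial Content if ranges are supported
```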
OGC standardized GeoTIFF in 2019 and has since been working to standardize Cloud Optimized GeoTIFF (COG) for the management of raster data. Starting with the COG library, OGC has been documenting the format as a formal Standard and is nearing completion of this work. A draft specification is available as the OGC Testbed-17: Cloud Optimized GeoTIFF specification Engineering Report; the Standard won’t be too far behind.
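The practical payoff is that a client can read just one window of a very large GeoTIFF without downloading the whole file. The sketch below assumes rasterio (backed by GDAL’s remote-file support) and an illustrative URL.

```python
import rasterio
from rasterio.windows import Window

url = "https://example.org/data/scene.tif"  # hypothetical Cloud Optimized GeoTIFF

with rasterio.open(url) as src:
    # Only the internal tiles covering this 512x512 window (plus the header)
    # are fetched over HTTP, not the entire image.
    block = src.read(1, window=Window(0, 0, 512, 512))
    print(block.shape, src.crs)
```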
More complex multidimensional data has proven to be efficiently encoded in the cloud using Zarr. Zarr is also in the final voting for endorsement as an OGC Community Standard. OGC’s most recently completed Testbed evaluated the suitability of Zarr for handling geospatial data cubes in the OGC Testbed 17: COG/Zarr Evaluation Engineering Report, and Zarr did just fine… as did COG.
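In Python terms, working with a cloud-hosted Zarr store typically looks like the hedged sketch below; the store path is illustrative, and xarray with zarr plus an fsspec backend (such as s3fs) is assumed.

```python
import xarray as xr

# Lazily open a (hypothetical) Zarr data cube in object storage; no data is
# downloaded yet, only the metadata describing chunks and coordinates.
ds = xr.open_zarr("s3://example-bucket/climate.zarr",
                  storage_options={"anon": True})

# Only the chunks covering this selection are fetched when the result is computed.
monthly_mean = ds["air_temperature"].sel(time="2022-01").mean(dim="time")
print(monthly_mean)
```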
Feature (vector) data is already handled on the cloud in all types of databases that rely upon OGC Simple Features, OGC’s most widely-implemented Standard, to encode the geometry. But is this management really cloud-native, particularly with respect to streaming the data to users? Other encodings are being considered. GeoParquet is currently being incubated in OGC as a prospective cloud-native vector format. Other formats, such as FlatGeobuf, are also being considered as potential Community Standards, to join existing Standards such as Indexed 3D Scene Layers and 3D Tiles, both of which offer cloud-native capabilities, particularly with delivery of data.
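As a hedged sketch of what GeoParquet offers on the client side, the snippet below reads a hypothetical cloud-hosted GeoParquet file into a GeoDataFrame; geopandas with pyarrow is assumed, and remote reads additionally assume an fsspec-compatible filesystem.

```python
import geopandas as gpd

# Read a (hypothetical) GeoParquet file; geometry and attributes arrive as
# columnar data, which is what makes the format attractive for analytics.
gdf = gpd.read_parquet("https://example.org/data/parcels.parquet")
print(len(gdf), gdf.crs)
print(gdf.geometry.total_bounds)  # minx, miny, maxx, maxy
```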
Putting it together in the real world
You have read this far and have seen a whole bunch of references to individual Standards and specifications that address specific parts of the cloud-native geospatial ecosystem. Putting it all together requires practical application of these technologies, Standards, and specs in concert. Operation of the cloud ecosystem requires coordination of many disciplines and sometimes new architecture designs relative to our past use of monolithic systems (such as microservices and highly-composable systems). This is where the other half of the OGC is so critical. The OGC Innovation Program operates numerous initiatives each year to experiment with or pilot the capabilities listed above against real-world scenarios and deliver documentation and examples that can be re-used for many use cases.
A search of “cloud” in the Engineering Report repository returns reference to 20 documents, each highlighting practical application of the capabilities highlighted above and more. These documents can be put in the context of the cloud-native ecosystem as illustrated below.
As you can see, the Innovation Program initiatives have touched upon many aspects of the cloud ecosystem, including some only peripherally related to location technology. These Engineering Reports reference even more work of relevance and identify specific practices that are portable across many use cases. I also recommend the recent OGC Best Practice for Earth Observation Application Package, which details packaging and deployment of Earth Observation Exploitation Platforms, generally to a cloud environment.
Development and Maturation
In summary, I’ve touched upon a lot of Standards and resources, and there are many more in the OGC and through our partner organizations. Each, literally EACH, of these efforts requires considerable investment in time and resources. The dedication of OGC Members to advance this work is becoming increasingly represented in the cloud ecosystem. The fact that so many major cloud service providers (e.g., AWS, Google, Microsoft, Oracle) are OGC Members highlights the relevance of OGC’s efforts in this domain.
The Standards are being matured and we have expert guidance on deployment and management of the capabilities. Expect to see dedicated developer and implementer resources from the OGC to foster consistent use of geospatial content in cloud ecosystems. We will continue to research best practices, publish guidance, and identify capabilities offered by our members to sustain the entire location industry.
To see this work evolve in real time, be sure to attend the upcoming 123rd OGC Member Meeting in Madrid, Spain. More specifically, join our Developer Workshop on Friday, 17 June, 2022 hosted by Carto in downtown Madrid to code against the encodings, APIs, and more with experts from OGC and our member organizations.
-
13:48
The 2022 Joint OGC OSGeo ASF Code Sprint - How it went!
sur Open Geospatial Consortium (OGC)Tags: ogcapi, OGC API, Sprint, OSGeo, ASFOver the past decade, geospatial technologies and data have become more widespread in use and application. A key catalyst for the increased uptake of geospatial technologies is the interoperability achieved through the implementation of Open Standards. Another important catalyst for this increased uptake is the availability of Open Source software products that are able to extract, transform, analyze, and disseminate geospatial data.
Back in February 2021, the Open Geospatial Consortium (OGC), the Apache Software Foundation (ASF), and the Open Source Geospatial Foundation (OSGeo) held their first joint Open Source Software and Open Standards Code Sprint (for full technical outcomes, see the Joint OGC OSGeo ASF Code Sprint 2021 Summary Engineering Report).
The success of that first joint code sprint provided the foundation for a second joint code sprint in March this year. The 2022 Joint OGC OSGeo ASF Code Sprint, conducted between March 8-10, had the goal of accelerating the support of open geospatial standards within the developer community.
Part of the motivation for holding the code sprint in 2022 was the growing uptake of location information across the global developer communities. The code sprint brought together developers of Open Standards, Open Source Software, and Proprietary Software. The code sprint therefore provided a rare opportunity for developers across these communities to focus on common challenges, within a short space of time, and in a shared collaborative environment.
The 2022 Joint Code Sprint introduced several changes not seen during the 2021 Joint Code Sprint:
First, Discord was used to aid in collaboration. Discord allowed both chat and video communications to be offered from within the same environment. Discord also supported the creation of multiple chat channels, thereby making it possible for separate projects to have their own dedicated chat channels. Included in these channels was a dedicated chat channel for the event sponsor, OGC Strategic Member Ordnance Survey, which made it possible for sprint participants to visit the channel and ask about the sponsor’s products.
Second, the code sprint offered Mentor Streams that presented tutorials for developers who were new to using featured standards or software products.
Over a period of 3 days, the sprint participants collaborated on a variety of coding and documentation tasks, and held discussions to facilitate coordination. The sprint participants made the following recommendations for future innovation work items:
- Prototypes of catalogs that can be crawled by an application. While there are currently several searchable catalogs, no catalogs can yet be crawled by applications.
- More specification validation work for OGC API Records.
- More experiments for the Workflows extension of OGC API Processes. This could try out a variety of workflow approaches.
- Experimentation on how a processing server can interact properly with other OGC API implementations that serve data. For example, in this code sprint there was an implementation of OGC API Processes (ZOO Project) that interacted with an OGC API Features implementation (MapServer).
- Experimentation with OGC’s geoparquet candidate standard and Apache Arrow.
The sprint participants also made the following recommendations for things that the Standards Working Groups should consider:
- To improve examples and documentation related to OGC API Records.
- To advance the development of the Executable Test Suites of OGC API Processes, OGC API Tiles, and OGC API Coverages.
The code sprint facilitated the development and testing of prototype implementations of OGC standards, including implementations of draft OGC API standards. It also enabled the participating developers to provide feedback to the editors of OGC standards. Furthermore, the code sprint provided a collaborative environment for OSGeo and ASF developers to fix open issues in products, develop new features, improve documentation, improve interoperability with other libraries/products, and develop prototype implementations of OGC standards. The code sprint therefore met all of its objectives and achieved its goal of accelerating the support of open geospatial standards within the developer community.
Keep your eye out for the forthcoming Joint OGC OSGeo ASF Code Sprint 2022 Summary Engineering Report that will document the technical achievements of the code sprint, available July ‘22.
For information on OGC Sprints, including outcomes and Engineering Reports of previous Sprints, as well as info on future Sprints, visit the OGC Sprints webpage.
-
17:02
The Metaverse is Geospatial
sur Open Geospatial Consortium (OGC)Tags: metaverseA version of this article originally appeared in the Winter 2021 issue of GeoConnexion International Magazine.
With momentum and interest once again building around the ‘metaverse’, OGC hosted a ‘Metaverse Ad-Hoc Session’ at its virtual 121st Member Meeting in December 2021. The session saw speakers from across industry - from photogrammetry and AI-enhanced semantic remote sensing companies to geospatial, BIM, and gaming software companies - discuss how geospatial tech will inform the metaverse, how the metaverse will transform geospatial, and why open standards will be critical for the metaverse’s success.
What’s a metaverse, anyway?
But before we get too far, what even is the metaverse? I asked Patrick Cozzi, CEO of (OGC Member) Cesium, co-host of the Building The Open Metaverse podcast, and panellist at the Metaverse ad-hoc.
“Ask ten different people, you’ll get ten different answers, but what most folks are agreeing on is that the metaverse is a progression of the internet to go from something that’s 2D to fully immersive 3D,” said Patrick Cozzi. “You’ll also hear definitions around it being a persistent, virtual world that allows collaboration of any sense, from gaming, to enterprise, to DoD [Department of Defense] cases. I think it’s a super exciting time to be in geospatial as this all comes into one place.”
This lines up with the definition put forward by venture capitalist Matthew Ball, who has written extensively on the subject of the metaverse in his Metaverse Primer:
“The Metaverse is a massively scaled and interoperable network of real-time rendered 3D virtual worlds which can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communications, and payments.”
Panelists at the recent OGC Metaverse Ad-Hoc Session
But what will the metaverse look like to the end-user? First of all, virtual/augmented reality hardware won’t be mandatory: just like the internet, it will adapt to the device accessing it, whether it be 2D, 3D, small screen, big screen, headset, etc. Also like the internet, the metaverse will comprise many different interconnected 3D ‘spaces’ (like 3D websites) operated by different entities that together form the much larger metaverse concept.
Metaverse spaces will include those forming completely fabricated virtual worlds as well as those that are modelled after, or augment, the real world. Metaverse spaces will be interconnected, with users being able to cross between them, whether it’s to visit a friend, play a game, go shopping, manage a construction project, train for a new job, model a new warehouse workflow, or something else entirely.
Users may also be able to extend and affect the real world with actions and items being able to move between both. For example, items purchased or earned in a shop on a virtual High Street in the metaverse could be redeemable at its real-world counterpart, or buttons pressed in the metaverse could actuate machines or objects in the real world.
Those metaverse experiences representing the real world are the most obvious place where geospatial technologies, standards, knowledge, and best practices will play a major role. However, every metaverse space will be a massive database of physical and semantic environments that needs to be designed for efficient streaming. A metaverse space, then, can be considered an iteration of the geospatial industry’s city- or state-wide ‘digital twin’ technologies in use today for modelling & simulation, citizen engagement, and more. As such, just about any 3D Geospatial Standard will be useful in building the metaverse.
Also worth noting is that the laws of geography that underpin geospatial technologies will also apply to entirely virtual worlds: users will want maps to navigate and make sense of virtual spaces just as they do the real world. As an industry, geospatial clearly has much expertise to contribute to the creation of the metaverse.
Geospatial will be transformed by the metaverse
Despite the tropes, you won’t need a VR headset to enjoy the metaverse - it will adapt to the device accessing it.
The metaverse is the internet transformed by real-time 3D technologies, but the impact of real-time 3D is also transforming geospatial. The blurring of the lines between ‘real world’ digital twins and virtual metaverse spaces is exemplified by the integration of geospatial data into game engines, which enable the rendering of photo-realistic 3D scenes in real-time using consumer hardware.
“Game engines are really changing the game for GIS,” said Marc Petit, VP, General Manager, Unreal Engine at Epic Games, and co-host of the Building The Open Metaverse podcast, during the OGC Metaverse Ad-Hoc Session. “I think these [real-time 3D] technologies are really enabling for GIS, and the science of ‘knowing where things are’ is going to be hugely important in the metaverse.”
Philip Mielke, 3D Web Experience Product Manager at Esri, shared a similar sentiment: “We have about 4 or 5 years until the practice of GIS is fundamentally transformed by this convergence of technologies, capabilities, and expectations… We at Esri are investing a lot in game engines so that we can transmit services for consumption in [the gaming engines] Unreal and Unity.”
The sentiment that there is now a convergence of geospatial with the immersive 3D experiences of the metaverse was also echoed by Rob Clout, Sales Manager at 3D Photogrammetry company, Aerometrex, during his presentation at the Metaverse Ad-Hoc: “3D photogrammetry has become a staple input for a huge range of industries. Whether it be BIM, AEC, virtual production, or gaming, we’re starting to see 3D data really becoming prevalent pretty much everywhere.
“So, the metaverse was really just the next step for Aerometrex: it’s a culmination of what we’ve all been doing up to this point. What we’re seeing [at Aerometrex] is that the same data that’s being used for construction of the real world is now being used for construction of the virtual world. That integration of the real world and the virtual world is key: the metaverse can’t be two completely separate things.”
The integration of Geospatial data and Game Engines - in this case Cesium’s support of Epic Games’ Unreal Engine - is a crucial stepping stone toward the metaverse.
As an example of the benefits of this convergence, Patrick Cozzi discussed his experience when Cesium enabled a link between their 3D geospatial streaming platform and Epic Games’ immensely popular game engine, Unreal Engine.
“Something magical happened when we built this bridge to Unreal Engine, because I feel that we made ten years’ progress overnight. I feel like suddenly the decades of investment in games technology was unlocked for geospatial, and then likewise, all of this 3D geospatial data became available to the game technology. And that’s just one example of how when we make these open and interoperable ecosystems, we can move the field forward as fast as possible.”
Indeed, if the metaverse is all about diverse 3D experiences interoperating to form a cohesive whole, open standards and knowledge will be absolutely fundamental to its creation - just as there would be no functioning Internet without open standards, there can be no functioning metaverse without them, either.
Open Standards will underpin the metaverse
Innovation surrounding the metaverse, just like in other information technologies, will move quickly. The standards that will gain traction when building the metaverse will be the ones that can move with the pace of its innovation. OGC’s new standards development ethos, as seen in our OGC APIs, builds open standards that are modular, lightweight, and extensible - allowing them to evolve alongside technology without breaking, while providing a stable baseline upon which lasting innovations can be built.
However, being a novel technology, many of the standards that will solve problems in the metaverse won’t exist when the building starts. It is likely, then, that the open technologies and specifications that bubble up as best practice while the metaverse matures will be de facto standards. Recognising the importance of de facto standards, OGC years ago developed a nimble ‘Community Standard’ process that enables snapshots of de facto standards to be adopted by OGC so that they can benefit from the stability that official standardisation brings and can be better harmonised with other OGC Standards.
Community Standards can also form useful bridges that support the convergence of previously siloed industries and domains. 3D Tiles, for example, uses technology and know-how from geospatial and 3D graphics to provide a standard for streaming massive heterogeneous 3D datasets that developers from both industries can follow and build to. Other OGC Community Standards relevant to the metaverse include: Indexed 3D Scene Layer (I3S) for 3D streaming; Indoor Mapping Data Format (IMDF) for mapping and navigating indoor spaces; and, in the process of endorsement, Zarr, for the storage of multi-dimensional arrays of data (also known as data cubes).
OGC Community Standards can leverage the expertise of ‘outside’ industries relevant to geospatial to build a bridge between geospatial technologies and those from their industry of origin. The Community Standards process will prove useful, then, in bringing to the geospatial community the knowledge, experiences, and technologies developed by the many non-geospatial 3D and internet organisations in the early days of the metaverse.
Similarly, the liaisons and partnerships that help bring outside de facto standards in to the OGC Community Standards process will additionally serve to bring OGC Standards out to the communities that can benefit from them, and even bring those communities - and their perspectives - in to help shape Standards development and evolution.
Building the Metaverse Together
3D geospatial technologies, such as digital twins and mod-sim, will provide valuable insight, best practices, and standards for those building the broader metaverse.
It is now clear that the metaverse - the internet in real-time 3D - has never been closer. Like the internet, its creation will result in technological advancements and disruptions. Geospatial is already starting to feel this as it adopts, adapts, innovates, and integrates 3D real-time technologies such as game engines and digital twins. However, the metaverse is not assured: it will only reach its true potential if, like the internet, it is based upon open standards and technologies that are easily available to all.
“I really want to see how far we can take the metaverse,” said Patrick Cozzi, “and I believe that to take it far, fast, we need open interoperability.”
As an organisation and community that’s passionate about Findable, Accessible, Interoperable, and Reusable (FAIR) data standards, OGC will continue to: provide, design, adapt, and adopt a host of standards relevant to the metaverse; offer a neutral forum for experts from across industry to meet and share knowledge; and work as a liaison and bridge-builder between other industries involved in building the metaverse and their standards organisations.
Interested in the location-aspects of the Metaverse and want to contribute your expertise and/or meet like-minded individuals? Why not attend the Metaverse DWG Session at the Digital Twins and Metaverse themed 124th OGC Member Meeting in Singapore in October, 2022.
If you’re an OGC Member and interested in contributing to creating standards, best-practice, and/or documenting use-cases for Metaverse applications, keep your eye out for the upcoming vote for the creation of an OGC Metaverse Domain Working Group. If approved, OGC Members will be able to join via the OGC Portal here. Non-members will be welcomed to join the DWG, too, by contacting OGC using the form at ogc.org/contact.
OGC Members can download the complete recording of the Metaverse ad-hoc session from the OGC Portal.
Not an OGC Member? OGC Membership will bring your organization a host of benefits. Consider joining today.
-
16:47
Building a cloud-native future at OGC
sur Open Geospatial Consortium (OGC)Tags: cloud-native geospatial, cloud, OGC's futureMarch 2022 marked my 3-year anniversary at OGC! As I look back at those 3 years – 2 of which occurred during the tumultuous COVID-19 pandemic – I feel proud of the accomplishments and changes that we have made during this time:
- We rebranded OGC with a new look (we also have a new website in the works – expect to see it in June ‘22).
- We shifted our messaging from ‘we develop standards’ to ‘why we develop standards.’
- We now identify ourselves as a collective problem-solving community of global geospatial experts and users committed to making geospatial/location information Findable Accessible Interoperable Reusable (FAIR) - via standards, innovation activities, and partnership building.
- We invested in strategic collaborative projects connecting people, organizations, systems, and data in areas that heavily impact our society and the future of our planet: disasters, climate change, health, and agriculture (to name but a few examples from the OGC Innovation Program).
- In the process, we invested in the global geospatial community by giving away 30% of OGC’s annual revenues to our member organizations
- We accelerated the mainstreaming of geospatial via our OGC APIs and targeted investment in developer resources and connections.
- In the process, we also made OGC more accessible to the startup community by creating a startup membership category at a 50% discount to our associate membership rate.
- We championed new topics that are critical to our members and the community’s growth and success – like New Space, Digital Twins, Metaverse/real-time 3D, GeoAI, and Ethics.
As I embark on my fourth year at OGC, our mission continues to be that of making geospatial information FAIR at scale! And, in 2022, I can’t think of a more scalable (or impactful!) way to do that than via cloud-native geospatial. No one can deny that ‘the cloud’ is triggering a fundamental shift in how geospatial data is stored, shared, accessed, integrated, and analyzed. If we succeed in creating a foundation for cloud-native geospatial standards:
- Imagine the radical simplification of the effort and cost needed to share and use geospatial information!
- Imagine the explosion of innovation when we make the power of geospatial accessible to everyone!
As you might have guessed, I’ve spent some time with our inaugural visiting fellow, Chris Holmes, to get to this point. I’m thankful for his efforts in bringing the community together to accelerate our path to cloud-native geospatial. And we sure need every bit of support on this journey as it requires a sustained effort to bring every piece of location information to the cloud in standard formats to empower the development of the next generation of tools that will truly unlock the value of location.
Join me on this ride of a lifetime as we take geospatial and OGC to the next level! We are kicking things off with a Cloud-Native Geospatial Community Outreach Event on April 19-20 that is already breaking registration records. In many ways, the event is the kickoff of a global activity to accelerate and advance cloud native geospatial standards. Come join us!
-
17:46
A Strong Foundation for GeoAI Innovation
sur Open Geospatial Consortium (OGC)Tags: GeoAI, Testbed-18A version of this article originally appeared in the Autumn 2021 issue of GeoConnexion International Magazine.
Far from being a “sci-fi” tech, Artificial Intelligence (AI) already plays a crucial role in many domains and is revolutionizing existing technologies. During the last decade, AI techniques such as Machine Learning (ML) and especially Deep Learning (DL), have improved significantly due to an abundance of data coupled with advancements in high-performance computing. As with so many technology domains, these new AI capabilities have reoriented and transformed GIS and Remote Sensing, providing new solutions and greatly increasing efficiency. The application of AI technologies to solve problems experienced by the geospatial community has become known as “GeoAI.”
The geospatial science community, for example, commonly uses AI and related techniques to better harness the otherwise insurmountable volume of Earth Observation (EO) data being created for geospatial analysis across various domains - such as smart cities, environmental management, and disaster management. However, the possible applications for GeoAI are only just beginning to surface.
“I would define GeoAI as a set of methods or automated entities that use geospatial data for perceiving, constructing (automating), and optimizing spaces in which humans - as well as everything else - can safely and efficiently continue their geographically referenced activities,” said Kyoung-Sook Kim, a co-chair of the OGC GeoAI Domain Working Group.
“GeoAI can bring significant benefits to drive the next generation of service innovation in many applications, including autonomous transportation, sustainable smart city planning/implementation, augmented building and energy management, self-optimized manufacturing, epidemic outbreak prediction, personal experience augmentation, and more. However, it also faces a variety of new ethical, legal, social, and technological challenges. I believe that international standards will play a pivotal role in ensuring widespread interoperability and security benefits among the various disciplines dealing with AI.”
GeoAI is most commonly used for feature identification in Earth Observation imagery; however, much more is possible.
Innovations in GeoAI powered by open standards have lower development and implementation costs, reach the market sooner, and enable seamless horizontal and vertical integration/composition of GeoAI systems. Standards will also increase the safety of the computational approaches and algorithmic techniques that power the insights provided by AI engines.
With these benefits in mind, in 2018, OGC formed the Artificial Intelligence in Geoinformatics Domain Working Group (better known as the GeoAI DWG) to identify use-cases and applications related to AI in geospatial applications. The GeoAI DWG provides an open forum for broad discussion and presentation of use-cases for AI and its related technologies in the geospatial domain, with the purpose of bringing geoscientists, computer scientists, engineers, entrepreneurs, and decision makers from academia, industry, and government together to develop, share, and research the latest trends, successes, challenges, and opportunities in the field.
The GeoAI DWG is additionally tasked with investigating the feasibility and interoperability of OGC standards to support the use and re-use of geospatial data in AI applications, as well as describe gaps and issues that could lead to new geospatial standards.
“When we proposed the OGC GeoAI DWG, we found three main issues in applying AI technologies to geospatial domains. First, there are few large-scale benchmark training datasets like ImageNet, which is used for object recognition in computer vision projects. Even though there is a massive amount of satellite imagery and point cloud data available, most of them are not ready to use for geospatial Machine Learning tasks,” said Kyoung-Sook.
“Second, compared to the related fields of image processing and natural language processing, few open tools and workflows using GeoAI are shared in GIS and other business operations,” said Kyoung-Sook. “Organizations still pay for the full development and implementation cost when adapting a new GeoAI technique into a business case, rather than building upon established open tools and solutions.
“The third issue, which I think is the most important, is supporting trustworthy and safe GeoAI technology to support both the Earth’s and humanity’s well-being. The 2018 report from the World Economic Forum (Fourth Industrial Revolution for the Earth Series – Harnessing Artificial Intelligence for the Earth) states that the most important consideration in the development of AI, regardless of the AI stage, is to ensure sustainable benefits for humanity, which include being both ‘human-friendly’ and ‘Earth-friendly.’
“Going back to the first issue, I would like the GeoAI DWG to first solve the lack of training datasets and improve the sharing of knowledge from skilled people. This lack of data and knowledge-sharing causes misuse of AI and creates biased AI models. To address this, the DWG members have started to collect and analyze AI-related applications and use-cases in our communities.
“Recently, OGC has also started the formation of the new Sample Markup Language for Artificial Intelligence Machine Learning Standards Working Group (SampleML-AI/ML SWG) to develop a standard for documenting, storing, and sharing the geospatial sample data.”
A key component of ML techniques and processes is the use of sample data - data with known provenance, consistent metadata, and quality measurements - to consistently tune and train ML applications. The lack of consistent and known sample data is hindering advanced EO science applications, causing reproducibility issues, and making it difficult to compare results across studies.
Sample data should have sufficient metadata in a machine-readable standard format, and include general spatiotemporal information and sample data-specific attributes to facilitate data discovery and query. Due to their utility in Geospatial ML applications, many academic and industrial areas have focused on creating their own benchmark datasets.
However, in order to access and share these training datasets easily and effectively, an international standard for data schema and formats is required. One solution proposed by OGC is to develop a standardized Sample Markup Language for AI/ML - based on commonly used industry standards wherever possible - that allows users to document, store, and share geospatial sample data over the web following the FAIR data management principles of Findability, Accessibility, Interoperability, and Reusability.
The formation of the SampleML-AI/ML Standards Working Group, endorsed by the GeoAI DWG, is nearly complete. The working group will then be tasked with developing an OGC SampleML-AI/ML Standard, with initial geospatial sample data categories for remote sensing imagery, moving features (typically vehicle trajectories), and related spatial content.
Further to this, OGC’s Testbed 18 has a task dedicated to Machine Learning Training Datasets. To quote the Testbed 18 Call For Participation: “The goal of this Testbed-18 task is to develop the foundation for future standardization of TDS for Earth Observation applications. The task shall evaluate the status quo of training data formats, metadata models, and general questions of sharing and re-use. Several initiatives, such as ESA’s AI-Ready EO Training Datasets (AIREO) have developed suggestions that could be used for future standardization. Other initiatives focused on the development of training data repositories, such as the Radiant MLHub, an open-access geospatial training data repository where anyone can discover and download ML-ready training datasets.” To learn more about this task, see section 2.2. Machine Learning Training Datasets in the Testbed-18 Call For Participation.
GeoAI is already doing incredible things for the location industry. However, to truly reach its potential, we need to build a strong foundation of open standards for sharing sample data and creating open tools and workflows, and open best practices for ensuring the safe and ethical use of this powerful technology.
If you’re interested in contributing to creating standards, best-practice, and/or documenting use-cases for GeoAI applications, consider joining the Artificial Intelligence in Geoinformatics Domain Working Group. OGC Members can join via the OGC Portal here. Non-members are welcomed to join by contacting OGC using the form at ogc.org/contact.
Interested parties are also encouraged to attend the next OGC Member Meeting, where the OGC GeoAI DWG will likely meet (TBC), along with many other related OGC Standards and Domain Working Groups.
To learn more about and participate in Testbed-18, see the announcement for the Testbed-18 Call For Participation or visit ogc.org/testbed18. Applications for funded participation close March 31, 2022.
GeoAI shows great potential for use in energy management, self-optimized manufacturing, and many other industrial fields.
-
13:59
Lowering the barrier of entry for OGC Web APIs
sur Open Geospatial Consortium (OGC)Tags: Testbed-17, DAPA, Convenience APIs, ogcapi, EO, cloud, cloud-native geospatialA version of this article originally appeared in the May/June 2021 issue of GeoConnexion Magazine under the title ‘Lowering The Barrier To Entry.’
For the last few years, OGC has been modernizing our standards to better align with web best-practices and the expectations of developers and consumers alike, resulting in our growing OGC API family of standards. Part of this effort has also been to design our standards to better take advantage of cloud infrastructure, including being able to deploy and share spatial analysis workflows across different cloud providers. This approach will benefit collaboration, transparency, and accessibility of scientific workflows - which are all cornerstones of the “Open Science” movement. I previously touched on these Earth Observation processing packages in the “App Store For Big Data” article in the July/August 2020 issue of Geoconnexion Magazine, which discussed our “Applications-to-the-data” architecture, and the Application Deployment and Execution Service (ADES) APIs.
The DAPA Convenience API
Another part of the effort to simplify Earth Observation data processing and analysis workflows is the development of the OGC Data Access and Processing API (DAPA). Developed as a draft specification during OGC Testbed-16 in 2020, and having been tested in real-world scenarios during our Testbed-17 initiative and EO Apps Pilot, DAPA is a so-called “convenience API” that allows scientists and other geospatial analysts to run several operations on Earth Observation or other data using a single API call, in turn providing the data in a form directly ready for further analysis. This differentiates DAPA from existing APIs such as OGC API - Features or OGC API - Coverages. Where the latter two are data-centric APIs with a focus on data access and subsetting, DAPA is a user-centric API that includes data access with processing. As such, it takes lots of processing burden away from the user.
DAPA does this in a way that is mostly independent of the data location - meaning that the same call can access data stored in a local file, an in-memory structure (such as an xarray), or remotely in the cloud. Ultimately, this means that an end-user can initiate the process on one archive for one set of data, then just change the URL to have the same process(es) run on an entirely different dataset.
For example, with a single DAPA API call, you can say “please give me all the data you have for this specific area and time window, with these fields and as a result of this map algebra.” A data cube is then created on the fly and delivered to you in a form ready for you to work on in your software of choice. And, say you run that on a Landsat archive, you could then use the same API call to reproduce it on, say, a PeruSat-1 archive.
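As an illustrative sketch only - the endpoint path and parameter names below are not taken from the draft specification, and the server URL and collection are hypothetical - such a call might look like this:

```python
import requests

# Illustrative only: a single "area + time + fields + aggregation" request in
# the spirit of DAPA; the path and parameters are hypothetical, not the draft spec.
resp = requests.get(
    "https://example.org/collections/landsat-archive/dapa/area",
    params={
        "bbox": "7.9,47.9,8.1,48.1",            # specific area
        "datetime": "2021-06-01/2021-08-31",    # time window
        "fields": "B04,B08",                    # only the bands of interest
        "aggregate": "mean",                    # hypothetical aggregation parameter
    },
)
resp.raise_for_status()
with open("result.tif", "wb") as out:
    out.write(resp.content)

# Pointing the same request at a different archive URL reproduces the workflow
# on another dataset, as described above.
```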
Reproducibility and Open Science
DAPA and ADES fit within a spectrum of different processing APIs available from OGC.
The reproducibility of the DAPA calls, just like the packagability of the ADES ‘apps,’ makes them ideal for use in support of the Open Science movement. The Open Science movement aims to make scientific research and its dissemination more accessible - by professionals and amateurs alike - and to generate transparent and accessible knowledge that is shared and developed through networks of collaboration. The movement is receiving big support from OGC Strategic Members NASA and ESA, who have also been instrumental in their sponsorship of the development of DAPA and ADES. Indeed, open standards in general, due to their enabling of the interoperability required for collaboration across organizations and disciplines, play a critical role in Open Science, too.
The synergy between OGC’s mission for FAIR (Findable, Accessible, Interoperable, and Reusable) data standards and their benefits to the reproducibility of scientific research has led to the topic of ‘Identifiers for Reproducible Science’ to be explored in this year’s Testbed-18 Initiative. The task shall develop best practices to describe all steps of a scientific workflow, including: input data from various sources such as files, APIs, data cubes; the workflow itself with the involved application(s) and corresponding parameterizations; and output data. By accurately describing the workflows of scientific studies, the studies can then be better reproduced and scrutinized - both hallmarks of the scientific process.
The Open Science movement’s desire to make scientific data and processes accessible by more than just scientists ties in well with OGC’s recent efforts to design standards with a strong end-user-centric perspective, rather than the data-provider centric view that has come to dominate earlier standardization work. This means simplifying and improving not just their form and function, but also their documentation.
User-friendly standards
Rather than reading Standards Documents - which are, by their nature, meticulously defined to reduce room for interpretation and therefore tedious to read - many developers favor an approach that starts with simple documentation and examples. From there, additional features are explored stepwise, with the actual standard document often being the last resource consulted. As location tech grows outside of the traditional geospatial spheres of expertise, this user-centric view becomes critical if the benefits of widespread standards adoption are to be realized.
With this in mind, work undertaken in Testbed-17 lowered the barrier of entry to implementing and accessing DAPA and other OGC APIs by creating sets of example code for both server- & client-side software, scripts for cloud deployment & installation, and best practice guides. To this end, Testbed-17 delivered the Engineering Report Attracting Developers: Lowering the entry barrier for implementing OGC Web APIs, which provides the knowledge necessary to develop, deploy, and execute standards-based Web APIs to Web developers, following a "How-To" philosophy with lots of hands-on experiments, examples, and instructions. Better yet, by providing scripts that illustrate the deployment and operation of API instances on local machines as well as across different cloud environments, it will make the challenge of mapping software components to cloud infrastructure a smooth experience.
In addition to the documentation, code examples, and implementations meant to make the lives of users easier, OGC has also recruited its first Developer Relations (DevRel) staff member, Joana Simoes. Joana provides an interface between OGC and the developer community, with a specific look at addressing: What do developers need from OGC? Where do they struggle? What materials can we provide to help?
All of these activities have come from OGC’s ambitions to make our standards easier to understand and implement, and to feel more tangible than our earlier work. OGC stands behind its position that standards - through making data Findable, Accessible, Interoperable, and Reusable - unlock tremendous value and power cross-collaborations that offer many benefits to society. By making the standards themselves align with the FAIR principles, we are lowering the barriers to their adoption and spreading their value further.
What do you think OGC can do to help make our standards easier to understand and implement? Let OGC and our DevRel, Joana Simoes, know at ogc.org/contact.
If you would like to help pave the way towards new levels of interoperability in areas as diverse as Open Science, New Space, Machine Learning, and Building Energy, consider participating in OGC Testbed-18. The Call For Participation closes March 17th, 2022. Funding is available for participants to recoup a significant portion of their costs.
-
23:36
7 Key Takeaways from the OGC Climate Change Special Session
sur Open Geospatial Consortium (OGC)Contributed by: Steve Liang, SensorUpIn the first of a two-part series, Steve Liang, founder and CTO of SensorUp, shares highlights from the OGC Member Meeting Climate Special Session in December 2021, and the many challenges and opportunities it presents from a data sharing perspective. The original article can be found here.
“If you can’t measure it, you can’t manage it.”
The quote is originally from management consultant Peter Drucker and later used by Al Gore to describe the challenge with climate change. It accurately encapsulates the theme of the Climate Change Special Session at the OGC (Open Geospatial Consortium) Climate Member Meeting for 2022.
SensorUp’s CTO Dr. Steve Liang was on the panel of data experts from NOAA, United Nations’ IPCC, NRCan and ECMWF, each of whom spoke about the current state, the challenges and the opportunities of measuring climate change data. Here, we’re highlighting seven key takeaways from the session.
1. We still have a lot of knowledge gaps when it comes to global climate data
Angelica Gutierrez, Lead Scientist for NOAA (The National Oceanic and Atmospheric Administration) talked about the struggles with obtaining accurate and timely data. “Well developed countries have access to sophisticated software, specialized equipment and skills, computing power and other essential elements to address climate change,” said Gutierrez. “Developing countries are at a disadvantage.”
It’s a known problem, and one that OGC members are already working to address. That’s another theme that emerged a number of times during the session — we are becoming more aware of our blind spots and working on solutions to mitigate them. “The 2021 OGC Disaster Pilot (that drew the largest response to an OGC pilot, historically) is addressing many of the challenges, gaps and barriers that I previously identified,” said Gutierrez.
2. The current priority is getting good data to decision-makers
In 2022, OGC is launching another pilot, the Climate Change Services Initiative, which will run from 2022 through 2026. The pilot will connect several global agencies and focus on sharing priority information. “We are rolling out the first focus area this year,” said Nils Hempelmann, an OGC Project Manager and the moderator of the climate session.
“Setting up the appropriate infrastructures to deliver information on demand to the decision makers, that’s what we are going to focus on in the beginning,” said Hempelmann of the new pilot. “And then afterwards, depending on what’s coming up and where the urgent pain points are, we are defining the next focus areas.”
3. We want to be able to more accurately measure and understand specific climate events
In recent years, several severe weather disaster events have wreaked havoc in different parts of the world. Two sets of presenters addressed this issue, using examples of weather events like atmospheric rivers and “Medicanes” (hurricanes originating in the Mediterranean) that we need to do a better job of measuring. “Recently in British Columbia, throughout the month of November, they received three storm events, each one was larger than their monthly precipitation rate,” said Cameron Wilson from Natural Resources Canada.
Wilson’s co-presenter, Simon Riopel, goes on to explain the challenge of measuring and predicting an event like an atmospheric river. The challenge is in getting an accurate measure of vectors of force, which have both a magnitude and a direction.
One of the current initiatives that can be useful in learning how to solve this is the Arctic SDI (Spatial Data Infrastructure) that creates a “digital arctic” with a combination of sensor data and satellite imagery.
4. (Political) decision making is based on trust
In order to give political decision-makers what they need to make informed decisions, they have to be confident in the validity of the information.
“Decision-making is based on trust,” says Dr. Martina Stockhause, Manager of the IPCC (Intergovernmental Panel on Climate Change) Data Distribution Centre. “Political decision-makers are no experts, so they rely on trust in data and the service providers. In my view trust is built on two aspects. One is the quality of the data that is accessed. That means that the quality is documented, together with the peer review process. And the second is that the result is traceable back to its sources (with data citation and credit).”
One of the ways to achieve that is using the FAIR (Findability, Accessibility, Interoperability and Reusability) Digital Objects framework.
5. We continue to find new ways to use machine learning to make better weather predictions
In 2021 the WMO (World Meteorological Organization) launched a competition to improve, through machine learning and AI (artificial intelligence), how to better predict temperature and precipitation forecasts up to six weeks into the future.
The team currently leading that competition is from CRIM (the Computer Research Institute of Montreal). CRIM’s David Landry explained the team’s process of downloading, preprocessing, subsetting, and reshaping the data, before they ran their AI models and presented data predictions back to the adjudicators.
Incentivizing these research teams to continue to experiment with new models, as the WMO has, will help us continue to expand our awareness of how to accurately measure and predict climate change events.
6. Estimating greenhouse gas emissions is really complex
Greenhouses gases like methane and CO2 remain difficult to measure. They can’t be seen by the human eye or typical cameras, and capturing data about them remains a challenge. To achieve a more detailed and timely monitoring of emissions in support of climate mitigation actions, the countries of the world need access to more (and more accurate) information.
“The big issue is that we can’t measure emissions directly, so these emissions need to be estimated,” said Vincent-Henri Peuch from the European Center for Medium-Range Weather Forecasts and lead of the Copernicus Satellite projects. “The problem is that it is really complex.”
Satellite images are able to show the presence of fugitive greenhouse emissions at a macro scale but “the question is, can we use this information about the concentration in the atmosphere to infer some information about the fluxes of emissions at the surface?” notes Peuch. “For that, we need to combine lots of different observations, so of course interoperability is required.”
To help with these crucial measurements, CO2M, the Copernicus Carbon Dioxide Monitoring mission, is one of Europe’s new high-priority satellite missions and will be the first to measure how much carbon dioxide is released into the atmosphere specifically through human activity.
7. Accurately measuring greenhouse emissions requires multiple data sources
Dr. Steve Liang, CTO of SensorUp and Professor at the University of Calgary, spoke about the ways that disparate data sources can be combined to help craft a clearer picture of the severity and source of fugitive emissions. “Even though we know methane leaks are bad, how can we fix them, if we can’t see them?” asked Liang. “We need methane sensors to find the locations and flow rates of the leaks. However, there’s not one sensor that is the best. Multiple types of sensors have to work together, to complement each other. They all have different temporal and spatial-temporal scales, at different levels of accuracy.”
Liang explained that a combination of data from sources like handheld instruments, fixed in-situ sensors, Terrestrial Mobile Methane Mapping Systems, airborne systems and satellite imagery can be used together, in an integrated methane sensor web, to more accurately measure, understand, and even predict harmful leaks and emissions.
If you would like to read a more complete explanation of how this methane sensor web works, you can read Dr. Liang’s blog recap of his presentation.
-
18:19
How it Went! The November 2021 Geospatial API Virtual Code Sprint
sur Open Geospatial Consortium (OGC)Tags: ogcapi, OGC API, SprintFrom November 15-17, 2021, OGC and ISO/TC 211 jointly hosted the November 2021 Geospatial API Virtual Code Sprint. The code sprint focused on the refinement of the OGC API - Features Standard and its ISO version, ISO 19168.
OGC API - Features offers the capability to serve, create, modify, and query spatial data on the Web. The Standard specifies requirements and recommendations for creating APIs that follow a standard and consistent way of sharing feature data. The Standard is divided into several parts so that a service only has to use those parts relevant to its offerings, keeping it lightweight and easier to develop and maintain.
OGC API - Features - Part 1: Core (the ISO version being ISO 19168-1:2020 Geospatial API for Features) focuses on delivery of feature content. OGC API - Features - Part 2: Coordinate Reference Systems by Reference (ISO/DIS 19168-2) adds support for coordinate reference systems other than the sole CRS specified in Part 1, WGS84.
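As a minimal sketch of what Part 2 adds in practice, the request below asks a hypothetical server to return features in a projected CRS via the crs query parameter; the server URL, collection, and chosen CRS are illustrative.

```python
import requests

resp = requests.get(
    "https://example.org/ogcapi/collections/roads/items",
    params={
        "limit": 5,
        # Request geometries in ETRS89 / UTM zone 32N instead of the default CRS84.
        "crs": "http://www.opengis.net/def/crs/EPSG/0/25832",
    },
)
resp.raise_for_status()
print(resp.headers.get("Content-Crs"))   # servers indicate the CRS of the returned geometries
print(len(resp.json()["features"]))
```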
An OGC Code Sprint is a collaborative and inclusive event driven by innovative and rapid programming with minimal process and organizational constraints to support the development of new applications and candidate standards.
Over the past three years we have been refining the process for organising and hosting OGC code sprints. For this November 2021 Geospatial API Virtual Code Sprint, a new approach was trialled: using Discord to provide video, voice, and chat facilities. Also a first for the code sprints, we ran a Mentor Stream in parallel with Breakout rooms for expert developers. The Mentor Streams were designed to help developers get started with OGC API Features and ISO 19168-1:2020 standards.
Day 1 kicked off with Welcome remarks from Dr Joana Simoes (OGC DevRel) and Peter Parslow (ISO/TC 211 Chair-elect). After the welcome remarks, Clemens Portele (interactive instruments) and Panagiotis “Peter” Vretanos (CubeWerx) presented the Goals of the sprint. On Day 1 we also had discussions on queryables and geometry simplification, and Mentor Stream sessions on Sharing data through OGC API - Features led by Dr Joana Simoes (OGC), as well as another session on Introduction to SpatioTemporal Asset Catalogs (STAC) and its use of OGC API features led by Chris Holmes (Planet), Rob Emanuele (Microsoft) and Matthew Hanson (Element 84). In between the discussions and the mentor streams, there was plenty of coding.
On Day 2, we had Mentor Stream sessions on How to Load feature data into your frontend application led by Antonio Cerciello (EarthPulse) and Testing implementations of OGC API - Features for Compliance to the Standard led by Dr Gobe Hobona (OGC). There were preliminary demos of geometry simplification through OGC API - Features. Similarly, in between the discussions and the mentor streams there was plenty more coding.
On Day 3, there was further coding, as well as a Features and Geometry JSON Lightning Talk led by Clemens Portele (interactive instruments) and Peter Vretanos (CubeWerx), as well as a final demonstration session. Check out the screenshots from the final demo at the end of this article.
Lessons Learnt
- There is a need to offer JSON-FG fallback geometry to support different situations, i.e., when it should be present and when it should not.
- For geometry simplification, the sprint participants started with the zoom-level, scale-denominator and a number of other parameters and then by the end of the code sprint there was agreement that we should use zoom-level.
- The sprint participants wanted to support situations in which, based on the zoom level, the server could return some features and not all of them.
- A use case for clipping was also demonstrated. For example, if you are looking at New York, you should not need to get the whole of the US coastline.
- The sprint participants also made progress on how to handle JSON Schemas.
- The sprint participants will file an issue in the JSON-FG repo to explore an extension for marking geometry that has been clipped by a clipbox (i.e., artificial segments). MapML has already added this capability. The alternative is to always require an extra border. A related question is whether a clipbox should be allowed to be bigger than the data - for example, whether an actual geometry in a shapefile can go beyond the -180 to 180 degree boundaries.
- The code sprint has been good for both JSON-FG and OGC API – Features.
- JSON-FG could be considered for a conformance class for OGC API – Features only after JSON-FG has been adopted as an official OGC Standard.
- STAC has a number of deployment patterns. One of the patterns exposes an OGC API – Features interface, used for search.
- The idea is to have an alignment between STAC and OGC API – Features. This alignment will benefit OGC API – Records too.
- Some of the questions are how do we document/describe metadata for the resources offered by OGC API - Features, ISO 19168-1 and their related candidate standards such as STAC and OGC API - Records.
- STAC will be a profile of OGC API Records. The STAC community is working on a definition of a Dataset Record for STAC that would be aligned with the Record concept from OGC API Records.
- The November 2021 Geospatial API Virtual Code Sprint also demonstrated the Compatibility Mode. Example scenario: If you have a 3D building then you could use JSON-FG, but if you wanted to show a simpler geometry then the server would provide GeoJSON.
The participants made the following recommendations for future work.
Innovation Program
- Delivering MUDDI data using OGC API – Features and JSON-FG
- Development of draft specifications for new capabilities being considered for future versions.
- Implementations of the new capabilities being considered for future extensions: Common Query Language (CQL), CRUD (Create Replace Update Delete), property selection, OpenAPI 3.1, conditional requests, web caching.
- Security for OGC API Standards Pilot (this could involve the different levels of security e.g. DCS, OpenAPI). This could be a good combination with the CRUD extension.
- Further code generation tasks in future code sprints.
Standards Program
- Completing CQL
- Further alignment between STAC and OGC API - Records
- There’s an ongoing vote in ISO for Part 2, so there may be an opportunity to hold an event in the Standards Program once ISO 19168-2 has been approved.
The code sprint successfully met its objectives. The sprint participants were able to discuss and prototype new capabilities. The sprint participants also found that the tutorials and Lightning Talk provided in the Mentor Stream were helpful.
Regarding the new approach for OGC Code Sprints, the sprint participants offered the following recommendations:
- Record the tutorials, so that if a participant misses one they can catch up later
- Arrange a Beginner-to-Expert Mentor Stream that takes a developer all the way through from Getting Started to more Advanced topics. This would require a 3-day programme.
- The Discord idea was really cool!
- In the future we could use the other text channels. Perhaps the first message should explain that “we are going to use this channel in a particular way…”
To learn more about the Sprint, visit the November 2021 Geospatial API Code Sprint GitHub repository.
Screenshots from the Demonstrations: Ecere GNOSIS, interactive instruments ldproxy, and CubeWerx cubeserv demonstration screenshots.
-
12:00
Towards a Cloud-Native Geospatial standards baseline
sur Open Geospatial Consortium (OGC)
Tags: Visiting Fellow, cloud, cloud-native geospatial, STAC, COG, ogcapi, OGC API
Contributed by: Chris Holmes, OGC Visiting Fellow
In my previous post I laid out the vision for Cloud-Native Geospatial, but with this post, I want to get into the details of what is needed. I’ll lay out the key areas where foundational standards are needed, and then survey the current status of each area. They range from quite well-established to quite speculative, but all are eminently achievable. And then I’ll dive deep into the area I ended up focusing on the most in these last few months as an OGC Visiting Fellow.
Components Needed
There are a few key components needed to represent diverse location information on the cloud. These sit ‘below’ an API - they are simply resources and formats. Together these components provide a solid foundation for representing most any geospatial information on the cloud. They should be compatible with APIs; they may serve as responses to requests, as JSON resources or streaming formats. But it should also be completely possible to simply store them on a cloud object store (S3, GCP, etc.). Those stores will in turn often be read by more capable APIs to do cool operations, but they don’t need to be.
The core that I see is:
- Core Raster format: A solid cloud-native format to handle satellite imagery, DEMs, data products derived primarily from satellite imagery, etc.
- Multi-dimensional Raster format: A cloud format able to handle massive data cubes, like the results of weather forecasts, temperature over time and elevation, climate modeling, etc. This is the traditional space of NetCDF / HDF.
- Core vector formats: A vector data equivalent to Cloud Optimized GeoTIFF would be ideal, but the diverse requirements of fast display and on-the-fly deep analysis may not be easily combinable, so we may end up with more than one format here.
- Point cloud format: A cloud format that works like COG, but enables streaming display and on-the-fly analysis of point clouds.
- Collection & Dataset Metadata: The title, description, license, spatial and temporal bounds, keywords, etc. that enable search. For the cloud-native geospatial baseline, this should focus on being ‘crawlable’, and link to actual formats. It should support diverse data types - vector data, raster data, point clouds, multi-dimensional data cubes, geo-located video, 3D portrayals, etc. - and should be flexible enough to work with any data. It should be fundamentally geospatial-focused, and not try to generically describe any data.
- Granule / Scene level / ‘asset’ Metadata: A flexible metadata object with common fields for describing particular data capture domains and linking to the actual data files.
Most of these have at least the start of an answer in our worldwide geospatial community, if not a robust solution:
- Core Raster format: Today this is Cloud Optimized GeoTIFF (COG). It is in the process of becoming an official OGC standard and has already seen incredible adoption in a wide variety of places. It’s really the foundational cloud-native geo format that has proven what is possible. It is worth noting that it may not be the end-all for cloud raster formats, as one could see a more optimized image format that is smaller and faster. But it would likely be some more general image format that our community adds ‘geo’ to, like we did with TIFF. COGs will rule for a while, since the backward compatibility with legacy tools is hard to beat while we’re still early in the transition to cloud-first geospatial infrastructure. (See the access sketch after this list.)
- Multi-dimensional Raster format: There’s already a great answer here with zarr. It is in the process of being adopted as an OGC Community Standard, with the adoption vote starting soon. It’s also being embraced by NetCDF, and has seen significant uptake in the climate community.
- Core vector formats: There is as of yet no great answer here. I’ll discuss the landscape and various possibilities in a future blog post.
- Point cloud format: Howard Butler’s new COPC format is a ‘Range-readable, compressed, organized LASzip specification’ that hits all the same notes as Cloud-Optimized GeoTIFF and will likely see rapid adoption.
- Collection & Dataset Metadata: This has a solid core with the OGC API - Features ‘Collection’ construct. The STAC Collection then extends that, and OGC API - Records provides a GeoJSON equivalent (the Record) that can be used as a return in search queries. But these parts haven’t quite all connected in a coherent way, and the full ‘static’ (just upload to S3) usage hadn’t been fully fleshed out. This was the main focus of my work over the last few months, so I’ll dive in deeper below.
- Granule / Scene level / ‘asset’ Metadata: This is where the SpatioTemporal Asset Catalog (STAC) specification, which has been my main focus over the last few years, has played; it’s seeing really great adoption after recently reaching version 1.0.0.
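As a concrete illustration of the cloud-native access pattern behind the Core Raster and Multi-dimensional Raster items above, the sketch below range-reads a window from a COG over HTTP and lazily opens a Zarr cube on object storage. It is a minimal sketch, not a reference implementation: the URLs are hypothetical, it assumes rasterio, xarray, and fsspec/s3fs are installed, that the COG uses geographic coordinates, and that the cube has a 'time' dimension.

```python
import rasterio
from rasterio.windows import from_bounds
import xarray as xr
import fsspec

# --- Core Raster format: range-read a window from a Cloud Optimized GeoTIFF over HTTP ---
COG_URL = "https://example.org/data/scene.tif"  # hypothetical COG location
with rasterio.open(COG_URL) as src:
    # Only the bytes covering this bounding box are fetched, thanks to COG's internal tiling.
    # Bounds are given in the dataset's CRS (assumed geographic here).
    window = from_bounds(-74.3, 40.5, -73.7, 40.9, transform=src.transform)
    band = src.read(1, window=window)
    print("COG window shape:", band.shape)

# --- Multi-dimensional Raster format: lazily open a Zarr data cube on object storage ---
store = fsspec.get_mapper("s3://example-bucket/climate-cube.zarr", anon=True)  # hypothetical bucket
cube = xr.open_zarr(store, consolidated=True)
# Chunks are read on demand; slicing here does not download the whole cube.
subset = cube.isel(time=0)  # assumes a 'time' dimension exists
print(subset)
```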
For me, the jury is still out on whether a web tiles specification really belongs in a true cloud-native geospatial baseline. I believe that for raster tiles (png, jpeg, etc.) they don’t make sense, as a Cloud-Optimized GeoTIFF can easily be transformed on the fly into web tiles using serverless tilers like Titiler. So the pattern is to use a good cloud-native format that enables on-the-fly rendering and processing in the form clients need. Tiles are essential for browser-based clients, but other tools are better served accessing the data directly. Once the OGC API - Tiles standard is finalized it will likely make good sense to create a ‘Tile metadata building block’ that can serve as a cloud-native format to point clients at tiles.
For vector tiles I would consider both MVTs and PBFs as cloud-native geospatial formats, in that they can sit at rest on a cloud storage bucket and be used by various applications. But I do think there is potential for a good cloud-native vector format to work like COGs, with a serverless tile server that can render vector tiles on the fly. I’ll explore this idea more deeply in a future post on vector formats.
How do OGC APIs fit in?
The OGC API initiative is a reinvention of the OGC W*S baseline into more modern JSON/REST APIs. Generally, it sits one level ‘above’ the cloud-native geospatial constructs discussed here, defining API interfaces for services that would use the cloud-native formats (but could also use more traditional formats and spatial databases). They enable a lot more, like dynamic search or on-the-fly processing of data, but also require more. Most of the cloud-native metadata constructs have been extracted from the APIs, so the cloud-native variants should be compatible with the OGC APIs, just a lot less capable (though also far easier to implement).
An ideal ecosystem would see most data stored in cloud-native geospatial formats, and then a wide array of services on top of those, with most of them implementing OGC API interfaces. In the future, it will hopefully be trivial to install a server or even a serverless function that provides the richer OGC API querying on top of the cloud-native metadata and formats.
Towards Cloud-Native Geospatial Collection Metadata
As mentioned above, a majority of my OGC Visiting Fellow time over the last few months has gone into sorting out a ‘cloud-native geospatial collection’. There are a few different aspects to this.
A static OGC Collection
One of the most powerful constructs that has emerged in STAC’s evolution is the ‘static STAC’. See the Static Spatiotemporal Asset Catalogs in Depth post for a great summary of what they are and how they work. To quote the ‘best practices’ of the 1.0.0 version of the spec:
A static catalog is an implementation of the STAC specification that does not respond dynamically to requests. It is simply a set of files on a web server that link to one another in a way that can be crawled, often stored in a cloud storage service like Amazon S3, Azure Storage and Google Cloud Storage…A static catalog can only really be crawled by search engines and active catalogs; it can not respond to queries. But it is incredibly reliable, as there are no moving parts, no clusters or databases to maintain.
It has proven to be a very popular way to publish STAC data, making good on the vision in my previous blog post of being able to upload data to the cloud and have it ‘just work’.
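To give a sense of what ‘crawlable’ means in practice, here is a minimal sketch of a crawler that walks a static STAC catalog by following its ‘child’ and ‘item’ link relations (which are part of the STAC specification). The root URL is a placeholder; a real crawler would also need politeness, de-duplication across hosts, and error handling.

```python
import requests
from requests.compat import urljoin

def crawl(url, seen=None):
    """Recursively walk a static STAC catalog, yielding every Item found."""
    seen = set() if seen is None else seen
    if url in seen:
        return
    seen.add(url)

    doc = requests.get(url, timeout=30).json()
    if doc.get("type") == "Feature":  # STAC Items are GeoJSON Features
        yield doc
        return

    # Catalogs and Collections link onward via 'child' and 'item' relations.
    for link in doc.get("links", []):
        if link.get("rel") in ("child", "item"):
            yield from crawl(urljoin(url, link["href"]), seen)

# Hypothetical root of a static catalog hosted on object storage.
for item in crawl("https://example-bucket.s3.amazonaws.com/catalog.json"):
    print(item["id"])
```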
But while STAC pioneered clear static options both for individual items of imagery and other spatiotemporal assets, and for collections of that type of data, an equivalent ‘static collection’ for vector data had been missing. The OGC API - Features Collection (which STAC extends for its Collection) is, as specified, only part of an API response, not a stand-alone JSON resource that can be used on its own. But it was a well-designed modular part, and Clemens Portele and Peter Vretanos, the editors of the Features specification, were always supportive of pulling it out.
The OGC Collection building block [click to enlarge, or follow the link to the full page]
I made a rough attempt in an experimental GitHub repository. But then Clemens ran with another idea we had been kicking around: making truly small, granular building blocks from the OGC API baseline (I’ll try to write a full post on that in the future). This resulted in a very clean ‘Collection’, extracted from OGC API - Features, but written as an independent JSON resource that can be re-used in any context. And thus we have a real ‘static OGC Collection’, capable of living statically on cloud storage. This can point at a GeoJSON, GeoPackage, Shapefile, or any new, more cloud-native format. You can see an example of this in a repository I made to experiment with examples of static collections. That one has multiple representations of the same data as different formats, but it could easily have just one.
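For illustration, a ‘static OGC Collection’ of this kind might look something like the following sketch, which writes a minimal collection.json that points at a GeoJSON file sitting next to it on cloud storage. The field set follows the OGC API - Features Collection object (id, title, extent, links); the dataset name, URLs, and the link relation used for the data file are assumptions made for the example, not normative choices.

```python
import json

# Minimal, illustrative static collection document (field names follow the
# OGC API - Features Collection schema; values here are hypothetical).
collection = {
    "id": "city-parks",
    "title": "City Parks (example)",
    "description": "Boundaries of public parks, published as a static collection.",
    "extent": {
        "spatial": {"bbox": [[-74.3, 40.5, -73.7, 40.9]]},
        "temporal": {"interval": [["2020-01-01T00:00:00Z", None]]},
    },
    "links": [
        {"rel": "self", "type": "application/json", "href": "./collection.json"},
        # The relation used to point at the actual data file is illustrative;
        # 'enclosure' is one plausible choice for a downloadable representation.
        {"rel": "enclosure", "type": "application/geo+json", "href": "./city-parks.geojson"},
    ],
}

with open("collection.json", "w") as f:
    json.dump(collection, f, indent=2)
```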
Records + STAC alignment
Another large chunk of my time went into work that isn’t a direct cloud-native task: fully aligning STAC with OGC API - Records. Many have been unsure of the exact relationship between the two specs, though I always had clear (but not well communicated) thoughts. So the last few months have provided the time to fully sync with the core Records team and get agreement on the path forward. The quick version is that the Records API has a real role to play in STAC, as we’ve been holding off on ‘collection-level search’ because we wanted to fully align it with Records. But the confusing part is that a Records API can also be used to search STAC-like items, and indeed is designed for search of almost anything.
So the ‘aha’ moment was realizing that the Records spec authors have always had in mind a ‘Data Record’, a bit more specific than a totally general, flexible Record, which is really what STAC needs. STAC’s Collection construct is really only focused on what OGC considers ‘datasets’; it’s just that there hasn’t been a clear specification of that construct in the emerging OGC API baseline. I’ve started a pull request in the Records repo to add it, and will then also propose a ‘Data Collection’ that extends the core OGC Collection with additional fields. A STAC Collection, in turn, should hopefully align with that Data Collection construct. And in the future we’ll work together to have a ‘STAC Record’ that fully aligns a STAC Item with the more general Record requirements.
The other cool effect of this sync has been a really nice refactoring of the core Records API specification by Peter Vretanos. The vision has always been that the Records API is a Features API, but with some additional functionality (i.e. sorting and richer querying) and a more controlled data model. The new version makes that much clearer, emphasizing the parts that are different from the core OGC APIs, and it should be much easier to align with STAC.
Static Records
This work nicely laid the foundation for the next cloud-native geospatial component: a ‘crawlable catalog’ consisting of web-accessible static records. Peter put up an example crawlable catalog (and I have a PR that expands the example with a ‘static OGC collection’ of vector data), which needs a bit more work to align with STAC, but fits with all of our cloud-native geospatial principles. So we now have pretty much all the pieces needed for the metadata of a cloud-native geospatial baseline. Records and Collections are basically two alternate instantiations of the same core data model: one is GeoJSON, making it easy to visualize lots of them together, and the other matches the core OGC API Collection construct that is used extensively. In the short term, the best practice will likely be to use both of them, but in time there will likely be tools that easily translate from one to the other, especially if we get the core data models to be completely compatible.
Bringing it all together
So we are tantalizingly close to the full suite of static metadata needed to handle most any cloud-native geospatial data. The main task ahead is to fully align the work in OGC API - Records and OGC API - Features to be compatible with STAC, and to better describe all the necessary metadata fields. There are a few ways in which things differ right now, so it would be good to simplify things between the two approaches.
To help show how everything could work together I’ve put up a ‘static ogc examples’ repository to demonstrate how you could have a number of diverse datasets and formats all available from a completely static structure. I’ll keep expanding it and evolving the examples, and flesh out the readmes to show what is happening. And in the future I’ll try to do a blog post going deep into the details.
Future posts will go deeper into more of the state of actual cloud-native geospatial formats. Vector data is where I’ve spent most of my time lately, as there is not a super clear answer. I hope to also spend more time highlighting zarr and copc, as those are two really great efforts that fit in well and really round out a complete ecosystem of formats.
-
14:50
Hexagon on the value of collaboration, Standards, and OGC Membership
sur Open Geospatial Consortium (OGC)
Tags: ogcapi, OGC API, Hexagon, impact, Principal Member
Hexagon’s long-time support of OGC and our Standards, including our family of OGC APIs, has enabled the company to learn from, collaborate with, and support the broader geospatial community, while also improving its product offering and being one of the first to market with support for the latest generation of geospatial standards.
“Hexagon has had a storied history in OGC, beginning with Intergraph, the first commercial member. As other members like Leica Geosystems, ERDAS, Ionic, and Luciad have come together into what is now the Hexagon membership, we continue to see a great benefit to our involvement in OGC,” said Stan Tillman, Executive Manager, Hexagon’s Safety, Infrastructure & Geospatial Division. “In particular, we are Principal members in order to provide our insight to innovation and help OGC remain relevant in the geospatial world. But our membership in general allows us to learn from others in a truly collaborative environment involving development and management.
“The work with the OGC API - Processes group is a prime example: as co-chair of the group, Hexagon has helped drive the new RESTful APIs knowing this is the direction of the developer community. However, our involvement in this group has helped us learn from others in the group, involve our developer community sooner in the process, and help in planning the next phase. This give-and-take environment provides a safe place to collaborate, which is often missing in the external communities.”
As a fruit borne of this involvement, Hexagon recently launched a new suite of products - the Power Portfolio 2022 - which is one of the first offerings on the market to support OGC API Standards. The suite includes Geoprocessing Server 2022 (as part of the ERDAS APOLLO product), which exposes its APIs using an early version of the OGC API - Processes Standard. ERDAS APOLLO also includes a web-based map client, called Catalog Explorer. Catalog Explorer already supported OGC standards like WMS, WFS, WMTS, and OGC 3DTiles, but new to this release is an OGC API - Processes dynamic client interface and support for OGC API - Features.
“Hexagon is excited to see the interest in its new Geoprocessing Server from both our customers and partners. The new Geoprocessing Server empowers many more end users at the organization to create value-added data products leveraging Spatial Models or processes from other processing engines. The aim is to leverage those experts but enable any user to execute them with nothing more than a web browser and data sourced from the catalog. Not only does this increase accessibility, but it will also, in many cases, mean the outputs are created faster by utilizing more powerful server hardware, deployed closer to the data sources.
“It has been said before, but data is one of the few assets an organization has that becomes more valuable the more it is used. This is why we see geoprocessing as an important tool: it gives value to your data."
Hexagon's Spatial Modeler provides a visual tool for creating geospatial processing models that can be executed in Geoprocessing Server by anyone in the organization [click to enlarge]
“For several years, Hexagon has maintained a visual tool for building geoprocesses, so geoprocessing is certainly not foreign to the Hexagon community,” said Stan. “This capability has been exposed through internal interfaces and even OGC Web Processing Service (WPS) to a limited extent. Over the last year, we have been developing a standalone, highly scalable service to be used in executing these processes, but we were not thrilled to expose this service through the XML-based WPS. The stars started to align when OGC changed its focus to a more RESTful approach, with standards defined with REST interfaces and using GeoJSON. We felt this fit much better in our roadmap as it pertained to geoprocessing.”
OGC API - Processes is just one Standard from the new family of OGC API Standards being developed by the OGC Community. OGC API Standards define modular APIs that spatially enable Web APIs in a consistent way - making them “building blocks for location.” OGC API Standards make use of the popular OpenAPI specification, so are easy to implement and access.
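As a rough illustration of what executing a process through OGC API - Processes looks like from a client's perspective, the sketch below POSTs an execution request to a hypothetical Processes endpoint. The server URL, process identifier, and input names are assumptions made for the example; the request and response shapes follow OGC API - Processes - Part 1: Core, where a synchronous execution returns the result directly and an asynchronous one creates a job to poll.

```python
import requests

BASE_URL = "https://example.org/ogcapi"      # hypothetical server implementing OGC API - Processes
PROCESS_ID = "ndvi"                          # hypothetical process identifier

# Execution request body per OGC API - Processes - Part 1: Core; input names are illustrative.
body = {
    "inputs": {
        "collection": "sentinel-2-l2a",
        "bbox": [-74.3, 40.5, -73.7, 40.9],
        "datetime": "2021-07-01/2021-07-31",
    }
}

resp = requests.post(
    f"{BASE_URL}/processes/{PROCESS_ID}/execution",
    json=body,
    headers={"Prefer": "respond-async"},     # ask for asynchronous execution if supported
    timeout=60,
)

if resp.status_code == 201:
    # Asynchronous execution: a job was created; poll its status at the Location header.
    print("Job status URL:", resp.headers.get("Location"))
else:
    resp.raise_for_status()
    print("Synchronous result:", resp.json())
```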
“Regarding the benefits of developing interfaces based on OGC API - Processes, we see positive gains on both the backend and frontend,” continued Stan. “First, the development of the APIs has been very easy to pick up by developers who may not have had a lot of exposure to OGC in the past. Lots of tools are available to help with auto-generated code, and the use of the OpenAPI Specification 3.0 has been a valuable way to provide abstracted access to our service.
“Secondly, and maybe even more important, is the benefit of easy integration. We were able to build the Geoprocessing Server as a standalone component so other groups within Hexagon could take advantage of its use. Exposing our interface using OGC API - Processes helps us to share within our own division, but we have also found it makes it easier to convince other divisions to implement based on an international standard rather than a home grown approach.”
Hexagon also recently participated in the July 2021 OGC API Virtual Code Sprint (Engineering Report here). To ensure that all of the new OGC API standards are as developer-friendly, usable, and mature as possible before release, each draft specification is being put through one or more code sprints to test and improve their ‘readiness’ before starting the OGC standards approval process. At the Sprint, Hexagon’s Steven McDaniel demo’ed the integration of OGC API - Processes into Geoprocessing Server and Catalog Explorer and how they enable anyone in an organization to easily run geospatial processing models built in the visual workflow builder, Spatial Modeler.
Hexagon's Geoprocessing Server uses OGC API - Processes to enable anyone in an organization to create value-added data products leveraging Spatial Models or processes from other processing engines [click to enlarge]
“By participating in the Sprint, we were able to quickly get answers to questions pertaining to the specification and test and compare our implementation in real time with other implementations,” said Stan. “One-on-one discussions with the specification creators and other implementers helped us better understand the specification. Hopefully, our input helped to smooth out the rough points in the specification and its documentation. Our participation also led to a few new capabilities needed in the specification that Hexagon felt were minimum requirements within our product's implementation.”
With such a long history at OGC, it’s great to see that Hexagon still gains so much from, and contributes so much to, membership in the OGC Community. From their collaboration with geospatial experts, to providing and gaining insight into early technology trends and standards development, OGC is proud to count Hexagon among its Principal Members.
To learn more about how the family of OGC API Standards work together to provide modular “building blocks for location” that address both simple and the most complex use-cases, visit ogcapi.org.
To learn more about Hexagon’s Power Portfolio 2022, including its support of OGC APIs, visit hexagongeospatial.com.
To learn more about the benefits of OGC Membership, visit ogc.org/ogc/membership-value.
-
12:48
Towards a Cloud-Native OGC
sur Open Geospatial Consortium (OGC)
Tags: Visiting Fellow, cloud, cloud-native geospatial, STAC, COG, ogcapi, OGC API
Contributed by: Chris Holmes, OGC Visiting Fellow
This is Part 1 of a two-part post. See Part 2 of this blog post, 'Towards a Cloud-Native Geospatial standards baseline', here.
About six months ago I started as the first ‘Visiting Fellow’ of the Open Geospatial Consortium. It’s been a true pleasure to explore various aspects of OGC more deeply, working with staff and members. The time has flown by, and so I wanted to share my progress and some thoughts on what comes next.
The open-ended scope of the fellowship was amazing, but I realized that I’d quickly have to focus if I was to actually make an impact while working a half day a week for six months. The theme that emerged I call ‘Cloud-Native OGC’, exploring the fundamental components that enable geospatial standards on the cloud, at a level ‘below’ APIs.
This is an evolution of the ideas I presented four years ago in a blog series called ‘Cloud-Native Geospatial’, which opened with the question ‘What would the geospatial world look like if we built everything from the ground up on the cloud?’. I’ve spent much of my time since then focused on two core parts of that transformation: Cloud Optimized GeoTIFFs and SpatioTemporal Asset Catalogs. We’ve seen some incredible early adoption of both of those formats, but it’s been mostly centered on multi-spectral satellite imagery, which is only a small corner of the overall geospatial world. So my time as an OGC Visiting Fellow has been spent on a riff on that original question: ‘What would geospatial standards look like if they were built for the cloud?’ I was able to take the time to look at the entire geospatial landscape, not just imagery, and the potential for OGC to play the key leadership role in making the Cloud-Native Geospatial vision a reality.
The Cloud Native Geospatial Vision
In digging in more, I found that OGC’s existing standards work could easily evolve to align the industry on Cloud-Native Geospatial architectures. There is no organization better situated to make it a reality than OGC: it is already trusted by every government as the steward of geospatial standards and has the largest community of geospatial experts working together, across commercial, non-profit, government, and academia.
Before I go deep into details of the standards necessary to support this, it’s worth a full articulation of the future state enabled by this vision.
The OGC mission is to ‘make location information Findable, Accessible, Interoperable, and Reusable (FAIR)’. Cloud-Native Geospatial shares the exact same goal, but leverages the cloud to radically simplify the effort needed to make geospatial data FAIR. Instead of forcing data providers to stand up, maintain, and scale their own APIs, the requirement should be as simple as using the right cloud-native geospatial format and metadata, and uploading it to any cloud. All the APIs and scalability come from the cloud itself, enabling geospatial to ride the continuous waves of innovation in the broader IT world instead of continually playing ‘catch up’.
A core aim of cloud-native geospatial is to decrease the burden on data providers, and in turn enable far more geospatial data to be FAIR. The only cost that providers should need to pay is for the cloud storage, which currently is between US$1 and US$5 a month for 100 gigabytes of data. If that core data is hosted on the cloud, then general cloud-native technologies enable the cost equation to be flipped on its head, as the users of the data pay for any computation they do, and with ‘requester pays’ the users even cover the egress costs.
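For a sense of how ‘requester pays’ works in practice on one cloud provider, the sketch below reads an object from a hypothetical Amazon S3 bucket that has Requester Pays enabled, using boto3; the requester's account is billed for the request and data transfer rather than the data provider's.

```python
import boto3

s3 = boto3.client("s3")  # uses the requester's own AWS credentials

# Hypothetical bucket/key published by a data provider with Requester Pays enabled.
response = s3.get_object(
    Bucket="example-open-geodata",
    Key="collections/city-parks/city-parks.geojson",
    RequestPayer="requester",  # acknowledge that the requester will be billed
)

data = response["Body"].read()
print(f"Downloaded {len(data)} bytes; request and egress are billed to the requester")
```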
Once the data is in the right cloud-native geospatial formats then it’s easy for anyone to stand up a traditional geospatial server, ideally one making the data available as OGC APIs. But the data itself becomes FAIR, even if it’s not in an advanced API, as the cloud plus key standards provides all that is needed to provide the data.
But things get really exciting when thinking about a whole new class of cloud-native geospatial tools that can layer on top of the core FAIR data, sitting alongside traditional geospatial services. Google Earth Engine has been operating in this future for years, enabling global scale computations that run across tens of thousands of compute nodes simultaneously to deliver answers in seconds. They have done an incredible job of curating a vast amount of data, but GEE has traditionally been a walled garden where only data ingested into GEE could tap into its capabilities. In the Cloud-Native Geospatial vision, any data on the cloud could be used by GEE (and indeed they have started to embrace the CNG vision with COG registration).
More importantly, any new cloud-scale compute tool like GEE wouldn’t need to build up its own data catalog, as it could just access the same cloud-native geo formats that other tools use. Having a suite of cloud-native geospatial tools with cheap data hosting then opens up the potential for a much longer tail of geospatial data to be FAIR: smaller organizations who have valuable information, but not the wherewithal to run servers, will embrace cloud-native geospatial because putting their data on the cloud will enable many awesome tools and analyses. Access to all the world’s information in one place, combined with infinite-scale computation, should in turn usher in a whole new wave of innovative tools that move beyond traditional geospatial analysis to finding broader patterns. Then the line between geospatial and non-geospatial information will blur once it is cloud native, greatly magnifying the potential impact of geospatial insight. But that’s worth a blog post of its own.
Getting to a critical mass of data that is actually usable by advanced tools of the cloud then opens up the possibility of real ‘geospatial search’. The main paradigm today is that you must know or find a particular geospatial server and then you can perform geospatial searches to find the information you need. There is no ‘google for location information’ because there is no standard format to ‘crawl’ like there is html for the web. Simple metadata and data formats that live on the cloud provide the core ‘crawlability’, particularly when they have an html equivalent that is crawlable by traditional web search engines, as described by the Spatial Data on the Web Best Practices.
The key thing cloud-native geospatial enables for search is access to the actual data - you can stream it directly into diverse tools that provide real value. Previous attempts at geospatial search engines would at best show a preview image, and often only a text description, and often the actual data wouldn’t even be available for direct download: it was just a search of the metadata. With cloud-native geospatial, the search tool can stream full resolution data directly in the browser, or link to more powerful tools that enable cloud-based analysis of the search results. The cloud-native geospatial vision focuses first on getting a critical mass of data to the cloud, but once there are sufficiently valuable masses of information it opens the possibility of a whole new class of technologies and companies focused on more innovative geospatial search tools.
Towards a Cloud-Native Geospatial Standards Baseline
So how do we actually make progress towards this vision? The core standards are much closer than one might expect. But it must be emphasized that achieving this vision will take far more work from the entire geospatial industry than just releasing some standards. We need a sustained effort to bring every piece of location information to the cloud in standard formats, to update every tool to be able to work with it, and to build together a whole new class of next-generation tools that show the power of having petabytes of information about the world in one place.
This means a very solid foundation to build on, enabling layers and layers of innovation on top. But this standard baseline must also be adaptable to the overall technology landscape, to be able to ride larger tech trends (like the shift from XML to JSON and to whatever will come next). The key to this is to build small pieces that are loosely coupled, with few moving parts, and really focusing on the truly geospatial components. This approach is one of radical simplicity, getting the core atomic units right to enable unimagined innovation on top.
I’ll dive deep into a practical plan to get to a minimal viable standards baseline for cloud-native geospatial, leveraging all the great work the OGC and broader geospatial community has done. But my work under the fellowship the last few months does show that we are potentially close to the baseline, and if we work together to realize that and build out the interoperable ecosystem of tools around it then the power of geospatial information will be available to all. Those of us who work in the field know the power, and if we can build simple cloud-native interoperability that makes it easy for anyone to access our data then the impact on the world will be immeasurable.
Interested in this vision? Be sure to read Part 2 of this blog post 'Towards a Cloud-Native Geospatial standards baseline'
-
16:31
INSPIRE and OGC APIs - Part 1: Modernising INSPIRE
sur Open Geospatial Consortium (OGC)
Tags: INSPIRE, ogcapi, OGC API
Within the broader context of the European Strategy for Data, the Joint Research Centre (the European Commission’s science and knowledge service) is collaborating with the EU Member States on modernising the technological stack for INSPIRE. As part of this modernisation, there is now an effort to use data standards “as is” rather than developing INSPIRE-specific extensions to them, thus facilitating the use of “off the shelf” software for the delivery of INSPIRE data. This desire for modern, simple, understandable standards has resulted in OGC API - Features this year joining OGC’s SensorThings API as another INSPIRE Good Practice - and other OGC APIs are likely to follow.
INSPIRE is a European Union initiative that came into force in 2007 to establish an infrastructure for spatial information in Europe that is geared to help to make spatial information more accessible and interoperable for a wide range of purposes supporting sustainable development.
“2021 has been really important for INSPIRE, as the community is currently in the process of evaluating the Directive, thus ensuring that it would remain fit for purpose within the new technological and policy context in Europe,” said Alexander Kotsev, Team leader, Joint Research Centre, European Commission. “From a technical and organisational perspective, we are actively working on an updated technical framework and further simplifying the technical requirements (see, for example, the open access journal article From Spatial Data Infrastructures to Data Spaces—A Technological Perspective on the Evolution of European SDIs).
Simpler standards
Obviously, the Internet and its associated technologies have changed substantially since 2007 - a time when “the cloud” was only just being understood to mean something more than impending rain, and it was still considered strange that Apple was trying to compete with the likes of Nokia in the phone market. Indeed, the accumulation of technological changes over this time has also affected how data standards are defined, developed, implemented, and used.
“If we look into how standards and their implementation is done now, compared to 14 years ago when the INSPIRE Directive entered into force, we would immediately notice several subtle differences,” said Alexander Kotsev.
“First, in the past, standards were excessively complex and tried to capture all possible use-cases - including even the most specific niche ones. This led to a substantial overhead and made standards difficult to implement and utilise. INSPIRE, fully reliant on such complex standards, inherited and further extended the requirements, assuming that hardcoding requirements in legislation was enough for clients, servers, providers, and users to follow.
“Second, a linear process was followed with a very long implementation cycle (14 years in the case of INSPIRE). Even if successful in many aspects - like establishing a community and substantially increasing the volume of available public sector geospatial data - INSPIRE fell short in certain aspects because of the high complexity and sometimes limited support by tools.
“Now, having learned from those experiences, we want to ensure that the tools are able to handle the technical solutions well, and that stakeholders can easily access the data without having to go through hundreds of pages of specific technical documentation.
“I use the opportunity to thank the OGC for facilitating this process through the many hackathons, which successfully conceptualised these web-friendly standards,” said Alexander Kotsev.
Indeed, JRC recently published “A vision for the technological evolution of Europe’s Spatial Data Infrastructures for 2030” in the form of the report INSPIRE - A Public Sector Contribution to the European Green Deal Data Space, which mentions the benefits of lightweight, agile standards - including OGC APIs - in section 6.4, ‘Agile standards.’
Bringing Benefits
The shift to simpler, modular, web-based standards brings with it many benefits to EU Member States, data providers, data users, and software developers alike.
“The benefits are manifold,” said Alexander Kotsev. “The good practices that we have endorsed provide a pragmatic approach for ensuring the public sector contribution to the setting up of the European Green Deal data space, as it is defined in the ambitious European Strategy for Data. Within that context, through the OGC APIs, INSPIRE data will easily be reusable together with other data such as those generated by citizens and private companies.
“First and foremost, users can now easily consume the data without having to read hundreds of pages of technical documentation, but instead are empowered to immediately start interacting with the data and build working prototypes in an agile manner.
“Second, data providers, which in our case are public authorities, can use the opportunity to modernise their technological stack and do a better job of serving their stakeholders. In the API4INSPIRE study, the pros/cons, costs and benefits for the Member States’ public authorities that provide INSPIRE data to make use of OGC APIs were assessed and recommendations were given in terms of how to start using them.
“Third, ensuring that the OGC APIs are recognised as INSPIRE download services would open the market and act as a catalyst for the many open source geospatial projects and software vendors.”
Into the Future
The endorsement of more OGC Standards as INSPIRE Good Practice is likely to continue as more OGC API Standards are finalised, and more community members propose their - and other OGC Standards’ - endorsement.
“The endorsement of new standards in INSPIRE is entirely based on the demand of the community, which is empowered to submit proposals for good practices based on the specific needs of the different stakeholders,” said Alexander Kotsev. “In addition to OGC API - Features, the OGC SensorThings API standard is already endorsed as an INSPIRE good practice, which gives us a powerful opportunity to not only share feature data, but also spatio-temporal observation data. I am quite confident that other standards will follow soon, such as OGC API - Records. Similarly, regarding data encoding, together with the community, we are working very actively on different data encodings such as GeoJSON and more recently GeoPackage,” said Alexander Kotsev.
So, 14 years after its creation - and at a time when the need for sustainable development is greater than ever - INSPIRE has evolved to continue to provide accessible and interoperable geospatial data to the European community and beyond. And, thanks to INSPIRE’s growing support of the OGC API Family of standards, stakeholders can access and publish it in a manner that has grown simpler and more useful over time.
Alexander Kotsev offers his thanks to his “many colleagues for their enthusiasm and hard work. In particular, big thanks to Heidi Vanparys, Jari Reni, Clemens Portele, Thijs Brentjens, Sylvain Grellet, all members of the sub-group who worked on the good practice, Michael Lutz, Marco Minghini, Jordi Escriu and the whole JRC INSPIRE team, and of course all our colleagues and friends at the OGC.”
-
17:26
Esri’s ArcGIS enables thousands of datasets, maps, and apps for location
sur Open Geospatial Consortium (OGC)
Tags: OGC API, ogcapi, Esri, Principal Member, impact
Contributed by: Adam Martin, Esri, and Jonathan Fath, OGC
Open standards aren’t just about efficiency. They allow organizations across the globe to share information effectively and securely, and can provide much-needed security for data. Standards give governments and industry alike the ability to use vast amounts of data for a range of use cases, from citizen science to Defense and Intelligence to disaster relief.
Esri’s ArcGIS implementation of OGC API - Features is a strong example of how standards can be used effectively with almost unlimited potential. It’s an open, interoperable system that drives efficiency and innovation. Like many platforms of its kind, it addresses a wide range of use cases, but also embraces interoperability through the use of open standards.
“ArcGIS products enable and amplify FAIR data principles - making our customers’ location data Findable, Accessible, Interoperable, and Reusable,” said Adam Martin, Esri product manager. “Supporting standards-based interoperability, including the new OGC API suite, is a key pillar of our product strategy and offering.”
To demonstrate ArcGIS’s recent support for OGC APIs, Esri’s Living Atlas program published a collection of U.S. National Geospatial Data Assets (NGDAs) as OGC API - Features. These NGDAs represent foundational data for the US, such as physical infrastructure, rivers and administrative boundaries, and are designated by the US Federal Geographic Data Committee. This collection is among the 8,000 datasets, maps and apps curated by the ArcGIS Living Atlas program that are of critical importance to Esri’s millions of global users.
“We are excited about this new generation of geospatial APIs and look forward to our customers using these new national foundational data services,” said Adam. “We also look forward to seeing our customers with authoritative data publish their own OGC API - Feature services using ArcGIS Online and providing feedback on both experiences.”
OGC API - Features is a multi-part standard that offers the capability to create, modify, and query spatial data on the Web and specifies requirements and recommendations for APIs that want to follow a standard way of sharing feature data. Since feature data is essentially an object with location and other geographic properties, this creates endless possibilities. By implementing OGC API - Features, ArcGIS becomes exponentially more versatile in the information that it creates and easily shares.
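To illustrate how a client discovers and queries such a service, here is a minimal sketch that reads the conformance declaration and collection list of an OGC API - Features endpoint and then requests features filtered by a bounding box. The landing-page URL and collection identifier are placeholders rather than Esri's actual service; the resource paths and the `bbox` and `limit` parameters follow OGC API - Features - Part 1: Core.

```python
import requests

BASE_URL = "https://example.org/ogcapi"  # placeholder landing page of an OGC API - Features service

# 1. Discover what the service supports (conformance classes) and what it offers (collections).
conf = requests.get(f"{BASE_URL}/conformance", timeout=30).json()
cols = requests.get(f"{BASE_URL}/collections", timeout=30).json()
print("Conformance classes:", len(conf.get("conformsTo", [])))
for c in cols.get("collections", []):
    print("-", c["id"], "-", c.get("title", ""))

# 2. Query features from one collection by bounding box (GeoJSON FeatureCollection response).
items = requests.get(
    f"{BASE_URL}/collections/hydrography-rivers/items",   # hypothetical collection id
    params={"bbox": "-106.0,35.0,-105.0,36.0", "limit": 50},
    timeout=30,
).json()
print("Features returned:", len(items.get("features", [])))
```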
OGC APIs span well beyond features. Ranging from maps, tiles, and styles to routes and other crucial forms of geospatial data, they are the building blocks for location and the next generation of the consortium’s standards. The building blocks are defined not only by the requirements of the specific standards, but also through interoperability prototyping and testing in OGC's Innovation Program, a forum for OGC members to solve difficult geospatial challenges via a collaborative and agile process that is tackling R&D in initiatives such as climate, disasters, defense, and serious gaming.
“Esri has worked within OGC for decades and will continue to support OGC standards important to our users,” said Adam of Esri’s commitment to the OGC Innovation program. “User feedback is critical for our ongoing development efforts to support this suite of OGC APIs as they mature through the consensus process.”
If you’re interested in using ArcGIS products for a project, see this short GitHub tutorial that shows how to connect ArcGIS Pro to an API that implements OGC API - Features - Part 1: Core. Developers can also use OGC API - Features with the ArcGIS JS API and Runtime SDKs. You can also find the landing page for an example OGC API - Features service here, along with the curated US NGDA collection here. If you’d like to participate in an OGC API code sprint, developers are always welcome to attend, and the sprints are free to the public. Sprints happen quarterly and are essential to helping finalize these much-needed standards.
Users can contact Esri’s product management team responsible for implementing these open standards at open [at] esri.com.
-
19:16
Paving the way forward for Building Energy Mapping and Analytics
sur Open Geospatial Consortium (OGC)
Tags: Natural Resources Canada (NRCAN), Building Energy Mapping, Climate, SDI
Contributed by: Eddie Oldfield, Senior Lead, Projects & Advisory Services, QUEST; Jessica Webster, Energy Planning Analyst, Natural Resources Canada; and Ryan Ahola, Environmental Scientist, Natural Resources Canada
Introduction - The Challenge
Building energy mapping and analysis are critical for geo-targeting energy policies and programs to accelerate the transition to a low-carbon built environment and economy. Efforts to map energy use and greenhouse gas emissions from buildings are undertaken by Canadian municipalities for energy and emissions planning purposes, supported by consulting firms, universities, and sometimes non-profit organizations. Energy mapping projects are conducted independently at different times, across different scales, and using different methods and assumptions. Yet fundamentally, the data are the same: what’s required is an understanding of the building stock and its energy-related attributes including the number of buildings and units, their respective floor areas, as well as measured historical energy use and modelled predicted energy use based on different housing or building types (known as archetypes). Despite this commonality and everyone’s best efforts, there is little coordination across initiatives and no best practices or standards widely in use. This results in duplication of effort, lost energy savings, and lost opportunities for decarbonization, climate change mitigation, and climate resilience.
The Building Energy Mapping and Analytics Concept Development Study (BEMA-CDS) addressed the challenge posed by this situation by:
- Characterizing the state of development of energy mapping and analytics for the building stock broadly; and
- Informing IT architectural practices and standards to enable mapping and analytics specifically of residential energy use and efficiency.
Initiated in December 2019, with support from Natural Resources Canada (NRCan), the study drew from a number of information sources, including past research and public consultations, relevant legislation, and ongoing related initiatives. It then developed and, in February 2020, publicly released a Request for Information (RFI) that solicited responses from a wide audience of stakeholders and organizations. Questions were posed in eight subject categories concerning the building energy mapping and analytics domain.
It targeted three principal scenarios for development and application of building energy analytics and mapping:
- Community Energy and Emissions Planning
- Utility Conservation Potential Review & Demand-Side Management Program Planning
- Federal/Provincial/Territorial Building Energy - Policies, Programs, Standards, Building Codes
A series of webinars were held in mid-2020 with RFI respondents and OGC Energy and Utility Domain Working Group members to review and workshop the responses provided, to arrive at a refined understanding of current practice, and provide input to the notional architecture.
Who Can Benefit from this Study?
Oak Ridge National Laboratory’s online software suite, AutoBEM, is a digital twin of the US's 129 million buildings that provides an energy model for utilities and owners to make informed decisions on how to best improve energy efficiency. Credit: ORNL, U.S. Dept. of Energy [click to enlarge]
The report documenting the BEMA-CDS focuses on issues surrounding data sharing and spatial data interoperability that currently stand in the way of more fully achieving the goals and value of building energy analysis. This valuable perspective benefits many stakeholders and programs, including:
- Building scientists and energy researchers
  - Suggests paths to improved data interoperability, better models, increased coordination
  - Identifies potential approaches for reducing duplication, time, and costs across organizations
  - Supports better quality control, and comparable data for planning and program evaluation
- Government policy analysts, regulatory authorities, and building codes and standards committees
  - Identifies new approaches to inform national and provincial housing retrofit incentive programs
  - Anticipates data interoperability challenges and opportunities around Alterations Codes for existing buildings
- Community energy planners
  - Municipal energy planning, including design and delivery of housing efficiency programs
  - A geospatial view offers the possibility of improved coordination with utilities through a common operating picture
- Utility demand-side management program managers
  - Anticipates need for more geospatial analysis as more renewables come online; capital cost offsetting
  - Points to “behind the meter” methods that could improve uptake for conservation and demand management (energy efficiency) programs
- The OGC Energy and Utilities Domain Working Group (DWG)
  - Supports identification of potential further R&D and standards development activities beyond the timeframe of the BEMA-CDS study, for example those that address the evaluation of decarbonization strategies.
This emerging discipline sits at the convergence of many domains and areas of professional knowledge including building science, geospatial science, data science, urban planning, and energy planning. Consequently, beyond the priority usage scenarios and specific stakeholders identified above, the BEMA-CDS will also be of interest to anyone working to advance smart cities, urban digital twins, building stock energy modelling, and/or the smart grid (sometimes referred to as the digital grid). An emerging cleantech segment known as climate tech - cleantech companies tackling climate change specifically - will also be interested if their solutions relate to energy and buildings, as will venture capitalists seeking to invest in climate tech firms. Banks, who increasingly view climate risk as lending risk, should be interested in geospatial approaches to quantify the carbon-intensity of their mortgage portfolios and support assessment of lending products for energy efficiency and renewable energy technology deployment.
Some Key Findings
A critical challenge and need identified in this study is the availability of the right spatial information elements to perform building energy analysis at various levels of generalization and specificity to improve lives and advance community goals. Across building energy mapping efforts, repetitive and non-standardized methods are used to collect, exchange, and integrate datasets. Some notable examples of persistent mapping undertaken at regional or national scales include the CityGML work in Berlin, ORNL’s digital twin AutoBEM, and the UCLA Energy Atlas. The current ad hoc approach to data collection, integration, and re-use is terribly inefficient in the face of the current climate crisis.
The idea of supporting the reusability and sharing of spatial data by treating information as infrastructure has been around for many years under the concept of Spatial Data Infrastructures (SDI). Canada has a highly developed SDI, the Canadian Geospatial Data Infrastructure (CGDI), which uses a distributed model to support access, sharing, and use of diverse spatial information. CGDI provides critical infrastructure that Canadians rely on every day, such as weather forecasts produced by Environment and Climate Change Canada. CGDI also supports future-oriented research by allowing scientists to integrate many different forms of information through location, such as within climatedata.ca.
More recently, the concept, capabilities, and design of such infrastructure have been expanding in the age of cloud computing. It makes sense in this context to consider what an “energy Spatial Data Infrastructure” (eSDI) might look like that can support diverse building energy data needs, opportunities, stakeholders, and goals identified in the report.
Challenges Identified
Other common challenges enumerated by RFI respondents - and later reaffirmed by workshop participants - related to data availability, privacy, and confidentiality, as well as considerations concerning proprietary formats. Data source methods and confidence were found to be wide-ranging and poorly documented, variously measured, modeled, inferred, estimated, and assumed. A lack of access to cost estimates for retrofits was identified by respondents as a barrier to deriving benefits from energy mapping and modeling data. From a data infrastructure and reusability perspective, the lack of an overarching data framework prevents connecting the scale and resolution of spatial data to particular use scenarios. Similarly, there are no accepted schemas for applying different archetyping approaches (clustering/classification) to different use-case scenarios and levels of organizational technical and financial capacity.
Opportunities Identified
Despite these and other challenges, numerous opportunities were identified in the study, including data access technologies that account for privacy, confidentiality, and anonymity. Promising techniques include enclave processing, anonymization by aggregation, and noise injection, sometimes referred to as differential privacy. Adaptive classification and archetyping based on sample modeling is another potential approach to fit archetyping needs to use-cases using available data. The development of national systems for consistent energy data at multiple spatio-temporal scales is identified as an opportunity that could serve a range of use-cases. National building data for comprehensive analysis of building types, energy performance, retrofit/upgrade technologies, costs, and benefits was another data-related opportunity identified. In support of greater data interoperability, opportunities around data sharing policies and standards can be organized to support critical use-cases and stakeholders, for example mandated reporting and federated contracts. Community/utility cooperation facilitated by regional or national authorities may allow stakeholders to better understand opportunities, costs, and benefits of new technologies and energy sources including renewables.
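As an illustration of the 'noise injection' technique mentioned above, the sketch below adds Laplace-distributed noise to an aggregated energy statistic, which is the basic mechanism behind differential privacy. The consumption figures, sensitivity, and privacy budget (epsilon) are invented for the example; a real deployment would need careful calibration of both.

```python
import numpy as np

# Invented example: annual consumption (kWh) aggregated over a small neighbourhood.
household_kwh = np.array([10_500, 9_800, 12_300, 8_700, 11_100])
true_total = household_kwh.sum()

# Laplace mechanism: noise scale = sensitivity / epsilon.
# Sensitivity is the most one household could change the total (an assumed cap, illustrative only).
sensitivity = 20_000.0
epsilon = 1.0          # smaller epsilon -> stronger privacy, more noise

noisy_total = true_total + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

print(f"True total:  {true_total:,.0f} kWh")
print(f"Noisy total: {noisy_total:,.0f} kWh (published value)")
```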
Notional Architecture of Energy SDI
One main output of the BEMA-CDS is what’s referred to as a “notional architecture for an eSDI.” Reflecting the architecture of all spatial data infrastructures, it is organized into broad categories, also known as tiers. On the diagram below, these can be seen on the left-hand side, starting at the bottom with data and moving up to computing, services, and applications. This architecture follows the evolution of information, from data through processing to decision support. Each line to the right of the four tiers contains generic packages that apply to the energy and buildings domain. At the top of the architecture, applications synthesizing and presenting the resulting information are shown to potentially fulfil a range of auditing, program, policy, educational, and commercial decision-support functions.
A notional architecture for an energy Spatial Data Infrastructure (eSDI)
Further to the notional architecture, the diagram below illustrates the current context in Canada. Existing data sources, standards, and applications are not fully interoperable with current energy modelling, benchmarking, and labelling platforms in common use. These platforms use and collect data that is spatially implicit, that is, having spatial attributes such as address, city, or weather region. However, the full power of spatial data interoperability, mapping, and spatial data analytics is not fully architected, operational, or available to Canadian energy decision makers.
The study and notional architecture help to inform conversations about the potential value and opportunities to undertake further eSDI architecture development - in Canada and abroad. For example, some elements such as hierarchical, relational, and semantic schemas have yet to be developed.
Next Steps
Components of a Canadian eSDI exist but are not fully architected or interoperable
Among the many issues raised, the below learning opportunities and potential next steps stand out:
- Design of an extensible and standardized national building layer, leading to both national application and improved comparability of promising building energy analysis methods.
- Sandbox activities such as interoperability pilots, modeling the mutual benefits of information sharing and data interoperability.
- Prototypes for an eSDI, demonstrating common availability of such technologies as cloud-based energy modeling, model-driven building archetypes, and enclave protocols for addressing data privacy and proprietary constraints.
- Development of energy poverty indices that take into account fine-scale socio-economic, climate, and geographic factors in assessing the impacts and mitigation of building energy costs.
These activities can take the form of data development initiatives and interoperability experiments, which in turn can contribute to standards development. In any of these activities, cross-cutting themes can be explored and elaborated, including matching spatial-temporal resolution of input and output data, and archetyping methods matching to use-cases.
Conclusion
The BEMA-CDS study documents the current state of practice and identifies challenges and opportunities in building energy mapping and analytics. It also sketches out, for the first time, a notional architecture for an energy Spatial Data Infrastructure. Shifting effort and resources from a mindset of "let's just get it done for this project" to one of "build it, maintain it, and continuously improve it" would produce efficiencies, improve the quality and timeliness of decision support, and accelerate innovation and job creation in the domain - all on top of cost savings and a reduction in GHG emissions. Urban digital twins such as AutoBEM, developed and maintained by Oak Ridge National Laboratory, or Virtual Singapore, developed by the National Research Foundation, are leading examples of what can be accomplished with this mindset and current technology.
The Building Energy Mapping and Analytics: Concept Development Study Report was recently published and is freely available on the Building Energy Mapping and Analytics Concept Development Study (BEMA-CDS) page.
The authors welcome any questions concerning the study or report:
- Eddie Oldfield (Senior Lead, Projects & Advisory Services, QUEST) eoldfield [at] questcanada.org
- Jessica Webster (Energy Planning Analyst, Natural Resources Canada) jessica.webster [at] nrcan-rncan.gc.ca
- Ryan Ahola (Environmental Scientist, Natural Resources Canada) ryan.ahola [at] nrcan-rncan.gc.ca
-
11:32
A User-centric Approach to Data Cubes
sur Open Geospatial Consortium (OGC)
Tags: geoconnexion, data cube
A version of this article originally appeared in the July/August 2021 edition of GeoConnexion International Magazine.
Geospatial data cubes are used frequently these days because they enable performant, cloud-compatible geospatial data access and analysis. But differences in their design, interfaces, and handling of temporal characteristics create interoperability challenges for anyone interacting with more than one solution. Such challenges unnecessarily waste time and money and - from a science perspective - undermine reproducibility.
To address these challenges, the Open Geospatial Consortium (OGC) and the Group on Earth Observations (GEO) invited global data cube experts to discuss the "state of the art" and find a way forward at the Towards Data Cube Interoperability workshop. The two-day workshop, conducted in late April 2021, started with a series of pre-recorded position statements from data cube providers and data cube users. These videos served as the entry point for intense discussions that not only produced a new definition of the term 'data cube', but also underscored the need for a 'user-centric', API-based approach that exposes not only the data available to the user, but also the processing algorithms that can be run on it - and allows the user to add their own. The outcomes of the Workshop have been published on the OGC & GEO Towards Data Cube Interoperability Workshop webpage.
Data cubes from the users’ perspective
Data cubes are ideally suited to cloud-based workflows, but a lack of standards makes integration of different data cubes a challenge.
Existing definitions of data cubes often focus on the data structure aspect as used in computer science. In contrast, the Towards Data Cube Interoperability workshop emphasized the need to leave these definitions behind and focus on the user's perspective. Users don't care whether the data is stored in a relational database, in a cloud-based object store, or on a file server. What users are interested in is how they can access the data and what processing algorithms they can apply to it. Any access standard should reflect this.
This led to an interesting rethinking of just what a data cube is and can be. Although no formal consensus was reached, the workshop participants generally took a user-centric definition of a geo data cube to be:
“A geo data cube is a discretized model of the earth that offers the estimated values of certain variables for each cell. Ideally, a data cube is dense (i.e., does not include empty cells) with constant cell distance for its spatial and temporal dimensions. A data cube describes its basic structure, i.e., its spatial and temporal characteristics and its supported variables (aka properties), as metadata. It is further defined by a set of functions. These functions describe the available discovery, access, view, analysis, and processing methods by which the user can interact with the data cube.”
As this shows, the data cube is described in terms of the user, not the data. It does not matter whether the data cube contains one, two, or three spatial dimensions, or whether time is given its own dimension(s), is just part of the metadata of an observation, or isn't relevant to the data at all. Similarly, it doesn't matter how the data is stored. What will unify these heterogeneous data cubes is their use of a standardised [HTTP-based] API as their method of access and interaction.
The main concern of the user is what functions the data cube instance offers to apply to the data. These functions are what primarily differentiate the user-centric data cube definition from other definitions. A user needs to understand what questions can be asked to access data that fulfills specific filter criteria, how to visualize specific (sub-)sets of data, or how to execute analytical functions and other processes on the data cube. If supported, the user also needs to understand how to add their own processes to the data cube so that they can be executed directly on the data cube without the need to transfer vast amounts of data out of the cloud.
This isn’t to say that all other characteristics - such as spatial and temporal details (e.g., being dense or sparse, overlapping or perfectly aligned, constant or inconstant distances), and property details (scales of measurements, incomplete data, interpolation methods, error values, etc.) - are of no concern to the user: they still need to be known. As such, they will be provided via the data cube API as metadata, so that the user can take them into account when assessing how best to process the data.
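To make this concrete, here is a rough sketch of the kind of interaction such a user-centric data cube API could enable. The base URL, paths, parameters, and JSON fields are invented placeholders - no finished OGC data cube API existed at the time of writing - but the pattern (read the cube's self-description, subset the data, run a server-side process) follows the definition above.

```python
# Hypothetical sketch of a user-centric data cube interaction over HTTP.
# The endpoint, paths, parameters, and JSON fields are invented placeholders;
# they are not a published OGC API. The point is the interaction pattern:
# 1) read the cube's self-description, 2) subset data, 3) run a process.
import requests

BASE = "https://example.org/cubes/land-surface-temperature"   # hypothetical

# 1. Discover the cube's structure: dimensions, variables, available functions.
meta = requests.get(BASE, timeout=30).json()
print(meta.get("dimensions"), meta.get("variables"), meta.get("functions"))

# 2. Access a spatio-temporal subset without downloading the whole cube.
subset = requests.get(
    f"{BASE}/data",
    params={
        "bbox": "5.9,47.3,15.0,55.1",            # roughly Germany
        "datetime": "2019-06-01/2019-08-31",
        "variables": "lst",
    },
    timeout=60,
)
subset.raise_for_status()

# 3. Run a server-side analysis so only results leave the cloud.
job = requests.post(
    f"{BASE}/processes/heatwave-days/execution",  # hypothetical process
    json={"inputs": {"threshold_celsius": 30}},
    timeout=60,
).json()
print(job.get("status"), job.get("links"))
```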
Interoperability through a Data Cube API
Integrating different data cubes isn't an unsolvable puzzle.
Where does this leave OGC? We think an API-based, flexible approach to standards will provide end users, software developers, and data cube operators with the best experience.
For end users: a single, simple, standardised HTTP API to learn and/or code for, no matter where the data resides, means that an increased selection of available software (including low- or no-code platforms) will support an increased choice of data cube providers and a growing number of processing algorithms. From a scientific perspective, this means that an atmospheric scientist doesn't also have to be a Python expert: they could use a low- or no-code platform GUI to create an algorithm that processes the data for their heatwave study across Germany. Another atmospheric scientist could then take that same processing algorithm and apply it to the UK with minimal changes - even if the required data is held by a different standards-compliant data provider. This approach greatly increases the transparency and repeatability of scientific studies and other valuable analysis tasks.
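As a sketch of that portability, and continuing the hypothetical API from the earlier example, re-running the same heatwave analysis against a different standards-compliant provider would amount to changing the endpoint and the area of interest, not rewriting the algorithm. Again, the URLs and process name are invented for illustration.

```python
# Hypothetical continuation of the earlier data cube API sketch: the same
# analysis request is sent to two different (invented) standards-compliant
# providers - only the endpoint and the area of interest change.
import requests

base_inputs = {
    "threshold_celsius": 30,
    "datetime": "2019-06-01/2019-08-31",
}

providers = {
    "https://cubes.example-de.org/lst": [5.9, 47.3, 15.0, 55.1],   # ~Germany
    "https://cubes.example-uk.org/lst": [-8.6, 49.9, 1.8, 60.9],   # ~UK
}

for base_url, bbox in providers.items():
    body = {"inputs": {**base_inputs, "bbox": bbox}}
    resp = requests.post(f"{base_url}/processes/heatwave-days/execution",
                         json=body, timeout=60)
    print(base_url, resp.status_code)
```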
For software developers: a single, simple, standardised HTTP API means that software developers don’t have to design their own vendor-specific methods for providing access to data cubes in their software. Instead, they interact with data cubes via HTTP calls, thus benefiting from simple standard Web communication, rather than interactions on the programmatic level. By coding to an agreed-upon standard, developers can work with any compliant data cube while minimizing cube-specific adaptations. This increases the usability of the software, while decreasing the development and maintenance costs.
For data cube operators: using a single, simple, standardised HTTP API reduces development and maintenance costs while broadening the customer base. Being standards-compliant allows you to reach customers using any compliant software package, rather than just those using a select list of software coded to work with your specific instance. This means that more people will be coding for your data cube, even if they don't know your service exists.
What’s next for OGC?
Data cubes come in many different shapes and sizes - a standard API would simplify their use.
It's early days yet, but you can expect to see a data cube-related API become part of our family of OGC API standards. Work towards such a data cube API builds upon the work of our Earth Observation Exploitation Platform (see An App Store For Big Data, in GeoConnexion International, July/August 2020), and is currently underway as part of OGC Testbed-17.
If you’re interested in learning about OGC’s approach to standardising access to data cubes, OGC Members can follow their early development as Observers in OGC’s Testbed-17. Alternatively, OGC Members can join the Earth Observation Exploitation Platform Domain Working Group. Detailed outcomes from the Workshop are available on the OGC & GEO Towards Data Cube Interoperability Workshop webpage.
-
18:58
Major revision of the Geospatial Information Management Standards Guide endorsed by United Nations member nations
sur Open Geospatial Consortium (OGC)
Contributed by: Mark Reichardt
At the Eleventh Session of the United Nations Global Geospatial Information Management (UN-GGIM) Committee of Experts, held in late August 2021, member nations endorsed a key revision of the UN-GGIM Guide to the Role of Standards in Geospatial Information Management. The goal of the Guide is to “provide detailed insights on the standards and good practices necessary to establish and maintain geospatial information management systems that are compatible and interoperable with other systems within and across organizations. The Guide also underscores the importance of standards in facilitating the application of the FAIR (Findable, Accessible, Interoperable, and Reusable) data principles - promoting improved policymaking, decision making and government effectiveness in addressing key social, economic, and environmental topics, including attainment of Sustainable Development Goals”.
This endorsement represents the culmination of the work of a team of over 35 members and staff of the three Standards Development Organizations (SDOs): the Open Geospatial Consortium, the International Organization for Standardization (ISO) Technical Committee 211 on Geographic Information/Geomatics (ISO/TC 211), and the International Hydrographic Organization (IHO). The team began its six-month revision effort in January 2021.
The revision process for the UN-GGIM Guide to the Role of Standards in Geospatial Information Management has several key goals:
- Update the Guide to reflect recent advancements in geospatial standards, reinforcing learning resources and community implementation examples;
- Align the Guide with the UN-GGIM Integrated Geospatial Information Framework (IGIF) - the overarching strategy and guide for implementing geospatial information management in nations worldwide; and
- Transition the Guide from a traditional static publication to an easy-to-maintain web presence, while providing the ability for users to create a static, printed version of the document for offline use.
Committee of Experts representatives from member nations and observer organizations committed to review and comment on the revised Guide, including the identification of additional resources and community implementation examples to help implementers better understand the context and value of standards as an underpinning component of geospatial information management programs. Such resources will further help implementing organizations establish solutions that "interoperate" to support geospatial data sharing, maintenance, and decision-making across organizations, jurisdictions, and systems. The SDO Guide team expects to have the Guide available as an online resource by January 2022.
OGC, ISO/TC 211, and IHO member representatives and staff have dedicated their time and energy to this effort, and OGC is proud of this long-standing collaboration and of our commitment to supporting UN-GGIM to enable FAIR geospatial information management worldwide.
For more information about the Guide, including sponsorship opportunities to help defray the costs of implementing and maintaining the online Guide, contact the SDO team via email at: UNStdsGuideComments [at] lists.ogc.org.
-
22:13
Three reasons why New Space is valuable to the location community - and vice-versa
sur Open Geospatial Consortium (OGC)
“Everybody has to have an interest in solving global problems. Unless one has entirely lost touch with reality. - And such people do exist.” Dennis Snower, “Who We Were”
Powering solutions that address global problems is one of the drivers behind OGC’s efforts to simplify data integration, and New Space, as an emerging domain, is simultaneously offering exciting solutions while creating integration challenges. The topic remains a point of discussion across the location community, as well as at OGC Member Meetings, often revealing many questions.
So, to help those that are not too familiar with the concept of New Space, I shall answer three common questions that help illustrate what New Space is, why it matters, and how it is evolving technologies and standards alike.
If you’re interested in anything New Space, you’re encouraged to attend our next (virtual) Member Meeting, the week of September 13, 2021. In particular, come along to our first New Space Summit - an event that will highlight the importance of the domain and why technology and standards matter. Registration is available at meet.ogc.org.
What is New Space and why is it so important to the Geospatial Community?
‘New Space’ is a paradigm driven by a combination of technology and market advances such as rocket launches, small satellites, orbital planes, evolving sensors, and ground infrastructures. While the ‘commercialization’ of space isn’t a new occurrence, it is only fairly recently that space technologies have become accessible enough that the market has really ‘taken off’ (if you’ll excuse the pun), resulting in a proliferation of players, services, methodologies, and technologies.
This leads to a core challenge: the FAIRness of data, information, and other derivatives. FAIRness in this case is the foundation and the enabler for the full, efficient, and sustainable exploitation of New Space: how can one Find relevant data and information? Is it easily Accessible? Can it Interoperate with existing datasets and systems? And is it, and any derived products, Re-usable by others?
These concerns are not trivial: recent years have shown that our species as a whole needs to address critical global challenges, such as our impact on the climate and environment, more frequent natural disasters (including pandemics), compromised food security, and more. New Space technologies, with their inherently global perspective, will play a valuable role by providing much of the data needed to address these challenges.
Similarly, the New Space community needs to address smaller-scale problems of space debris as well as judge the benefits and costs of space exploration - to name just a few. There are many questions related to New Space, and as the technologies evolve, so too does the list of related challenges.
With this in mind, the location community needs an accessible, informed forum to discuss and better understand the impacts - positive and negative - of New Space on standards, data integration, and therefore effective decision making.
OGC, with its membership containing the full spectrum of experts - from designers, to providers, to end-users of New Space and related technologies - is ideally suited for just such a forum. This full spectrum perspective provides invaluable viewpoints on the practical considerations required to design useful standards, refine best practices, create valuable partnerships, and research & develop new technologies during Innovation Initiatives.
These sorts of discussions will occur at the New Space summit, part of OGC’s 120th Member Meeting, on Wednesday September 15. Register now at meet.ogc.org.
How are problems in New Space being addressed by OGC and the greater location community?
OGC members lead the exploitation of New Space technologies, data, and solutions by:
- Running R&D initiatives under the OGC Innovation Program.
- Developing standards, such as OGC APIs, as well as best practices that help make the data generated by New Space technologies align with the FAIR data principles.
- Bridging, building, linking, and involving a global community of space data providers, users, and integrators.
Leadership through the OGC Innovation Program
The OGC Innovation Program enables OGC members to solve the latest and hardest geospatial challenges via an agile, collaborative process. OGC members (sponsors and technology implementers) from across the location community come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards.
Recently, OGC Innovation has developed, demonstrated, and documented a large number of open standards-based technologies that address some of the challenges faced by organizations across the New Space domain, including:
- Software architectures that allow the execution of data processing applications on the same infrastructure hosting the data (the 'application to the data' principle), minimizing data transport costs.
- Discovery and access interfaces to optimize data handling, through our OGC APIs.
- Data cubes to store, transport, and access multi-dimensional data efficiently.
- Linked data approaches that help to achieve a higher level of interoperability by providing additional machine-readable information about the data.
Currently, the Innovation Program is exploring the use of New Space data in the context of natural disasters. In the current OGC Disaster Pilot 2021, more than 25 participating organisations are exploring the use of hybrid, scalable, cloud-based systems in which advanced AI processing, machine learning algorithms, and simulation models work where earth observation and other data is already uplinked, prepared, and curated. The goal is to generate analysis-ready situational data with the characteristics, scale, and speed required in the wake of a natural disaster, such as a landslide, flood, or pandemic.
Impact on International Standards and Best Practice Setting
Organisations across the globe use OGC APIs and other Standards to power their applications and solutions. By engaging with OGC's Standards Program, organizations can stay on top of current technology trends and better understand the interoperability needs and requirements to unlock the full potential of New Space and other Earth Observation data.
Examples of OGC standards and working groups relevant to, and used in, the Earth Observation domain include:
- GeoAI DWG - a forward-looking group bringing order to chaos around a disruptive technology.
- Coverages SWG - it's all about EO data cube management and analysis, connecting better data management approaches to produce analysis-ready data.
- Discrete Global Grid Systems (DGGS) DWG - these data repositories at national, continental, and global scales advance the management of, and linkages to, very large multi-resolution and multi-domain datasets. They enable the next generation of analytic processes to be applied across sensors, data types, and coordinate reference systems.
- OGC API - Maps, - Processes, - Records, - Tiles, and the SensorThings API - used in virtually all work involving Earth Observation imagery and data. This is the next generation of standards - data-centric rather than web-service-centric - intended to simplify their implementation, discovery, and use. (A small SensorThings query example follows this list.)
- EO Product Metadata and OpenSearch SWG - we work here to improve the findability and accessibility of Earth Observation data.
- Sensor Web Enablement DWG - it's about all levels of sensor sophistication, including those used on EO platforms. This group works to identify best practices that improve the interoperability of Space/Sky/Surface sensor data.
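As an example of one of these standards in use, the sketch below queries a hypothetical OGC SensorThings API endpoint for recent observations. The service URL is an invented placeholder; the query options ($top, $orderby, $filter, $expand) follow the SensorThings query syntax, but the exact entities and properties exposed will depend on the actual service.

```python
# Hypothetical sketch: querying an OGC SensorThings API service for recent
# observations. The service URL is an invented placeholder; the query
# options ($top, $orderby, $filter, $expand) follow SensorThings syntax.
import requests

STA_ROOT = "https://example.org/sensorthings/v1.1"   # hypothetical endpoint

resp = requests.get(
    f"{STA_ROOT}/Observations",
    params={
        "$top": 5,
        "$orderby": "phenomenonTime desc",
        "$filter": "phenomenonTime ge 2021-09-01T00:00:00Z",
        "$expand": "Datastream($select=name,unitOfMeasurement)",
    },
    timeout=30,
)
resp.raise_for_status()

for obs in resp.json().get("value", []):
    print(obs["phenomenonTime"], obs["result"],
          obs.get("Datastream", {}).get("name"))
```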
OGC working groups meet up during most quarterly OGC Member Meetings to discuss developments since the previous meeting, and actions for the next. Many sessions are open to non-members, so attendance is encouraged by OGC Members and non-members alike.
How can the location community best innovate using New Space technologies?
As communities of experts and technologies focused on information gathered from space continue to grow, so do the opportunities and use cases for collaborating, scaling, and sharing across the New Space domain. With this in mind, OGC and our members are committed to sharing and learning from existing knowledge, discovering new, shared interests and initiatives, and creating meaningful impacts.
Conversations concerning New Space seem to grow at every OGC Member Meeting. As such, we're very excited to host a dedicated New Space Summit at our 120th Member Meeting, the week of September 13, 2021. We are looking for organizations to get involved, so please register, or get in touch to learn more.
OGC is also always looking for individuals to join our many Domain and Standards Working Groups to help continue to drive the conversation forward. Reach out to a member of the OGC team now to learn more about joining our global community of experts.
Registration for OGC’s 120th Member Meeting, co-located with Singapore Geospatial Festival 2021, including the New Space Summit, is available at meet.ogc.org. OGC Members and non-members are encouraged to attend.
-