A Planet is a dynamic website that aggregates, most often on a single page, the content of notes, articles or posts published on blogs or websites, in order to increase their visibility and highlight relevant content in multiple formats (text, audio, video, podcast). It is an RSS feed aggregator, similar to a web portal.
You can read the post on the La Minute blog for more information about RSS!
     
    • Global Sunlight Chart

      Published: 10 June 2023, 10:32am CEST by Keir Clarke
      The ShadeMap: Direct Sunlight Chart is an interactive map which can calculate the number of hours of direct sunlight for any location on Earth. Unlike traditional sun charts, this map actually accounts for shadows cast by buildings and terrain. If a tall building or mountain blocks out the sun for part of the day this is taken into account in the sunlight chart for that location. The
    • America's Pink Migration Banana

      Published: 9 June 2023, 10:41am CEST by Keir Clarke
      The San Francisco Chronicle has published a fascinating map which visualizes net migration in US counties. On this map counties which have seen a net loss in migration are shown in pink and those that have seen a net gain are shown in blue. The map therefore provides a great overview of where Americans are moving to and from. The Where People are Moving map reveals a pink banana running down the
    • gvSIG Team: Geographic information platform of the State of Tocantins, Brazil

      Published: 9 June 2023, 7:56am CEST

      We bring you the presentation of another interesting gvSIG Online deployment project, in this case in the State of Tocantins (Brazil), which has made it possible to publish numerous layers of geographic information, structured across several geoportals.

      From this project we also highlight the development of tools in gvSIG Online for generating dashboards.

    • Lutra consulting: Virtual Point Clouds (VPC)

      Published: 8 June 2023, 4:00pm CEST

      As a part of our crowdfunding campaign, we have introduced a new method to handle a large number of point cloud files. In this article, we delve into the technical details of the new format, the rationale behind our choice, and how you can create, view and process virtual point cloud files.

      Rationale

      Lidar surveys of larger areas are often multi-terabyte datasets with many billions of points. Having such large datasets represented as a single point cloud file is not practical due to the difficulties of storage, transfer, display and analysis. Point cloud data are therefore typically stored and distributed split into square tiles (e.g. 1km x 1km), each tile having a more manageable file size (e.g. ~200 MB when compressed).

      Tiling of data solves the problems with size of data, but it introduces issues when processing or viewing an area of interest that does not fit entirely into a single tile. Users need to develop workflows that take into account multiple tiles and special care needs to be taken to deal with data near edges of tiles to avoid unwanted artefacts in outputs. Similarly, when viewing point cloud data, it becomes cumbersome to load many individual files and apply the same symbology.

      Here is an example of several point cloud tiles loaded in QGIS. Each tile is styled based on min/max Z values of the tile, creating visible artefacts on tile edges. The styling has to be adjusted for each layer separately:

      An example of individual point cloud tiles loaded in QGIS, each styled differently

      Virtual Point Clouds

      In the GIS world, many users are familiar with the concept of virtual rasters. A virtual raster is a file that simply references other raster files with actual data. In this way, GIS software then treats the whole dataset comprising many files as a single raster layer, making the display and analysis of all the rasters listed in the virtual file much easier.

      Borrowing the concept of virtual rasters from GDAL, we have introduced a new file format that references other point cloud files - and we started to call it virtual point cloud (VPC). Software supporting virtual point clouds handles the whole tiled dataset as a single data source.

      At the core, a virtual point cloud file is a simple JSON file with .vpc extension, containing references to actual data files (e.g. LAS/LAZ or COPC files) and additional metadata extracted from the files. Even though it is possible to write VPC files by hand, it is strongly recommended to create them using an automated tool as described later in this post.

      On a more technical level, a virtual point cloud file is based on the increasingly popular STAC specification (the whole file is a STAC API ItemCollection). For more details, please refer to the VPC specification that also contains best practices and optional extensions (such as overviews).
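
      To make that concrete, here is a minimal sketch of what a VPC file could look like, with one STAC Item per referenced file. This is an illustrative example, not an excerpt from the specification: the id, geometry, bbox and property values are invented, and the exact set of required fields should be checked against the VPC specification.

          {
            "type": "FeatureCollection",
            "features": [
              {
                "type": "Feature",
                "stac_version": "1.0.0",
                "id": "tile_0_0",
                "geometry": {
                  "type": "Polygon",
                  "coordinates": [[[6.00, 45.50], [6.01, 45.50], [6.01, 45.51], [6.00, 45.51], [6.00, 45.50]]]
                },
                "bbox": [6.00, 45.50, 6.01, 45.51],
                "properties": { "datetime": "2023-01-01T00:00:00Z" },
                "assets": {
                  "data": { "href": "./tile_0_0.copc.laz" }
                }
              }
            ]
          }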

      Virtual Point Clouds in QGIS

      We have added support for virtual point clouds in QGIS 3.32 (released in June 2023) thanks to the many organisations and individuals who contributed to last year's joint crowdfunding campaign with North Road and Hobu. The support in QGIS consists of three parts:

      1. Create virtual point clouds from a list of individual files
      2. Load virtual point clouds as a single map layer
      3. Run processing algorithms using virtual point clouds

      For those who prefer command line tools, PDAL wrench includes a build_vpc command to create virtual point clouds, and all the other PDAL wrench commands support virtual point clouds as input.
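
      As a sketch, building a VPC for a folder of tiles from the command line could look like the following (flag names should be verified against pdal_wrench --help):

          pdal_wrench build_vpc --output=mydata.vpc tiles/*.laz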

      Using Virtual Point Clouds

      In this tutorial, we are going to generate a VPC using the new Processing algorithm, load it in QGIS and then generate a DTM from the terrain class. You will need QGIS 3.32 or later for this. For the purpose of this example, we are using the LiDAR data provided by the IGN France data hub.

      In QGIS, open the Processing toolbox panel and search for the Build virtual point cloud (VPC) algorithm (located in the Point cloud data management group):

      VPC algorithm in the Processing toolbox

      In the algorithm’s window, you can add point cloud layers already loaded in QGIS or alternatively point it to a folder containing your LAZ/LAS files. It is recommended to also check the optional parameters:

      • Calculate boundary polygons - QGIS will be able to show the exact boundaries of data (rather than just rectangular extent)

      • Calculate statistics - will help QGIS to understand ranges of values of various attributes

      • Build overview point cloud - will also generate a single “thinned” point cloud of all your input data (using only every 1000th point from the original data). The overview point cloud will be created next to the VPC file - for example, for mydata.vpc, the overview point cloud would be named mydata-overview.copc.laz

      VPC algorithm inputs, outputs and options

      After you set the output file and start the process, you should end up with a single VPC file referencing all your data. If you leave the optional parameters unchecked, the VPC file will be built very quickly, as the algorithm will only read the metadata of the input files. With any of the optional parameters set, the algorithm will read all points, which can take some time.
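
      The same step can also be scripted from the QGIS Python console. The sketch below is a best-effort illustration: the algorithm ID and the parameter names are assumptions, so verify them for your installation (for example with processing.algorithmHelp()):

          # Minimal sketch, run from the QGIS Python console.
          # The algorithm ID "pdal:virtualpointcloud" and the parameter names
          # below are assumptions - verify them for your QGIS version.
          import glob
          import processing

          processing.run(
              "pdal:virtualpointcloud",  # assumed ID of "Build virtual point cloud (VPC)"
              {
                  "LAYERS": glob.glob("tiles/*.laz"),  # input files (or loaded layers)
                  "BOUNDARY": True,      # calculate boundary polygons
                  "STATISTICS": True,    # calculate statistics
                  "OVERVIEW": True,      # build overview point cloud
                  "OUTPUT": "mydata.vpc",
              },
          )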

      Now you can load the VPC file in QGIS like any other layer - using the QGIS browser, the Data sources dialog, or by dragging and dropping it from a file browser. After loading a VPC in QGIS, the 2D canvas will show the boundaries of the individual files - and as you zoom in, the actual point cloud data will be shown. Here is a VPC loaded together with the overview point cloud:

      Virtual point cloud (thinned version) generated by the VPC algorithm

      Zooming in QGIS in a 2D map with elevation shading - initially showing just the overview point cloud, later replaced by the actual dense point cloud:

      VPC output in 2D: displaying details when zooming in

      In addition to 2D maps, you can view the VPC in a 3D map window too:

      If the input files for a VPC are not COPC files, QGIS will currently only show their boundaries in 2D and 3D views, but processing algorithms will work fine. It is however possible to use the Create COPC algorithm to batch convert LAS/LAZ files to COPC files, and then build a VPC from the COPC files.

      It is also worth noting that VPCs work with input data that is not tiled - for example, in some cases data are distributed as flightlines (with lots of overlap between files). While QGIS handles this fine, for the best performance it is generally recommended to tile such datasets first (using the Tile algorithm) before doing further display and analysis.

      Processing Data with Virtual Point Clouds

      Now that we have the VPC generated, we can run other processing algorithms. For this example, we are going to convert the ground class of the point cloud to a digital terrain model (DTM) raster. In the QGIS Processing toolbox, search for the Export to raster algorithm (in the Point cloud conversion group):

      A VPC layer can be used as an input to point cloud processing algorithms

      This will use the Z values from the VPC layer and generate a terrain raster at a user-defined resolution. The algorithm will process the tiles in parallel, taking care of edge artefacts (at the edges, it will also read data from the neighbouring tiles). The output of this algorithm will look like this:

      Converting a VPC layer to a DTM
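
      For a scripted variant of this step, something along these lines should work - the algorithm ID, the parameter names and the classification filter syntax below are assumptions to verify against your QGIS version:

          # Minimal sketch: rasterize ground points from a VPC into a DTM.
          # "pdal:exportraster" and the parameter names are assumptions -
          # verify with processing.algorithmHelp() in your install.
          import processing

          processing.run(
              "pdal:exportraster",  # assumed ID of "Export to raster"
              {
                  "INPUT": "mydata.vpc",
                  "RESOLUTION": 1.0,  # output cell size in layer units
                  "FILTER_EXPRESSION": "Classification = 2",  # ground points only
                  "OUTPUT": "dtm.tif",
              },
          )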

      The output raster contains holes where there were no points classified as ground. If needed for your use case, you can fill the holes using the Fill nodata algorithm from GDAL in the Processing toolbox and create a smooth terrain model from your input virtual point cloud layer:

      Filling the holes in the DTM
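
      Scripted, the hole-filling step could look like the following sketch. gdal:fillnodata is a long-standing QGIS algorithm, but the parameter values here (band, search distance) are only illustrative defaults:

          # Minimal sketch: interpolate nodata holes in the DTM.
          import processing

          processing.run(
              "gdal:fillnodata",
              {
                  "INPUT": "dtm.tif",   # raster produced by the previous step
                  "BAND": 1,
                  "DISTANCE": 10,       # max search distance, in pixels
                  "OUTPUT": "dtm_filled.tif",
              },
          )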

      Virtual point clouds can also be used with any other algorithm in the point cloud processing toolbox. For more information about the newly introduced algorithms, please see our previous blog post.

      All of the point cloud algorithms also allow setting a filtering extent, so even with a very large VPC it is possible to run algorithms directly on a small region of interest without having to create temporary point cloud files. Our recommendation is to have the input data ready in COPC format, as this format provides more efficient access to the data when spatial filtering is used.

      Streaming Data from Remote Sources with VPCs

      One of the most useful features of VPCs is that they work not only with local files: they can also reference data hosted on remote HTTP servers. Paired with COPC, point cloud data can be streamed to QGIS for viewing and/or processing - that means QGIS will only download small portions of a virtual point cloud, rather than having to download all the data before it can be viewed or analysed.

      Using IGN’s lidar data provided as COPC files, we have built a small virtual point cloud ign-chambery.vpc referencing 16 km² of data (nearly 700 million points). This VPC file can be loaded in QGIS and used for 2D/3D visualisation, elevation profiles and processing, with QGIS handling data requests to the server as necessary. Processing algorithms take only a couple of seconds if the selected area of interest is small (make sure to set the “Cropping extent” parameter of algorithms).
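
      Loading such a remote VPC can also be done programmatically. In this sketch the URL is hypothetical and the provider key "vpc" is an assumption - if it does not work, load the file once through the GUI and inspect layer.providerType():

          # Minimal sketch: add a remote VPC as a point cloud layer.
          from qgis.core import QgsPointCloudLayer, QgsProject

          uri = "https://example.com/lidar/ign-chambery.vpc"  # hypothetical URL
          layer = QgsPointCloudLayer(uri, "ign-chambery", "vpc")  # provider key assumed
          if layer.isValid():
              QgsProject.instance().addMapLayer(layer)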

      All this greatly simplifies data access to point clouds:

      • Data producers can use very simple infrastructure - a server hosting static COPC files together with a single VPC file referencing those COPC files.

      • Users can use QGIS to view and process point cloud data as a single map layer, with no need to download large amounts of data; QGIS (and PDAL) take care of streaming data as needed.

      We are very excited about the opportunities that virtual point clouds bring to users, especially when combined with the COPC format and access from remote servers!

      Thanks again to all contributors to our crowdfunding campaign - without their generous support, this work would not have been possible.

      Contact us if you would like to add more features in QGIS to handle, analyse or visualise lidar data.

    • Dr. Jeff de La Beaujardiere receives OGC Lifetime Achievement Award

      Published: 8 June 2023, 3:00pm CEST by Simon Chester

      The Open Geospatial Consortium (OGC) is excited to announce that Dr. Jeff de La Beaujardiere has been selected as the latest recipient of the OGC Lifetime Achievement Award. The announcement was made last night during the Executive Dinner in the U.S. Space & Rocket Center at the 126th OGC Member Meeting in Huntsville, AL.

      Jeff has been selected for the award due to his long-standing leadership, commitment, and support for the advancement and uptake of standards used for the dissemination of Earth Science information.

      “I’m so happy that Jeff has been selected to receive the OGC Lifetime Achievement Award,” said OGC CEO, Dr. Nadine Alameh. “Jeff is more than a champion for standards, more than an OGC Gardels award winner, and more than the WMS editor and promoter: Jeff is a role model for many of us in geospatial circles, and has directly and indirectly influenced generations of interoperability enthusiasts to collaborate, to innovate, and to solve critical problems related to our Earth. From OGC and myself, I offer our congratulations and thank Jeff for his technical work – and for being such an inspiration to so many!”

      For more than 25 years, Jeff’s support of open standards and OGC’s FAIR mission has improved access to Earth science information for countless users and decision-makers around the globe. Since 1995, Jeff has focused on improving public access to scientific data by pushing for it to be discoverable, accessible, documented, interoperable, citable, curated for long-term preservation, and reusable by the broader scientific community, external users, and decision-makers. 

      In the OGC community, Jeff is best known as the Editor of the OGC Web Map Service (WMS) Specification: a joint OGC/ISO Standard that now supports access to millions of datasets worldwide. OGC WMS was the first in the OGC Web Services suite of Standards and is the most downloaded Standard from OGC. But most importantly, the OGC WMS Standard truly revolutionized how geospatial data is shared and accessed over the web. 

      Jeff was also a major contributor to other OGC Standards, including the OGC Web Services Architecture, the OGC Web Map Context, OGC Web Terrain Service, and OGC Web Services Common. 

      Jeff’s journey with Standards – and his engagement with OGC – started back in 1998 when NASA was leading the effort to implement the Digital Earth program. At that time, Jeff championed interoperability standards as fundamental to realizing the Digital Earth vision. As part of his journey, he has provided leadership to the Geospatial Applications and Interoperability (GAI) Working Group of the U.S. Federal Geographic Data Committee and to the OGC Technical Committee. 

      In 2002 and 2003, Jeff served as Portal Manager for Geospatial One-Stop, a federal electronic government initiative. He led a team of experts in defining the requirements, architecture, and competitive solicitation for a Portal based on open standards and led an OGC interoperability initiative in developing and demonstrating a working implementation. This was a fast-paced, high-stakes effort involving many companies and agencies building on what today is the OGC Collaborative Solution & Innovation Program. 

      Jeff has received several awards for his leadership and impact in the many communities that he has participated in throughout his career, including the 2013 OGC Kenneth D. Gardels Award, the 2023 ESIP President’s Award, and the 2003 Falkenberg Award at AGU which honors “a scientist under 45 years of age who has contributed to the quality of life, economic opportunities, and stewardship of the planet through the use of Earth science information and to the public awareness of the importance of understanding our planet.”

      With this lifetime achievement award, OGC recognizes and celebrates Jeff’s lifetime of service and his steadfast support of FAIR geospatial information for the benefit of open science and society.

      The post Dr. Jeff de La Beaujardiere receives OGC Lifetime Achievement Award appeared first on Open Geospatial Consortium.

    • QGIS Blog: Plugin Update May 2023

      Published: 8 June 2023, 9:35am CEST

      In May, 22 new plugins were published in the QGIS plugin repository.

      Here’s the quick overview in reverse chronological order. If any of the names or short descriptions piques your interest, you can find the direct link to the plugin page in the table below the screenshot.

      Station Offset
      This plugin computes the station and offset of points along polylines and exports those values to csv for other applications
      MGP Connect
      Enable Maxar SecureWatch customers to stream imagery more effectively in QGIS.
      Triple2Layer
      This plugin imports data
      DiscordRPC Plugin for QGIS
      QGIS plugin that enables displaying a Rich Presence in Discord
      ERS
      This plugin computes pollutant concentrations around the perimeters of sensitive sites
      IPP
      This plugin calculates IPP
      Road Vectorisation
      This plugin is designed to vectorize roads on satellite images
      Image vectorisator
      Plugin for image vectorisation
      H-RISK with noisemodelling
      Sound levels and Health risks of environmental noise
      Non_electrical_vehicle
      This plugin calculates the number of non-electrical vehicles
      HOT Templates and Symbology Manager
      QGIS plugin for managing HOT map templates and symbology
      Transparency Setter
      Apply the specified transparency value to both vector and raster layers, as well as layers within the selected groups in the Layer Panel
      DBGI
      Creates geopackages that match the requirements for the DBGI project
      StyleLoadSave
      Load or Save active vector layer style
      PixelCalculator
      Interactively calculate the mean value of selected pixels of a raster layer.
      GISTDA sphere basemap
      A plugin for adding base map layers from GISTDA sphere platform ( [https:]] ).
      Adjust Style
      Adjust the style of a map with a few clicks instead of altering every single symbol (and symbol layer) for many layers, categories or a number of label rules. A quick way to adjust the symbology of all layers (or selected layers) consistently, to check out how different colors / stroke widths / fonts work for a project, and to save and load styles of all layers - or even to apply styles to another project. With one click, it allows you to: adjust the color of all symbols (including color ramps and any number of symbol layers) and labels using the HSV color model (rotate hue, change saturation and value); change line thickness (i.e. stroke width of all symbols / symbol borders); change the font size of all labels; replace a font family used in labels with another font family; save / load the styles of all layers at once into/from a given folder.
      APLS
      This plugin performs Average Path Length Similarity
      qaequilibrae
      Transportation modeling toolbox for QGIS
      QGPT Agent
      QGPT Agent is LLM Assistant that uses openai GPT model to automate QGIS processes
      FuzzyJoinTables
      Join tables using min Damerau-Levenshtein distance
      Chandrayaan-2 IIRS
      Generates reflectance from Radiance data of Imaging Infrared Spectrometer sensor of Chandrayaan 2
    • The Privatisation of East Germany

      Published: 8 June 2023, 9:16am CEST by Keir Clarke
      After the reunification of Germany in 1990 an agency was established in order to privatise East German enterprises. The Treuhandanstalt (Trust Institution) was tasked with overseeing the sale of over 8,500 state-owned companies. Under communism nearly half of all East Germans worked for the state or for state-run companies. Privatising all East German enterprises
    • gvSIG Team: GeoETL in the gvSIG Online platform: Automating data transformations

      Published: 8 June 2023, 7:59am CEST

      A considerable advantage of gvSIG Online over other products on the market is its ETL. Thanks to its ETL, gvSIG Online can integrate with other data sources quickly and without requiring development work.

      ETL is a plugin used to automate data transformation tasks, whether repetitive or not, so that the data does not have to be manipulated through code. In this way, any user can manipulate data (geometrically or otherwise) or homogenise data coming from different sources and formats.

      This is possible thanks to a canvas that graphically represents the data transformation process in an easy and intuitive way.

      If you want to see the full potential of GeoETL, don't miss the video:

    • Wildfires & Smoke Pollution

      Published: 7 June 2023, 8:42am CEST by Keir Clarke
      Wildfires in Quebec and Nova Scotia are causing high levels of unhealthy air conditions across much of eastern Canada and the northeastern United States. Over 400 fires were reported to be burning in Canada on Tuesday evening resulting in smoke pollution and dangerous levels of particulate matter 2.5 over large areas. FireSmoke Canada has an interactive smoke forecast map which provides
    • gvSIG Team: Getting to know gvSIG Mapps

      Published: 7 June 2023, 8:02am CEST

      Everyone knows gvSIG Desktop, the origin of the catalogue of solutions we call the gvSIG Suite. More and more organisations around the world are deploying gvSIG Online as their platform for managing spatial data and geoportals. And, although it is less well known, a growing number of organisations use gvSIG Mapps… either as an app for field data collection or as apps developed with its framework.

      Want to know more? We explain what gvSIG Mapps really is:

    • A GIS Degree

      Published: 30 April 2022, 10:59pm CEST by dovecaramelphobos94903

      My son decided to change majors from biodesign to GIS. I had a short moment when I almost told him not to bring all this on himself but then thought differently. I could use my years of experience to help him get the perfect degree in GIS and get a great job and still do what he wants.

      He’s one semester into the program so he really hasn’t taken too many classes. There has been the typical Esri, SPSS and Google Maps discussion, but nothing getting into the weeds. Plus he’s taking Geography courses as well so he’s got that going for him. Since he’s at Arizona State University, he’s going through the same program as I did, but it’s a bit different. When I was at ASU, Planning was in the Architectural College. Now it’s tied with Geography in a new School of Geographical Sciences & Urban Planning.

      I have to be honest, this is smart: I started my GIS career working for a planning department at a large city. The other thing I noticed is that a ton of my professors are still teaching. I mean, how awesome is that? I suddenly don't feel so old anymore.

      I’ve stayed out of his classes for the past semester in hopes that he can form his own thoughts on GIS and its applicability. I probably will continue to help him focus on where to spend his electives (more Computer Science and less History of the German Empire 1894-1910). He's such a smart kid, I know he's going to do a great job. He was one who spent time at the Esri UC Kids Fair back when I used to go to the User Conference. Now he could be getting paid to use Esri software or whatever tool best accomplishes his goals.

      I plan to show him the Safe FME Minecraft Reader/Writer.

    • GIS and Monitors

      Published: 25 October 2021, 7:15pm CEST by dovecaramelphobos94903

      If there is one constant in my GIS career, it is my interest in the monitor I'm using. Since the days of being happy for a “flat screen” Trinitron monitor to now with curved flat screens, so much has changed. My first GIS Analyst position probably had the worst monitor in the history of monitors. I can't recall the name but it had a refresh rate that was probably comparable to what was seen in the 1960s. It didn't have great color balance either, so I ended up printing out a color swatch pattern from ArcInfo and taping it on my wall so I could know what color was what.

      I stared for years at this monitor. No wonder I need reading glasses now!

      Eventually I moved up in the world where I no longer got hand-me-down hardware and I started to get my first new equipment. The company I worked for at the time shifted between Dell and HP for hardware, but generally it was dual 21″ Trinitron CRTs. For those who are too young to remember, they were the size of a small car and put off enough heat and radiation to probably shorten my life by 10 years. Yet, I could finally count on them being color corrected by hardware/software and not feel like I was color blind.

      It wasn’t sexy but it had a cool look to it. You could drop it flat to write on it like a table.

      Over 11 years ago, I was given a Wacom DTU-2231 to test. You can read more about it at that link but it was quite the monitor. I guess the biggest change between now and then is how little that technology took off. If you had asked me right after you read that post in 2010 what we'd be using in 2020, I would have said such technology would be everywhere. Yet we don't see stylus-based monitors much at all.

      These days my primary monitor is an LG UltraFine 24″ 4K. I pair it with another 24″ 4K monitor that I've had for years. Off to the other side is a generic Dell 24″ monitor my company provided. I find this setup works well for me; gone are the days when I had ArcCatalog and ArcMap open in two different monitors. Alas, two of the monitors are devoted to Outlook and WebEx Teams, just a sign of my current workload.

      I’ve always felt that GIS people care more about monitors than most. A developer might be more interested in a Spotify plugin for their IDE, but a GIS Analyst cares most about the biggest, brightest and crispest monitor they can get their hands on. I don't always use FME Workbench these days, but when I do, it is full screen on the most beautiful monitor I can have. Seems perfect to me.

    • Are Conferences Important Anymore?

      Published: 13 July 2021, 5:00pm CEST by dovecaramelphobos94903

      Hey, SOTM is going on; I didn't even know. The last SOTM I went to was in 2013, which was a blast. But I have to be honest, not only did this slip my mind, none of my feeds highlighted it to me. Not only that, apparently Esri is having a conference soon. (Wait for me to go ask Google when it is.) OK, they are having it next week. I used to be the person who went to as much as I could, either attending or invited to keynote. The last Esri UC I went to was in 2015, 6 years ago. As I said, SOTM was in 2013. FOSS4G, 2011. I had to look it up: the last conference that had any GIS in it was the 2018 Barcelona Smart City Expo.

      So with the world opening back up, or maybe not given whatever Greek-letter variant we are dealing with right now, I've started to think about what I might want to attend and the subject matter. At the end of the day, I feel like I got more value out of the conversations outside the convention center than inside, so I'll probably go wherever I see a good subset of smart people hanging out. That's why those old GeoWeb conferences that Ron Lake put on were so amazing: meeting a ton of smart people and enjoying the conversations, rather than reading PowerPoint slides in a dimly lit room.

      Hopefully we can get back to that, just need to keep my eye out.

    • Unreal and Unity are the new Browsers

      Published: 12 April 2021, 5:57pm CEST by dovecaramelphobos94903

      Someone asked me why I hadn’t commented on Cesium and Unreal getting together. Honestly, no reason. This is big news. HERE, where I work, is teaming up with Unity to bring the Unity SDK and the HERE SDK to automotive applications. I talked about how we used the Mapbox Unity SDK at Cityzenith (though I have no clue if they still do). Google and Esri have them too. In fact both the Unreal and Unity marketplaces are littered with data sources you can plug in.

      HERE Maps with Unity

      This is getting at the core of what these two platforms could be. Back in the day we had two browsers, Firefox and Internet Explorer 6. Inside each we had many choices of mapping platforms to use. From Google and Bing to Mapquest and Esri. In the end that competition to make the best API/SDK for a mapping environment drove a ton of innovation. What Google Maps looks like and does in 2021 vs 2005 is amazing.

      This brings up the key to what I see happening here. We'll see the mapping companies (or companies that have mapping APIs) deliver key updates to these SDKs (which today are pretty limited in scope) because they have to stay relevant. Not that web mapping is going away at any point, but true 3D worlds and true Digital Twins require power that browsers cannot provide even in 2021. So this rush to become the Google Maps of 3D engines is real and will be fun to watch.

      It is interesting that Google is an also-ran in the 3D engine space, so there is so much opportunity for the players who have invested and continue to invest in these markets without Google throwing unlimited R&D dollars against them. Of course it only takes one press release to change all that, so don't bet against Google.

    • Arrays in GeoJSON

      Published: 6 April 2021, 1:00pm CEST by dovecaramelphobos94903

      So my last post was very positive. I figured out how to relate the teams that share a stadium with the stadium itself. This was important because I wanted to eliminate the redundant points that were on top of each other. For those who don’t recall, I have an example in this gist:

      Now I mentioned that there were issues displaying this in GIS applications and was promptly told I was doing this incorrectly:

      An array of <any data type> is not the same as a JSON object consisting of an array of JSON objects. If it would have been the first, I'd have pointed you (again) to QGIS and this widget trick [https:]] .

      — Stefan Keller (@sfkeller) April 4, 2021

      If you click on that tweet you’ll see basically that you can’t do it the way I want and I have to go back to the way I was doing it before:

      Unfortunately, the beat way is to denormalise. Redundant location in many team points.

      — Alex Leith (@alexgleith) April 4, 2021

      I had a conversation with Bill Dollins about it and he sums it up succinctly:

      I get it, but “Do it this way because that’s what the software can handle” is an unsatisfying answer.

      So I’m stuck. I honestly don't care if QGIS can read the data, because it can; it just isn't optimal. What I do care about is an organized dataset in GeoJSON. So my question, which I can't get a definitive answer to, is: “is the array I have above valid GeoJSON?”. From what I've seen, yes. But nobody wants to go on record as saying absolutely. I could say to hell with it and move forward, but I don't want to go down a dead-end road.

    • GeoJSON Ballparks as JSON

      Published: 2 April 2021, 9:42pm CEST by dovecaramelphobos94903

      In a way it is good that Sean Gillies doesn't follow me anymore, because I can hear his voice in my head as I was trying to do something really stupid with the project. But Sheldon helped frame what I should be doing:

      tables? what the? add , teams:[{name:"the name", otherprop: …}, {name:…}] to each item in the ballparks array and get that relational db BS out of your brain

      — Sheldon (@tooshel) April 2, 2021

      Exactly! What the hell? Why was I trying to do something so stupid when the whole point of this project is baseball ballparks in GeoJSON. Here is the problem in a nutshell and how I solved it. First off, let us simplify the problem down to just one ballpark. Salt River Fields at Talking Stick is the Spring Training facility for both the Arizona Diamondbacks and the Colorado Rockies. Not only that, but there are Fall League and Rookie League teams playing there - probably even more that I still haven't researched. Anyway, GeoJSON Ballparks looks like this today when you just want to see that one stadium.

      Let’s just say I backed myself into this corner by starting with only MLB ballparks, none of which at the time were shared between teams.

      It’s a mess, right? Overlapping points, so many opportunities to screw up names. So my old-school thought was to just create a one-to-many relationship between the GeoJSON points and some external table. Madness! Seriously, what was I thinking? Sheldon is right, I should be using a JSON array for the teams. Look how much nicer it all looks when I do this!
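
      In rough form, the structure is something like this - a sketch with illustrative attribute names and approximate coordinates, not the project's exact schema:

          {
            "type": "Feature",
            "geometry": {
              "type": "Point",
              "coordinates": [-111.8432, 33.5452]
            },
            "properties": {
              "name": "Salt River Fields at Talking Stick",
              "teams": [
                { "name": "Arizona Diamondbacks", "league": "Cactus League" },
                { "name": "Colorado Rockies", "league": "Cactus League" }
              ]
            }
          }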

      Look how nice that all is! So easy to read and it keeps the focus on the ballparks.

      As I said in the earlier blog post:

      The problem now is so many teams, especially in spring training, minor leagues and fall ball, share stadiums, that in GeoJSON-Ballparks, you end up with multiple dots on top of each other. No one-to-many relationship that should happen.

      The project had pivoted in a way I hadn't anticipated back in 2014 and it was sure a mess to maintain. So now I can focus on fixing the project with the Minor League Baseball realignment that went on this year and get an updated dataset on GitHub very soon.

      One outcome of doing this nested array is that many GIS tools don’t understand how to display the data. Take a look at geojson.io:

      geojson.io compresses the array into one big JSON-formatted string. QGIS and GitHub do this also. It's an issue that I'm willing to live with. Bill Dollins shared the GeoJSON spec with me to prove that the way I'm doing it is correct:

      3.2.  Feature Object

         A Feature object represents a spatially bounded thing.  Every Feature
         object is a GeoJSON object no matter where it occurs in a GeoJSON
         text.

         o  A Feature object has a "type" member with the value "Feature".

         o  A Feature object has a member with the name "geometry".  The
            value of the geometry member SHALL be either a Geometry object
            as defined above or, in the case that the Feature is unlocated,
            a JSON null value.

         o  A Feature object has a member with the name "properties".  The
            value of the properties member is an object (any JSON object or
            a JSON null value).

      ANY JSON OBJECT! So formatting the files this way is correct and the way it should be done. I’m going to push forward on cleaning up GeoJSON Ballparks and let the GIS tools try and catch up.

    • GeoJSON Ballparks and MLB Minor League Realignment

      Published: 7 March 2021, 10:15pm CET by dovecaramelphobos94903

      **UPDATE** – See the plan.

      Boy, where to start? First, for those who haven’t been following, this happened over the winter.

      Major League Baseball announced on Friday (February 12, 2021) a new plan for affiliated baseball, with 120 Minor League clubs officially agreeing to join the new Professional Development League (PDL). A full list of Major League teams and their new affiliates, one for each level of full-season ball, along with a complex league (Gulf Coast and Arizona) team, can be found below.

      Minor League Baseball

      What does that mean? Well, for GeoJSON Ballparks basically every minor league team is having a modification to it. At a minimum, the old minor league names have changed. Take the Pacific Coast League: it existed for over 118 years and is now part of Triple-A West, a name that couldn't be more boring. All up and down the minor leagues, the names now just reflect the level of minor league the teams are. And some teams have moved from AAA to Single A and all around.

      I usually wait until Spring Training is just about over to update the minor league teams but this year it almost makes zero sense. I’ve sort of backed myself into a spatial problem, unintended when I started. Basically, the project initially was just MLB teams and their ballparks. The key to that is that the teams drove the dataset, not the ballparks even though the title of the project clearly said it was. As long as nobody shared a ballpark, this worked out great. The problem now is so many teams, especially in spring training, minor leagues and fall ball, share stadiums, that in GeoJSON-Ballparks, you end up with multiple dots on top of each other. No one-to-many relationship that should happen.

      So, I’m going to use this minor league realignment to fix what I should have fixed years ago. There will be two files in this dataset moving forward: one GeoJSON file with the locations of the ballparks and a CSV (or other format) file containing the teams. Then we'll just do the old-fashioned relate between the two and the world is better again.

      I’m going to fork GeoJSON-Ballparks into a new project and right the wrongs I have done against good spatial data management. I’m finally ready to play centerfield!

    • I’m Here at HERE

      Published: 22 February 2021, 5:46pm CET by dovecaramelphobos94903
      Attachment: [download]

      Last Tuesday I started at HERE Technologies with the Professional Services group in the Americas. I've probably used HERE and their legacy companies' data and services for most of my career, so this is a really cool opportunity to work with a mobile data company.

      I’m really excited about working with some of their latest data products including Premier 3D Cities (I can’t escape Digital Twins).

      Digital Twins at HERE
    • Digital Twins and Unreal Engine

      Published: 17 November 2020, 6:36pm CET by dovecaramelphobos94903

      I’ve had a ton of experience with Unity and Digital Twins, but I have also been paying attention to Unreal Engine. I think the open nature of Unity is probably more suited for the current Digital Twin market, but competition is so important for innovation. This project where Unreal Engine was used to create a digital clone of Adelaide is striking, but the article just leaves me wanting so much more.

      A huge city environment results in a hefty 3D model. Having strategies in place to ease the load on your workstation is essential. “Twinmotion does not currently support dynamic loading of the level of detail, so in the case of Adelaide, we used high-resolution 3D model tiles over the CBD and merged them together,” says Marre. “We then merged a ring of low-resolution tiles around the CBD and used the lower level of detail tiles the further away we are from the CBD.”

      Well, that’s how we did it at Cityzenith. Tiles are the only way to get the detail one needs in these 3D worlds, and an approach geospatial practitioners are very used to from dealing with their slippy maps. The eye-candy that one sees in that Adelaide project is amazing. Of course, scaling one city out is hard enough, but doing so across a country or the globe is another matter. Still, this is an amazing start.

      Seeing Epic take Twinmotion and scale it out this way is very exciting because as you can see from that video above, it really does look photorealistic.

      But this gets at the core of where Digital Twins have failed. It is so very easy to do the above: create an amazing-looking model of a city and drape imagery across it. It is a very different beast to actually create a Digital Twin where these buildings are not only linked up to external IoT devices and services but also import BIM models and generalize as needed. They do some rudimentary analysis of shadows, which is somewhat interesting, but this kind of stuff is so easy to do, and there are so many tools to do it, that all this effort to create a photorealistic city seems wasted.

      I think users would trade photorealistic cities for detailed IoT services integration but I will watch Aerometrex continue to develop this out. Digital Twins are still stuck in sharing videos on Vimeo and YouTube, trying to create some amazing realistic city when all people want is visualization and analysis of IoT data. That said, Aerometrex has done an amazing job building this view.

    • Moving Towards a Digital Twin Ecosystem

      Published: 10 November 2020, 9:01pm CET by dovecaramelphobos94903

      Smart Cities really start to become valuable when they integrate with Digital Twins. Smart Cities do really well with transportation networks and adjusting when things happen. Take, for example, construction on an important Interstate highway that connects the city core with the suburbs: it causes backups, and a smart city can adjust traffic lights, rail, and other modes of transportation to help mitigate the problems. This works really well because the transportation systems talk to each other and decisions can be made to refocus commutes toward other modes of transportation or other routes. But unfortunately, Digital Twins don't do a great job talking to Smart Cities.

      Photo by Victor Garcia on Unsplash

      A few months ago I talked about Digital Twins and messaging. The idea that:

      Digital twins require connectivity to work. A digital twin without messaging is just a hollow shell, it might as well be a PDF or a JPG. But connecting all the infrastructure of the real world up to a digital twin replicates the real world in a virtual environment. Networks collect data and store it in databases all over the place, sometimes these are SQL-based such as Postgres or Oracle, and other times they are simple as SQLite or flat-file text files. But data should be treated as messages back and forth between clients.

      This was in the context of a Digital Twin talking to services that might not be hardware-based, but the idea stands for how and why a Digital Twin should be messaging the Smart City at large. Whatever benefits a Digital Twin gains from an ecosystem that collects and analyzes data for decision-making become multiplied when those systems connect to other Digital Twins. But think beyond a group of Digital Twins to the benefit to the Smart City when all these buildings are talking to each other and to the city, making better decisions about energy use, transportation, and other shared infrastructure across the city or even the region (where multiple Smart Cities talk to each other).

      When all these buildings talk to each other, they can help a city plan, grow and evolve into a clean city.

      What we don’t have is a common data environment (CDE) that cities can use. We have seen data sharing on a small scale in developments but not on a city-wide or regional scale. To do this we need to agree on model standards that allow Digital Twins to talk to each other (something open like Bentley's iTwin.js) and share ontologies. Then we need that Smart City CDE where data is shared, stored, and analyzed at a large scale.

      One great outcome of this CDE is that all this data can be combined with city ordinances to give tools like Delve from Sidewalk Labs even more data to create their generative design options. Buildings are not a bubble in a city, and their impacts on the city extend beyond the boundaries of the parcel they are built on. That's what's so exciting about this opportunity: manage assets in a Digital Twin on a micro scale, but share generalized data about those decisions with the city at large, which can then share them with other Digital Twins.


      And lastly, individual Smart Cities aren't bubbles either. They have huge impacts on the region or even the country that they are in. If we can figure out how to create a national CDE, one that covers a country as diverse as the United States, we can have something that can benefit the world at large. Clean cities are the future, and thinking about them on a small scale will only result in the gentrification of affluent areas and leave less well-off areas behind. I don't want my children to grow up in a world like that, and we have the processes in place to ensure that they have a better place to grow up in than we did.

    • Quilting "Golden Light"

      Published: 14 December 2014, 2:45am CET by Aluminum Loaf

      After the success of the first experiment with printing one of my mom's photographs on fabric and quilting it last July (see Quilting Fuchsia), we selected seven more images and had those printed at Spoonflower's largest possible size (about 27 x 40 inches without distorting the image). I've since shipped and carried those pieces of fabric all around the United States and even into Canada, twice. Two weeks ago, I finally managed to find the time to sit down and start to quilt again—truly one of my very favourite/favorite activities. 

      This is the resulting quilt that I created from the photograph "Golden Light" by Elizabeth Root Blackmer (you can see the original image in the middle of her Frozen gallery at BrootPhoto.com). The image is bubbles of air trapped in the ice of a frozen pond.

      GoldenLight_Front

      First of all, I had to find a fabric store. I was staying at my house in Nova Scotia when I finally found the time to quilt, and although I have spent extended periods of time here at the house over the past seven years (I took over the family home), I had never tried to find a place to source fabric here. I mentioned to my farming neighbour/neighbor that I needed to ask his wife about a place to get fabric, and he looked at me like I was utterly obtuse. He said something along the lines of: "Everyone goes to Avonport Discount Fabric Centre over in Avonport, up behind the school—how do you not know that?" LOL.

      So I asked a few more people over the next few days as I finished up my stack of work-work, and every single person (male and female) said the same thing: go there! So I did. Well, it doesn't look like much from the outside, and it shares a big dirt parking lot with the used auto parts store next door, so I was reasonably skeptical, but oh, what an epic pleasure this place is. Fantastic materials, ample supplies, helpful staff, great prices, and generally, like so many places here in Nova Scotia, a meeting place for friends and family. I've been back quite a number of times since; it's just down the road from my house—not ten minutes away! 

      GoldenLight_AvonportFabrics

      Anyway, I found the most perfect backing fabric and thread for my project at this lovely store, and had a few wonderful quilting-related conversations with the ladies there while I wandered around looking at everything.

      GoldenLight_Materials

      For some odd reason, I decided to use the dining table as my quilting space (but it's just me here this time, so I'm not in anyone's way). It might seem odd, given that I made myself a quilting area in another room, but this space is always warm, and that space doesn't need to be heated, so I ended up out here. By the time this quilt was done, five of the six chairs had been moved away from the table to give me access to all sides of the quilt.

      GoldenLight_WorkSpace

      And here is the original printed image.

      GoldenLight_OriginalFabric

      I started by quilting the ice bubbles with gold thread and used a twisty stitch-line within each circle to make them stand out as separate elements. You can see the backing fabric here as well, a delicious mottled teal.

      GoldenLight_Quilting1

      GoldenLight_GoldThread

      Then I quilted the darker section in the upper-right quadrant with a brown top thread. I used the diagonal line that runs through the image as the dividing line between the two sections of parallel stitching, and eyeballed the entire quilt from that one line. Teal thread was used for every stitch on the back of the quilt.

      GoldenLight_BrownThread

      I used a variegated thread for the rest of the straight lines. Again, with teal thread on the back.

      GoldenLight_Thread

      Quilting so many straight lines was exhausting, but I found my rhythm after a while.

      GoldenLight_Lines

      You can really see how the variegated thread looks on the binding. I used three lines of it on the binding to make this element really stand out.

      GoldenLight_Binding

      I had to go back to the fabric store to find a suitable edge fabric to use for the binding, but then I found a bias tape that was the perfect colour/color, so I used that.

      GoldenLight_FabricStore

      And here's what the back looks like.

      GoldenLight_Back

      And here it is on the bed in one of the guest rooms ... I go in and visit it often, as I take stretch breaks from work-work and think about which image I'll quilt next ...

      GoldenLight_OnBed

      I only had one major blooper that I had to reëngineer with this quilt. Oh, and what a drag it was! My camera apparently couldn't register so much teal, so it displayed it as greyish/grayish, but even so, there's the epic snaggle of thread. Ugh.

      GoldenLight_Snarl

      And here are the test scraps that I used during this project. I can't imagine throwing them out unless I have a record of what they look like. It's that, or staple them into my diary, and that gets cumbersome.

      GoldenLight_Scraps