You can read the post on the La Minute blog for more information about RSS feeds!
Feeds
11861 items (35 unread) in 55 feeds

-
Décryptagéo, l'information géographique
-
Cybergeo (10 unread)
-
Revue Internationale de Géomatique (RIG)
-
SIGMAG & SIGTV.FR - Un autre regard sur la géomatique
-
Mappemonde (10 unread)
-
Imagerie Géospatiale
-
Toute l’actualité des Geoservices de l'IGN
-
arcOrama, un blog sur les SIG, ceux d'ESRI en particulier
-
arcOpole - Actualités du Programme
-
Géoclip, le générateur d'observatoires cartographiques
-
Blog GEOCONCEPT FR

-
Géoblogs (GeoRezo.net)
-
Conseil national de l'information géolocalisée
-
Geotribu
-
Les cafés géographiques (2 unread)
-
UrbaLine (le blog d'Aline sur l'urba, la géomatique, et l'habitat)
-
Icem7 (10 unread)
-
Séries temporelles (CESBIO)
-
Datafoncier, données pour les territoires (Cerema)
-
Cartes et figures du monde
-
SIGEA: actualités des SIG pour l'enseignement agricole
-
Data and GIS tips
-
Neogeo Technologies
-
ReLucBlog
-
L'Atelier de Cartographie
-
My Geomatic
-
archeomatic (le blog d'un archéologue à l’INRAP)
-
Cartographies numériques (2 unread)
-
Veille cartographie (1 unread)
-
Makina Corpus
-
Oslandia
-
Camptocamp
-
Carnet (neo)cartographique
-
Le blog de Geomatys
-
GEOMATIQUE
-
Geomatick
-
CartONG (actualités)
James Fee GIS Blog
-
0:50
Moving…
on James Fee GIS Blog
I’m not saying I’ll never post here again, but I think this blog has run its course. Follow me at [jamesfee.org]. Part of this is what to do about Twitter after some nutty billionaire ruins it, and partly about this conversation.
I am excited about the potential return of RSS and blogs.
— Andrew Turner (@ajturner@nullisland.social) (@ajturner) April 25, 2022
The worst case is everyone moves to “newsletters” – dark archives that aren’t durable and discoverable.
Update your RSS: [https:]]
Subscribe via weekly email: [https:]]
Follow on micro.blog: [https:]]
-
22:59
A GIS Degree
on James Fee GIS Blog
My son decided to change majors from biodesign to GIS. I had a short moment when I almost told him not to bring all this on himself, but then I thought differently. I could use my years of experience to help him get the perfect degree in GIS, get a great job, and still do what he wants.
He’s one semester into the program so he really hasn’t taken too many classes. There has been the typical Esri, SPSS and Google Maps discussion, but nothing getting into the weeds. Plus he’s taking Geography courses as well so he’s got that going for him. Since he’s at Arizona State University, he’s going through the same program as I did, but it’s a bit different. When I was at ASU, Planning was in the Architectural College. Now it’s tied with Geography in a new School of Geographical Sciences & Urban Planning.
I have to be honest, this is smart; I started my GIS career working for a planning department at a large city. The other thing I noticed is that a ton of my professors are still teaching. I mean, how awesome is that? I suddenly don’t feel so old anymore.
I’ve stayed out of his classes for the past semester in hopes that he can form his own thoughts on GIS and its applicability. I probably will continue to help him focus on where to spend his electives (more Computer Science and less History of the German Empire 1894-1910). He’s such a smart kid, I know he’s going to do a great job, and he was one of the kids who spent time in that Esri UC Kids Fair back when I used to go to the User Conference. Now he could be getting paid to use Esri software or whatever tool best accomplishes his goals.
I plan to show him the Safe FME Minecraft Reader/Writer.
-
19:15
GIS and Monitors
on James Fee GIS Blog
If there is one constant in my GIS career, it is my interest in the monitor I’m using. Since the days of being happy for a “flat screen” Trinitron monitor to now with curved flat screens, so much has changed. My first GIS Analyst position probably had the worst monitor in the history of monitors. I can’t recall the name, but it had a refresh rate that was probably comparable to what was seen in the 1960s. It didn’t have great color balance either, so I ended up printing out a color swatch pattern from ArcInfo and taping it on my wall so I could know what color was what.
I stared for years at this monitor. No wonder I need reading glasses now!
Eventually I moved up in the world where I no longer got hand-me-down hardware and I started to get my first new equipment. The company I worked for at the time shifted between Dell and HP for hardware, but generally it was dual 21″ Trinitron CRTs. For those who are too young to remember, they were the size of a small car and put off enough heat and radiation to probably shorten my life by 10 years. Yet I could finally count on them being color corrected by hardware/software and not feel like I was color blind.
It wasn’t sexy but it had a cool look to it. You could drop it flat to write on it like a table.
Over 11 years ago, I was given a Wacom DTU-2231 to test. You can read more about it at that link, but it was quite the monitor. I guess the biggest change between now and then is how little that technology took off. If you had asked me in 2010, right after you read that post, what we’d be using in 2020, I would have said such technology would be everywhere. Yet we don’t see stylus-based monitors much at all.
These days my primary monitor is an LG UltraFine 24″ 4K. I pair it with another 24″ 4K monitor that I’ve had for years. Off to the other side is a generic Dell 24″ monitor my company provided. I find this setup works well for me; gone are the days where I had ArcCatalog and ArcMap open on two different monitors. Alas, two of the monitors are devoted to Outlook and WebEx Teams, just a sign of my current workload.
I’ve always felt that GIS people care more about monitors than most. A developer might be more interested in a Spotify plugin for their IDE, but a GIS Analyst cares most about the biggest, brightest, and crispest monitor they can get their hands on. I don’t always use FME Workbench these days, but when I do, it is full screen on the most beautiful monitor I can have. Seems perfect to me.
-
17:00
Are Conferences Important Anymore?
on James Fee GIS Blog
Hey, SOTM is going on; I didn’t even know. The last SOTM I went to was in 2013, which was a blast. But I have to be honest, not only did this slip my mind, none of my feeds highlighted it to me. Not only that, apparently Esri is having a conference soon. (Wait for me to go ask Google when it is.) OK, they are having it next week. I used to be the person who went to as much as I could, either attending or being invited to keynote. The last Esri UC I went to was in 2015, 6 years ago. As I said, SOTM was 2013. FOSS4G, 2011. I had to look it up; the last conference I attended that had any GIS in it was the 2018 Barcelona Smart City Expo.
So with the world opening back up, or maybe not given whatever Greek-letter variant we are dealing with right now, I’ve started to think about what I might want to attend and the subject matter. At the end of the day, I feel like I got more value out of the conversations outside the convention center than inside, so I’ll probably go wherever I see a good subset of smart people hanging out. That’s why those old GeoWeb conferences that Ron Lake put on were so amazing: meeting a ton of smart people and enjoying the conversations, rather than reading PowerPoint slides in a dimly lit room.
Hopefully we can get back to that, just need to keep my eye out.
-
17:57
Unreal and Unity are the new Browsers
on James Fee GIS Blog
Someone asked me why I hadn’t commented on Cesium and Unreal getting together. Honestly, no reason. This is big news. HERE, where I work, is teaming up with Unity to bring the Unity SDK and the HERE SDK to automotive applications. I’ve talked about how we used the Mapbox Unity SDK at Cityzenith (though I have no clue if they still do). Google and Esri have them too. In fact, both the Unreal and Unity marketplaces are littered with data sources you can plug in.
HERE Maps with Unity
This is getting at the core of what these two platforms could be. Back in the day we had two browsers, Firefox and Internet Explorer 6. Inside each we had many choices of mapping platforms to use. From Google and Bing to Mapquest and Esri. In the end that competition to make the best API/SDK for a mapping environment drove a ton of innovation. What Google Maps looks like and does in 2021 vs 2005 is amazing.
This brings up the key to what I see happening here. We’ll see the mapping companies (or companies that have mapping APIs) deliver key updates to these SDKs (which today are pretty limited in scope) because they have to stay relevant. Not that web mapping is going away at any point, but true 3D worlds and true Digital Twins require power that browsers cannot provide, even in 2021. So this rush to become the Google Maps of 3D engines is real and will be fun to watch.
It’s interesting that Google is an also-ran in the 3D engine space, so there is so much opportunity for the players who have invested and continue to invest in these markets without Google throwing unlimited R&D dollars against it. Of course, it only takes one press release to change all that, so don’t bet against Google.
-
13:00
Arrays in GeoJSON
on James Fee GIS Blog
So my last post was very positive. I figured out how to relate the teams that share a stadium with the stadium itself. This was important because I wanted to eliminate the redundant points that were on top of each other. For those who don’t recall, I have an example in this gist:
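In rough form, it’s one Feature per ballpark with the tenants as an array of objects in the properties. A minimal sketch (hypothetical coordinates and attribute names, not the actual gist):

# One ballpark Feature; the teams live in an array inside properties.
ballpark = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-111.842, 33.545]},
    "properties": {
        "name": "Salt River Fields at Talking Stick",
        "teams": [
            {"name": "Arizona Diamondbacks", "league": "Cactus League"},
            {"name": "Colorado Rockies", "league": "Cactus League"},
        ],
    },
}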
Now I mentioned that there were issues displaying this in GIS applications and was promptly told I was doing this incorrectly:
An array of <any data type> is not the same as a JSON object consisting of an array of JSON objects. If it would have been the first, I'd have pointed you (again) to QGIS and this widget trick [https:]] .
— Stefan Keller (@sfkeller) April 4, 2021
If you click on that tweet you’ll see, basically, that you can’t do it the way I want and I have to go back to the way I was doing it before:
Unfortunately, the beat way is to denormalise. Redundant location in many team points.
— Alex Leith (@alexgleith) April 4, 2021
I had a conversation with Bill Dollins about it and he sums it up succinctly:
I get it, but “Do it this way because that’s what the software can handle” is an unsatisfying answer.
So I’m stuck. I honestly don’t care if QGIS can read the data, because it can; it just isn’t optimal. What I do care about is an organized dataset in GeoJSON. So my question, which I can’t get a definitive answer to, is: “is the array I have above valid GeoJSON?” From what I’ve seen, yes. But nobody wants to go on record as saying absolutely. I could say to hell with it and move forward, but I don’t want to go down a dead-end road.
-
21:42
GeoJSON Ballparks as JSON
on James Fee GIS Blog
In a way it is good that Sean Gillies doesn’t follow me anymore, because I can hear his voice in my head as I was trying to do something really stupid with the project. But Sheldon helps frame what I should be doing with what I was doing:
tables? what the? add , teams:[{name:"the name", otherprop: …}, {name:…}] to each item in the ballparks array and get that relational db BS out of your brain
— Sheldon (@tooshel) April 2, 2021
Exactly! What the hell? Why was I trying to do something so stupid when the whole point of this project is baseball ballparks in GeoJSON? Here is the problem in a nutshell and how I solved it. First off, let us simplify the problem down to just one ballpark. Salt River Fields at Talking Stick is the Spring Training facility for both the Arizona Diamondbacks and the Colorado Rockies. Not only that, but there are Fall League and Rookie League teams playing there. Probably even more that I still haven’t researched. Anyway, GeoJSON Ballparks looks like this today when you just want to see that one stadium.
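Roughly speaking, it’s one stacked Point per tenant (a sketch with hypothetical coordinates, trimmed down to two teams):

# The denormalized version: one Feature per team, all on the same spot.
features = [
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [-111.842, 33.545]},
        "properties": {"team": "Arizona Diamondbacks",
                       "ballpark": "Salt River Fields at Talking Stick"},
    },
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [-111.842, 33.545]},
        "properties": {"team": "Colorado Rockies",
                       "ballpark": "Salt River Fields at Talking Stick"},
    },
    # ...plus the Fall League and Rookie League teams, stacked on top
]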
Let’s just say I backed myself into this corner by starting with only MLB ballparks, none of which at the time were shared between teams.
It’s a mess, right? Overlapping points, so many opportunities to screw up names. So my old-school thought was to just create a one-to-many relationship between the GeoJSON points and some external table. Madness! Seriously, what was I thinking? Sheldon is right, I should be doing a JSON array for the teams. Look how much nicer it all looks when I do this!
Look how nice that all is. So easy to read, and it keeps the focus on the ballparks.
As I said in the earlier blog post:
“The problem now is so many teams, especially in spring training, minor leagues and fall ball, share stadiums, that in GeoJSON-Ballparks, you end up with multiple dots on top of each other. Not the one-to-many relationship that should happen.”
The project had pivoted in a way I hadn’t anticipated back in 2014, and it sure was a mess to maintain. So now I can focus on fixing the project with the Minor League Baseball realignment that went on this year and getting an updated dataset into GitHub very soon.
One outcome of doing this nested array is that many GIS tools don’t understand how to display the data. Take a look at geojson.io:
geojson.io compresses the array into one big JSON-formatted string. QGIS and GitHub do this also. It’s an issue that I’m willing to live with. Bill Dollins shared the GeoJSON spec with me to prove the way I’m doing it is correct:
3.2. Feature Object
A Feature object represents a spatially bounded thing. Every Feature object is a GeoJSON object no matter where it occurs in a GeoJSON text.
o A Feature object has a "type" member with the value "Feature".
o A Feature object has a member with the name "geometry". The value of the geometry member SHALL be either a Geometry object as defined above or, in the case that the Feature is unlocated, a JSON null value.
o A Feature object has a member with the name "properties". The value of the properties member is an object (any JSON object or a JSON null value).
ANY JSON OBJECT! So formatting the files this way is correct and the way it should be done. I’m going to push forward on cleaning up GeoJSON Ballparks and let the GIS tools try and catch up.
-
22:15
GeoJSON Ballparks and MLB Minor League Realignment
on James Fee GIS Blog
**UPDATE** – See the plan.
Boy, where to start? First, for those who haven’t been following, this happened over the winter.
Major League Baseball announced on Friday (February 12, 2021) a new plan for affiliated baseball, with 120 Minor League clubs officially agreeing to join the new Professional Development League (PDL). A full list of Major League teams and their new affiliates, one for each level of full-season ball, along with a complex league (Gulf Coast and Arizona) team, can be found below.
Minor League Baseball
What does that mean? Well, for GeoJSON Ballparks, basically every minor league team is having a modification to it. At a minimum, the old minor league names have changed. Take the Pacific Coast League: it existed for over 118 years and is now part of Triple-A West, which couldn’t be a more boring name. All up and down the minor leagues, the names now just reflect the level of minor league the teams are. And some teams have moved from AAA to Single-A and all around.
I usually wait until Spring Training is just about over to update the minor league teams, but this year that makes almost zero sense. I’ve sort of backed myself into a spatial problem, unintended when I started. Basically, the project initially was just MLB teams and their ballparks. The key to that is that the teams drove the dataset, not the ballparks, even though the title of the project clearly said it was about ballparks. As long as nobody shared a ballpark, this worked out great. The problem now is so many teams, especially in spring training, minor leagues and fall ball, share stadiums, that in GeoJSON-Ballparks, you end up with multiple dots on top of each other. Not the one-to-many relationship that should happen.
So, I’m going to use this minor league realignment to fix what I should have fixed years ago. There will be two files in this dataset moving forward: one GeoJSON file with the locations of the ballparks, and then a CSV (or other format) file containing the teams. Then we’ll just do the old-fashioned relate between the two and the world is better again.
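That relate is a few lines in Python (a sketch; the ballpark_id field and CSV columns are hypothetical, not the final schema):

import csv
import json

with open("ballparks.geojson") as f:
    ballparks = json.load(f)["features"]

with open("teams.csv") as f:  # assumed columns: ballpark_id, team, league
    teams = list(csv.DictReader(f))

# classic one-to-many relate, keyed on the shared id
teams_by_park = {}
for row in teams:
    teams_by_park.setdefault(row["ballpark_id"], []).append(row["team"])

for park in ballparks:
    props = park["properties"]
    print(props["name"], "->", teams_by_park.get(props["ballpark_id"], []))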
I’m going to fork GeoJSON-Ballparks into a new project and right the wrongs I have done against good spatial data management. I’m finally ready to play centerfield!
-
17:46
I’m Here at HERE
on James Fee GIS Blog
Last Tuesday I started at HERE Technologies with the Professional Services group in the Americas. I’ve probably used HERE and its legacy companies’ data and services for most of my career, so this is a really cool opportunity to work with a mobile data company.
I’m really excited about working with some of their latest data products including Premier 3D Cities (I can’t escape Digital Twins).
Digital Twins at HERE
-
18:36
Digital Twins and Unreal Engine
on James Fee GIS Blog
I’ve had a ton of experience with Unity and Digital Twins, but I have been paying attention to Unreal Engine. I think the open nature of Unity is probably more suited for the current Digital Twin market, but competition is so important for innovation. This project where Unreal Engine was used to create a digital clone of Adelaide is striking, but the article just leaves me wanting so much more.
A huge city environment results in a hefty 3D model. Having strategies in place to ease the load on your workstation is essential. “Twinmotion does not currently support dynamic loading of the level of detail, so in the case of Adelaide, we used high-resolution 3D model tiles over the CBD and merged them together,” says Marre. “We then merged a ring of low-resolution tiles around the CBD and used the lower level of detail tiles the further away we are from the CBD.”
Well, that’s how we did it at Cityzenith. Tiles are the only way to have the detail one needs in these 3D worlds, and they are something geospatial practitioners are very used to from dealing with their slippy maps. The eye candy that one sees in that Adelaide project is amazing. Of course, scaling one city out is hard enough, but doing so across a country or the globe is another matter. Still, this is an amazing start.
Seeing Epic take Twinmotion and scale it out this way is very exciting because as you can see from that video above, it really does look photorealistic.
But this gets at the core of where Digital Twins have failed. It is so very easy to do the above: create an amazing-looking model of a city and drape imagery across it. It is a very different beast to actually create a Digital Twin where these buildings are not only linked up to external IoT devices and services but also import BIM models and generalize as needed. They do some rudimentary analysis of shadows, which is somewhat interesting, but this kind of stuff is so easy to do, and there are so many tools to do it, that all this effort to create a photorealistic city seems wasted.
I think users would trade photorealistic cities for detailed IoT services integration, but I will watch Aerometrex continue to develop this out. Digital Twins are still stuck in sharing videos on Vimeo and YouTube, trying to create some amazingly realistic city when all people want is visualization and analysis of IoT data. That said, Aerometrex has done an amazing job building this view.
-
21:01
Moving Towards a Digital Twin Ecosystem
on James Fee GIS Blog
Smart Cities really start to become valuable when they integrate with Digital Twins. Smart Cities do really well with transportation networks and adjusting when things happen. Take, for example, construction on an important Interstate highway that connects the city core with the suburbs: it causes backups, and a smart city can adjust traffic lights, rail, and other modes of transportation to help mitigate the problems. This works really well because the transportation systems talk to each other and decisions can be made to refocus commutes toward other modes of transportation or other routes. But unfortunately, Digital Twins don’t do a great job talking to Smart Cities.
Photo by Victor Garcia on Unsplash
A few months ago I talked about Digital Twins and messaging. The idea that:
Digital twins require connectivity to work. A digital twin without messaging is just a hollow shell; it might as well be a PDF or a JPG. But connecting all the infrastructure of the real world up to a digital twin replicates the real world in a virtual environment. Networks collect data and store it in databases all over the place; sometimes these are SQL-based such as Postgres or Oracle, and other times they are as simple as SQLite or flat-file text files. But data should be treated as messages back and forth between clients.
This was in the context of a Digital Twin talking to services that might not be hardware-based, but the idea stands up for how and why a Digital Twin should be messaging the Smart City at large. Whatever benefits a Digital Twin gains from an ecosystem that collects and analyzes data for decision-making, those benefits become multiplied when those systems connect to other Digital Twins. But think beyond a group of Digital Twins to the benefit to the Smart City when all these buildings are talking to each other and to the city, to make better decisions about energy use, transportation, and other shared infrastructure across the city or even the region (where multiple Smart Cities talk to each other).
When all these buildings talk to each other, they can help a city plan, grow and evolve into a clean city.
What we don’t have is a common data environment (CDE) that cities can use. We have seen data sharing on a small scale in developments, but not on a city-wide or regional scale. To do this we need to agree on model standards that allow Digital Twins to talk to each other (something open like Bentley’s iTwin.js) and share ontologies. Then we need that Smart City CDE where data is shared, stored, and analyzed at a large scale.
One great outcome of this CDE is that all this data can be combined with city ordinances to give tools like Delve from Sidewalk Labs even more data to create their generative design options. Buildings are not a bubble in a city, and their impacts on the city extend beyond the boundaries of the parcel they are built on. That’s what’s so exciting about this opportunity: manage assets in a Digital Twin on a micro scale, but share generalized data about those decisions with the city at large, which then can share them with other Digital Twins.
And lastly, individual Smart Cities aren’t bubbles either. They have huge impacts on the region or even the country that they are in. If we can figure out how to create a national CDE, one that covers a country as diverse as the United States, we can have something that can benefit the world at large. Clean cities are the future, and thinking about them on a small scale will only result in the gentrification of affluent areas and leave less well-off areas behind. I don’t want my children to grow up in a world like that, and we have the processes in place to ensure that they have a better place to grow up in than we did.
-
23:46
Apple’s Digital Twin is All About Augmented Reality
on James Fee GIS Blog
Now before we get too far, Apple has not created anything close to a Digital Twin as we know them. But what they have done is create an easy way to import your building models into Apple Maps. Apple calls this their Indoor Maps program.
Easily create detailed maps of your indoor spaces and let visitors see where they are right in your app. Organizations with large public and private spaces like airports, shopping centers, arenas, hospitals, universities, and private office buildings can register for the Indoor Maps Program. Indoor maps are built using industry standard tools and require only your existing Wi-Fi network to enable GPS-level location accuracy so visitors can navigate your spaces with ease.
Victoria Airport in the Apple IMDF Sandbox
OK, so clearly this is all about navigation: how do I know where I am in a building, and how do I get to a place I need to be? Of course, this is somewhat interesting on your iPhone or iPad in Apple Maps, but clearly there is more to this than just how to find the restroom on floor 10 of the bank tower.
To load your buildings into Apple Maps you need to use MapKit or MapKit JS and convert your buildings into the Indoor Mapping Data Format (IMDF). IMDF is actually a great choice because it is GeoJSON and working toward being an OGC standard (for whatever that is worth these days). I did find it interesting that Apple highlights the following as the use cases for IMDF:
- Indoor wayfinding
- Indoor routing
- Temporal constraints
- Connectivity amongst mapped objects
- Location-based services
- Query and find by location functionality
If you’re familiar with GeoJSON, IMDF will look logical to you:
{ "id": "11111111-1111-1111-1111-111111111111", "type": "Feature", "feature_type": "building", "geometry": null, "properties": { "category": "parking", "restriction": "employeesonly", "name": { "en": "Parking Garage 1" }, "alt_name": null, "display_point": { "type": "Point", "coordinates": [1.0, 2.0] }, "address_id": "2222222-2222-2222-2222-222222222222" } }
I encourage you to review the IMDF docs to learn more, but we’re talking JSON here, so it’s exactly how you’d expect it to work.
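Since it is just GeoJSON, a rough sanity check on a feature takes only a few lines (a sketch with a hypothetical filename, nowhere near a full validator):

import json

REQUIRED = ("id", "type", "feature_type", "geometry", "properties")

def check_imdf_feature(feature):
    """Return a list of problems found in one IMDF feature (rough check only)."""
    problems = [f"missing member: {key}" for key in REQUIRED if key not in feature]
    if feature.get("type") != "Feature":
        problems.append("type must be 'Feature'")
    return problems

with open("parking-garage.json") as f:  # hypothetical file with one feature
    print(check_imdf_feature(json.load(f)))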
Because IMDF buildings are generalized representations of the real-world data, this isn’t actually a Digital Twin. It also means that you need to do some things to your files before converting them to IMDF. Autodesk, Esri, and Safe Software all support IMDF, so you should be able to use their tools to handle the conversions. I’ve done the conversion with Safe FME; it works very well and is probably the best way to handle this. In fact, Safe has an IMDF validator, which does come in handy for sure.
Safe FME support of IMDF
What does make moving your buildings to Apple’s Indoor platform compelling is the new iPhone 12 and iPad Pro LiDAR support. This brings out some really great AR capabilities that become enabled with Apple’s devices. As I said last week, the LiDAR support in the current devices is more about getting experience with LiDAR use cases than actual LiDAR use. This is all building toward eventual Apple Glass (and Google Glass too) support and the AR navigation that can be done when you have hyper-accurate indoor models in your mapping software.
I’ve been dusting off my MapKit skills because I think not only is this capability useful for many companies, but it really isn’t that hard to enable. I am spending some time thinking about how to use the extension capability of IMDF to see how IoT and other services can be brought in. Given the generalized nature of IMDF, it could be a great way to visualize IoT and other services without the features of a building getting in the way. Stay tuned!
-
19:21
COVID-19 is Showing How Smart Cities Protect Citizens
on James Fee GIS Blog
I feel like there is a before COVID and an after COVID in citizens’ feelings about Smart City technology. Now, there is an election tomorrow in the United States that will probably dictate how this all moves forward, and after 2016 I’ve learned not to predict anything when it comes to the current president. But outside that huge elephant in the background, Smart City concepts have been thrust into the spotlight.
Photo by Michael Walter on Unsplash
Most cities have sent their non-essential workers home, so IoT and other feeds to their work dashboards have become critical to their success. The data collection and analysis of the pulse of a city is now so important that traditional field collection tools have become outdated.
Even how cities engage with their citizens has changed. Before COVID, here in Scottsdale, you needed to head to a library to get a library card in person. But since the COVID restrictions, the city has allowed library card applications online, which is a huge change. The core structure of city digital infrastructure has to change to manage this new need: not only engaging citizens more deeply with technology, but also ensuring those who don’t have access to the internet or even a computer are represented. I’ve seen much better smartphone support on websites over the summer, and this will continue.
Even moving from a public space to a digital space for city council meetings has implications. The physicality of citizens before their elected leaders is a check on their power, but being a small Zoom box in a monitor of Zoom boxes puts citizens in a corner. Much will have to be developed so that those who don’t wish to be in person are represented as well as those who choose to attend meetings in person.
COVID has also broken down barriers to sharing data. The imagined dashboard where Police, Fire, Parks & Rec, City Council, and other stakeholders share data has come to fruition. The single pane of glass where decision-makers can get together to run the city remotely is only going to improve now that the value has been shown.
Lastly, ignoring the possible election tomorrow, contact tracing and other methods of monitoring citizens as they go around the city have mostly changed how people feel. Before COVID, the idea that a city could track them, even anonymously, scared the daylights out of people. But today we are starting to see the value in anonymous tracking, so that we not only see who has been in contact with each other but also how they interact in a city with social distancing restrictions.
Future planning of cities is changing and accelerating because of COVID. The outcome of this pandemic will result in cities that are more resilient, better managed, planned for social distancing, and working toward carbon-neutral environments. In the despair of this unprecedented pandemic, we see humanity coming together to create a better future for our cities and our planet.
-
21:57
The iPhone 12 Pro LiDAR Scanner is the Gateway to AR, But Not in the Way You Think
on James Fee GIS Blog
I’m sure everyone knows about it by now: the iPhone 12 Pro has a LiDAR scanner. Apple touts it to help you take better pictures in low light and do some rudimentary AR on the iPhone. But what this scanner does today isn’t where the power will be tomorrow.
Apple cares a ton about photo quality, so a LiDAR scanner helps immensely with taking these pictures. If there is one reason today to have that scanner, it is for pictures. But the real power of the scanner is for AR. And AR isn’t ready today, no matter how many demos you see in Apple’s event. Holding up an iPhone and seeing how big a couch would be in your room is interesting, just as interesting as using your phone to find the nearest Starbucks.
Apple has spent a lot of time working on interior spaces in Apple Maps. They’ve also spent a ton of time working on sensors in the phone for positioning inside buildings. This is all building toward AR navigation inside public buildings and private buildings whose owners share their 3D plans. But what if hundreds of millions of mobile devices could create these 3D worlds automatically as they go about their business helping users find that Starbucks?
The future is so bright, though, with this scanner. It helps Apple and developers get familiar with what LiDAR can do for AR applications. This is critically important on the hardware side because Apple Glass, no matter how little is known about it, is the future for AR. Same with Google Glass: the eventual consumer product (ignoring the junk that the first Google Glass was) of these wearable AR devices will change the world, not so much in that you’ll see an arrow as you navigate to the Starbucks, but in giving you insight into smart buildings and all the IoT devices that are around.
The inevitable outcome is in the maintenance of smart buildings
Digital Twins are valuable when they link data feeds to a 3D world that can be interrogated. But the real value comes when those 3D worlds can be leveraged using Augmented Reality to give owners, maintenance workers, planners, engineers, and tenants the information they need to service their buildings and improve the quality of building maintenance. The best-built LEED building is only as good as the ongoing maintenance put into it.
The iPhone 12 Pro and the iPad Pro that Apple released this year both have LiDAR to improve photo taking and rudimentary AR, but the experience gained from seeing consumer LiDAR used in millions of devices will bring great strides toward making these Apple/Google Glass devices truly usable in the real world. I’m still waiting to get my iPhone 12, but my wife’s arrived today. I’m looking forward to seeing what the LiDAR can do.
-
22:59
Google AI Project Recreating Historical Streetscapes in 3D
on James Fee GIS Blog
When this caught my eye I got really interested. Google AI is launching a website titled rǝ which reconstructs cities from historical maps and photos. You might have seen the underlying tool last month, but this productizes it a bit. What I find compelling about this effort is that the output is a 3D city that you can navigate and review by going back in time to see what a particular area looked like in the past.
Of course, Scottsdale, my town, is not worth attempting this on, but older cities that have seen a ton of change will give some great insight into how neighborhoods have changed over the past century.
Street level view of 3D-reconstructed Chelsea, Manhattan
Just take a look at the image above; it really does give the feel of New York back in the ’40s and earlier. People remember how a neighborhood looked, but recreating it in this way gives others key insights into how development has changed how certain areas of cities look and act.
This tool is probably more aimed at history professors and community activists, but as we grow cities into smarter, cleaner places to live, understanding the past is how we can hope to create a better future. I’d love to see these tools incorporated into smart city planning efforts. The great part of all this is that it is crowdsourced, open-sourced, and worth doing. I’m starting to take a deeper dive into the GitHub repository and look at how the output of this project can help plan better cities.
-
23:39
Developing a Method to Discover Assets Inside Digital Twins
on James Fee GIS Blog
On Monday I had a bit of a tweetstorm to get some thoughts on paper.
Thinking about addressing but inside buildings.
— James Fee (@jamesmfee) October 26, 2020
In there I laid out what I thought addressing inside a building should look like. A couple of responses came back with “why?” or “this isn’t an issue,” but the important thing here is that smart buildings need to be able to route not only people to offices for business but also workers to IoT devices to act upon issues that might occur (like a water valve leaking in a utility closet). Sure, one could just pull out an as-built drawing and navigate, or in the case of visiting a company, ask the guard at the front door, but if things such as Apple Glass and Google Glass start becoming a real thing, we’ll need a true addressing system to get people where they need to be.
Apple and Google are working this out themselves inside their ecosystems but there needs to be an open standard that people can use inside their applications to share data. I mentioned Placekey as a good starting point with their what@where.
The what is an address–POI encoding and the where is based on Uber’s H3 system. As great as all this is, it doesn’t help us figure out where the leaky valve is in the utility closet. This is all much better than other systems and is a great way to get close. I’ve not seen any way to create extensions to Placekey to do this, but we’ll punt the linking problem for now.
The other problem with addressing inside a building is the digital twin might not be in any projection that our maps understand. So we’ll need to create a custom grid to figure out where the IoT and other interesting features are located. But there seems to be a standard being created that solves just this problem, UBID.
UBID builds on the open-source grid reference system and is essentially the north axis-aligned “bounding box” of the building’s footprint represented as a centroid along with four cardinal extents.
I really like this; it might even compete with Placekey, but that’s not my battle, I’m more concerned with buildings in this use case. There is a lot to UBID to digest, and I encourage you to read the GitHub repo to learn more.
But if we can link these grids of buildings with a Placekey, we have a superb method of navigating to a building POI and then drilling down to navigating to that location using all the great work that companies like Pixel8 are doing. But all that navigation stuff is not my battle; what matters here is the location of an IoT sensor in a digital twin that may or may not be in a projection we can use.
Working toward that link, a unique grid of a digital twin tied to a Placekey would solve the problems of figuring out where an asset inside a building is and what is going on at that location. The ontologies to link this could open up whole new methods of interrogation of IoT devices and so much more. E911 and similar systems could greatly benefit from this as well.
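To make the shape of that link concrete, here is a purely hypothetical sketch, not a real standard: a Placekey-style what@where for the building, plus the twin’s own floor and grid cell, plus the asset id:

from dataclasses import dataclass

@dataclass(frozen=True)
class TwinAssetAddress:
    placekey: str   # building-level what@where (value below is made up)
    floor: int      # level inside the twin
    cell: str       # cell in the twin's local grid, whatever projection it uses
    asset_id: str   # the IoT device, e.g. that leaky valve

    def __str__(self):
        return f"{self.placekey}/{self.floor}/{self.cell}#{self.asset_id}"

valve = TwinAssetAddress("222-227@5vg-82n-kzz", 3, "J7", "valve-042")
print(valve)  # 222-227@5vg-82n-kzz/3/J7#valve-042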
-
19:10
Machine Learning Smart City Development from Sidewalk Labs
on James Fee GIS Blog
The last time most people heard from Sidewalk Labs was when Toronto didn’t go forward with their Smart City project. There are a ton of reasons why that didn’t happen, but moonshots are what they are, and even if you don’t reach the moon, outcomes can be really good for society. Of course, I don’t know what Sidewalk Labs has been working on, but I have to assume Delve exists because of the work they are doing to build smart cities.
Delve, at its most simple description, is where computers figure out the best design options for commercial or residential project development. There is much more going on here, and that’s where the Machine Learning (ML) part comes in and what really catches my eye. I’ve done a ton of work with planning in my years of working with AECs, and coming up with multiple design options is time-consuming and very difficult. But with Delve, this can happen quickly and repeatably, in minutes.
A quick look at Delve
You get optimal design options based on ranked priority outcomes such as cost or view. Delve takes inputs such as zoning constraints (how high a building can be or what the setbacks are) and gross floor area (commercial or residential), and then combines these with the priority outcomes. Then you get scored options that you can look further into, and you can continue to make changes to the inputs.
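In spirit it is weighted multi-objective ranking over generated options. A toy sketch (nothing to do with how Delve actually works internally):

# Rank generated design options by ranked priority outcomes.
options = [
    {"name": "A", "cost": 0.62, "view": 0.80},  # scores normalized to 0-1
    {"name": "B", "cost": 0.45, "view": 0.90},
    {"name": "C", "cost": 0.70, "view": 0.55},
]
weights = {"cost": 0.6, "view": 0.4}  # the ranked priorities

def score(option):
    return sum(weights[k] * option[k] for k in weights)

for option in sorted(options, key=score, reverse=True):
    print(option["name"], round(score(option), 2))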
The immediacy of this is what really sets it apart. When I was at Cityzenith years ago, we attempted to get this worked out, but the ML tools were not developed enough yet. Clearly though, with Alphabet backing, Sidewalk Labs has created an amazing tool that will probably change how cities are developed.
I am really excited to see how this works out. I don’t see an API yet, so integration outside of Sidewalk Labs does not seem to be a priority at this point, but the outcome for scalable planning like this needs to have an API. I’ll be paying attention, but seeing ML being used for this type of development is logical, understandable, and workable. We should see great success. You can read more at the Sidewalk Labs blog.
-
19:53
Capturing As-Built Changes to Make Better Digital Twins
on James Fee GIS Blog
This post originally appeared on LinkedIn.
Augmented Reality view of Apple Park
Digital Twins are easy. All you have to do is create a 3D object. Some triangles and you’re done. A BIM model is practically a Digital Twin. The problem is that usually those twins are created from data that isn’t “as-built”. What you end up with is a digital object that ISN’T a twin. How can you connect your IoT and other assets to a 3D object that isn’t representative of the real world?
I talked a little bit last time about how to programmatically create digital twins from satellite and other imagery. Of course, a good constellation can make these twins very up to date and accurate, but it can miss the details needed for a good twin, and it sure as heck can’t get inside a building to update any changes there. What we’re looking for here is a feedback loop from design to construction to digital twin.
There are a lot of companies that can help with this process, so I won’t go into detail there, but what is needed is the acknowledgment that investment is required to make sure those digital twins are updated, so that not only is the building delivered but also an accurate BIM model that can be used as a digital twin. Construction firms usually don’t get the money to update these BIM models, so the models are used as a reference at the beginning, but change orders rarely get pushed back to the original BIM models provided by the architects. That said, there are many methods that can be used to close this loop.
Construction methods cause changes from the architectural plans
Companies such as Pixel8, which I talked about last week, can use high-resolution imagery and drones to create a point cloud that can be used to verify not only that changes are being made to specification but also where deviations have been made from the BIM model. This is big because humans can only see so much on a building, and with a large model it is virtually impossible for people to detect change. But using machine learning and point clouds, change detection is actually very simple and can highlight where accepted modifications have been made to the architectural drawings or where things have gone wrong.
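The core of that check is conceptually simple: measure how far each as-built point sits from the design model. A sketch (assuming hypothetical files with both point sets already in the same coordinate system):

import numpy as np
from scipy.spatial import cKDTree

as_built = np.load("as_built_points.npy")    # (N, 3) XYZ from drone/imagery
design = np.load("design_model_points.npy")  # (M, 3) XYZ sampled from the BIM model

# distance from every as-built point to its nearest design point
distances, _ = cKDTree(design).query(as_built)

TOLERANCE = 0.05  # 5 cm; pick per project spec
deviations = as_built[distances > TOLERANCE]
print(f"{len(deviations)} points deviate more than {TOLERANCE} m from the model")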
Focusing on getting those changes into the original BIM models helps your digital twins
The key point here is that using ML to discover and update digital twins at scale is critically important, but just as important is the ability to use ML to discover and update digital twins as they are built, rather than from something that came from paper space.
Credits:
Photo by Patrick Schneider on Unsplash
Photo by Elmarie van Rooyen on Unsplash
Photo by Scott Blake on Unsplash
-
0:48
Scaling Digital Twins
on James Fee GIS Blog
This article originally appeared on LinkedIn.
Let’s face it, digital twins make sense and there is no arguing their purpose. At least with the urban landscape, though, it is very difficult to scale digital twins beyond a section of a city. At Cityzenith we attempted to overcome this need to have 3D buildings all over the world and used a 3rd-party service that basically extruded OSM building footprints where they existed. You see this in the Microsoft Flight Simulator worlds that people have been sharing: it looks pretty good from a distance, but up close it becomes clear that building footprints are a horrible way to represent a digital twin of the built environment because they are so inaccurate. Every building is not a rectangle, and it becomes impossible to perform any analysis on them because they can be off upwards of 300% from their real-world structure.
Microsoft Flight Simulator created world-wide digital twins at a very rough scale.
How do you scale this problem of creating 3D buildings for EVERYWHERE? Even Google was unable to do this; they tried to get people to create accurate 3D buildings with SketchUp, but that failed, and they tossed the product over to Trimble, where it has gotten back to its roots in the AEC space. If Google can’t do it, who can?
Vricon, which was a JV between Maxar and Saab but was recently absorbed by Maxar completely, gives a picture of how this can be done: being able to identify buildings, extract their shape, drape imagery over them, and then continue to monitor change over the years as additions, renovations, and even rooftop changes are identified. There is no other way I can see that we can have worldwide digital twins other than using satellite imagery.
Vricon is uniquely positioned to create on-demand Digital Twins world-wide.
Companies such as Pixel8 also play a part in this. I’ve already talked about how this can be accomplished on my blog; I encourage you to take a quick read. The combination of satellite digital twins to cover the world with products such as Pixel8 can create the highly detailed ground truth that is needed in urban areas. In the end, you get an up-to-date, highly accurate 3D model that actually allows detailed analysis of impacts from new buildings or other disruptive changes in cities.
Hyper-accurate point clouds from imagery, hand-held or via drone.
But to scale out a complete digital twin of the world, the only way to accomplish this is through satellite imagery. Maxar and others are already using ML to find buildings and discover if they have changed over time. Coupled with the technology that Vricon brings inside Maxar, I can see them really jump-starting a service of worldwide digital twins. Imagine being able to bring into your analysis or products accurate building models that not only are hyper-accurate compared to extruded footprints but are updated regularly based on the satellite imagery collected.
That sounds like the perfect world: Digital Twins as a Service.
-
0:15
IoT is not About Hardware
on James Fee GIS Blog
When you think about IoT you think about little devices everywhere doing their own thing. From Nest thermostats and Ring doorbells to Honeywell environmental controls and Thales biometrics, you imagine hardware. Sure, there is the “I” part of IoT that conveys some sort of digital aspect, but we imagine the “things” part. The simple truth of IoT, though, is that the hardware is a commodity, and the true power of IoT is in the “I” part, the messaging.
IoT messages can inundate us but they are the true power of these sensors
IoT messages are usually HTTP, WebSockets, or MQTT, or some derivative of them. MQTT is the one I’m always most interested in, but anything works, which is what is so great about IoT as a service. MQTT is leveraged heavily by AWS IoT and Azure IoT, and both services work so well at messaging that you can use either in place of something like RabbitMQ, which my daughter loves because of the rabbit icon. I could write a whole post on MQTT but we’ll leave that for another day.
IoT itself is built upon this messaging. Individual hardware devices have UIDs (unique identifiers) that by their very nature make each device addressable. The packets of information that are sent back and forth between the device and the host are short messages that require no human interaction.
The best part of this is that you don’t need hardware for IoT. Everything that you want to interact with should be an IoT message, no matter if it is an email, a data query, or a text message. Looking at IoT as more than just hardware opens connectivity opportunities that had been much harder in the past.
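For example, publishing a GIS-ish event with paho-mqtt takes a handful of lines (a sketch; the broker, topic, and payload are made up, and this assumes the 1.x client API):

import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.com", 1883)  # hypothetical broker

# anything can be a message: here a data query result, not a hardware sensor
payload = {"source": "parcel-service", "parcel": "123-45-678", "zoning": "R1-10"}
client.publish("city/gis/parcels", json.dumps(payload), qos=1)
client.disconnect()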
Digital twins require connectivity to work. A digital twin without messaging is just a hollow shell; it might as well be a PDF or a JPG. But connecting all the infrastructure of the real world up to a digital twin replicates the real world in a virtual environment. Networks collect data and store it in databases all over the place; sometimes these are SQL-based such as Postgres or Oracle, and other times they are as simple as SQLite or flat-file text files. But data should be treated as messages back and forth between clients.
All I see is IoT messages
When I look at the world, I see messaging opportunities: how we connect devices to each other. Seeing the world this way allows new ways to bring data into Digital Twins (think of GIS services being IoT devices) and much easier ways to get more out of your digital investments.
-
23:14
Open Environments and Digital Twins
on James Fee GIS Blog
The GIS world has no idea how hard it is to work with data in the digital twin/BIM world. Most GIS formats are open, or at worst readable enough to import into a closed system. But in the digital twin/BIM space, there are too many closed data sets, which makes it so hard to work with the data. The hoops one must jump through to import a Revit model are legendary, and mostly they are about how you get your data into IFC without giving up all the intelligence. At Cityzenith, we were able to work with tons of open formats, but dealing with Revit and other closed formats was so difficult that it required a team in India to handle the conversions.
All of the above is maddening because if there is one thing a digital twin should do, it is talk with as many other systems as possible: IoT messages, GIS datasets, APIs galore, and good old-fashioned CAD systems. That’s why open data formats are best, those that are understood and can be extended in any way someone needs. One of the biggest formats that we worked with was glTF. It is widely supported these days, but it really isn’t a great format for BIM models or other digital twin layers because it is more of a visual format than a data storage model. Think of it as similar to a JPEG: great for final products, but you don’t want to work with it for your production data.
IFC, which I mentioned before, is basically an open BIM standard. IFC is actually a great format for BIM, but companies such as Autodesk don’t do a great job supporting it; it becomes more of an interchange file, except where governments require its use. I also dislike the format because it is unwieldy, but it does a great job of interoperability and is well supported by many platforms.
IFC and glTF are great, but they harken back to older format structures. They don’t take advantage of modern cloud-based systems. I’ve been looking at DTDL (Digital Twins Definition Language) from Microsoft. What I do like about DTDL is that it is based on JSON-LD, so many of those IoT services you are already working with can take advantage of it. Microsoft’s Digital Twin platform was slow to take off, but many companies, including Bentley Systems, are leveraging it to give their customers a cloud-based open platform, which is what they all want. Plus you can use services such as Azure Functions (a very underrated service IMO) to work with your data once it is in there.
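A minimal DTDL v2 interface looks something like this, sketched here as a Python dict (the model id and contents are made up):

# One telemetry stream and one property on a hypothetical valve twin.
valve_interface = {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:example:UtilityValve;1",  # made-up model id
    "@type": "Interface",
    "displayName": "Utility Valve",
    "contents": [
        {"@type": "Telemetry", "name": "flowRate", "schema": "double"},
        {"@type": "Property", "name": "installed", "schema": "date"},
    ],
}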
Azure Digital Twins
The magic of digital twins is when you can connect messaging (IoT) services to your digital models. That’s the holy grail: have the real world connected to the digital world. Sadly, most BIM and digital twin systems aren’t open enough and require custom conversion work or custom coding to enable even simple integration with SAP, Salesforce, or MAXIMO. That’s why these newer formats, based mostly on JSON, seem to fit the bill, and we will see exponential growth in their use.
-
19:39
Studio is the New Pro
on James Fee GIS Blog
For quite some time, appending “Pro” to a product name was a great way to highlight the new hotness of a product. Believe me, I’m guilty as charged! But the new product name is “Studio”.
- Mapbox Studio
- Google Earth Studio (not to mention Data Studio)
- Azure Data Studio
- ArcGIS AppStudio
- Oracle Developer Studio
- Visual Studio
I mean, I could go on, but Studio seems to be the new method of naming authoring tools. I’m not here to make fun of this, just to state that I love the name Studio used in this sense. Having worked with architects most of my life, Studio has a great connotation of workspace. All those apps above are used as a workspace to create something else. The concept of a studio, used this way, really helps define what a product is used for. I think the term hackspace has taken on a modern connotation similar to studio, but the core concept is just so sound.
A classic studio
Pro or Professional probably harkens back to the early days of software and hardware, where you’d create consumer and professional products. These days, “professional” is usually used to signal a higher-end product rather than a product for professionals (or at least it is used inconsistently). But appending “Pro” to the end of a product name doesn’t signify purpose the way Studio does.
One could almost call ArcMap or QGIS a studio since that is what people are doing with them. My personal studio is my office with this computer I’m writing this blog post on. That thought makes me smile.
-
21:10
Spatial Tau Newsletter
on James Fee GIS Blog
Happy Friday everyone. These weeks just fly by when you are locked in your house looking out your front window for the Instacart delivery from the grocery store. I just wanted to remind everyone that I’ve got a weekly newsletter where I do some deep dives into things that are on my mind related to GIS, BIM, Smart Cities, and technology.
Just sign up below and you’ll get a newsletter in your inbox each Wednesday (or maybe Thursday LOL).
SpatialTau Newsletter
[https:]]
-
22:07
Smart Cities and Digital Twins Will Be Built Using Smart Phone Cameras
on James Fee GIS Blog
I’ve spent years trying to build worldwide building datasets for Smart City and Digital Twin applications. I’ve tried building them using off-the-shelf data providers that give you COLLADA files, and I’ve tried using APIs such as the Mapbox Unity SDK and buying buildings one by one to fill in gaps. None of these solutions have the resolution needed to perform the types of analysis required to make better choices for cities and development potential. How to create real 3D cities with enough resolution has been out of our grasp until now.
I’ve been following Pixel8 for a while now, and it is clear that crowdsourcing these models is going to be the only way forward. Over 10 years ago, Microsoft actually had this figured out with their Photosynth tool, but they never were able to figure out what to do with it. Only today are we seeing startups attack this problem with a solution that has enough resolution and speed that we can start seeing cities build highly detailed 3D models that have actual value.
Example stolen from Pixel8
It is still early days for these point cloud tools, but at the speed they’ve improved over the last year, we should be seeing their use more and more. Mixing the data from smartphones, lidar, and satellite imagery can get large areas of cities mapped in 3D with high accuracy. Pixel8 isn’t the only company attempting this, so we should see real innovation over the next year. Stay tuned!
-
23:03
And Then What?
on James Fee GIS Blog
Technology is verbose. There is no shortage of superlatives that help define the solution. It becomes almost noise when you are looking at what this solution truly solves, or whether a problem has even been defined. Just drop something in something and then something could happen. I’ve spent a career trying to help fight through this noise, and in the end one question should always come up.
And then what?
Yes, you can spend millions of dollars on what seems like the perfect application, workflow, or cloud-based solution, but after you get it all done, what then? We deal with this all the time on our own; part of why I’ve left digital note taking is because the answer to “And then what?” after putting all that effort into getting text into a smartphone is either non-existent, worthless, or unneeded.
Throwing money at the problem
So much money has been wasted on “solutions” that “revolutionize” the “process”. Being able to answer the question above is how you get the best value out of the proposal. Dead projects, software not being used, and databases withering on the server all happen because the users of the tools have no idea what to do with them when they are done. Time for this madness to stop.
-
22:19
Uber and Google Sign 4 Year Agreement on Google Maps
on James Fee GIS Blog
This is one of those surprised/not surprised things.
Uber Technologies Inc. announced that it has entered into a Google master agreement under which the ride-hailing company will get access to Google Maps platform rides and deliveries services.
I mean, today Uber uses Google Maps in their app, even on iOS. This is basically a continuation of the previous agreement with some changes that better align with how Uber does business. Rather than being based on the number of requests that Uber makes to Google Maps services, it is based on billable trips that are booked using Uber, a much more manageable deal for Uber. Last year, it came out that Uber paid Google $58 million over the previous 3 years for access to Google Maps. This quote really strikes me as bold:
“We do not believe that an alternative mapping solution exists that can provide the global functionality that we require to offer our platform in all of the markets in which we operate. We do not control all mapping functions employed by our platform or Drivers using our platform, and it is possible that such mapping functions may not be reliable.”
For as much money as Uber has invested in mapping, they don’t believe their technology is reliable enough to roll out to the public. That is mapping services in a nutshell: when your business is dependent on the best routing and addressing, you pick Google every time. All that time and effort to build a mapping platform, and they still pay another company tens of millions of dollars.
I’ve read so much about how Uber is about to release its own mapping platform built on OSM. But in the end the business requires the best mapping platform and routing services, and clearly nobody has come close to Google in this regard. Google Maps is not only the standard but, at this point, almost a requirement.
-
23:20
Abandoning the Digital Notetaking
sur James Fee GIS BlogA couple of months ago I made one last attempt to enjoy taking notes digitally. I used a combination of GitHub, Microsoft VS Code, and Vim to make my notes shareable and archivable across multiple platforms. As I expected, it failed miserably. It isn’t that GitHub does a bad job of storing notes; the workflow is just wonky, because that is what technologists do: make things harder for themselves because they can.
The thing is, though, I find myself taking fewer notes now than before, because of the workflow. Just because I can do something doesn’t mean I should. Moving back to analog is usually a good choice. How often do I need to search my notes? Rarely; I mostly look at the dates and go from there.
My workflow has now standardized on the Studio Neat Totebook, which I enjoy because it is thin, has the dot grid that gives note taking flexibility, and has archival stickers so I can put the notebooks on the shelf like my old Field Notes. Why did I not go back to Field Notes? I find their size too constrictive for normal note taking, but that’s just me. The size of the Totebook is just right: small enough to not be too big, but big enough to not be too small.
I still use the same pens I’ve been using for years. They don’t smear, they don’t cost a bundle if you lose them, and they have a bit of friction on the page that makes writing much easier to control. Pens are more of a personal preference; it’s harder to move between pens than between notebooks. Find a pen you like and stick with it.
I’m just done with Evernote, Bear, OneNote and all the rest of platforms I’ve spent years trying to adapt to.
-
16:28
Long Term Storage of Spatial Data
sur James Fee GIS BlogFollowing on from yesterday’s blog post, I’m also concerned about where I’m storing the data. Until this month I stored the data in Dropbox. I can’t recall when I signed up for Dropbox, but I’ve probably paid them over $1,000 for the privilege of using their service. As with most SaaS products, they start out helping consumers and then pivot to enterprise. That’s what Dropbox is doing and I’m tired of it. Their client software is a hack, and there are too many other solutions that better fit my budget than a standalone cloud storage service.
So as of May 2020, I no longer pay Dropbox $99/year. I’ve moved all my data to iCloud because I already pay for 2TB of storage there (family plan) and it integrates better with my workflows. I could have put it in Google Drive too, but I’ve never liked how it works, which is a shame because it is easy to share with other users. But this isn’t archival by any means. All I’m doing is putting data on a hard drive, albeit a virtual hard drive in the cloud. Sure, it gets backed up, but there isn’t any check to make sure my daughter doesn’t drag the data to the trash and click empty. A true archival service makes the data much safer than just storing it in a folder.
Back in the old days, we used to archive off to DLT tapes and send those offsite to a place like Iron Mountain. Eventually you’d realize you needed a restoration, and the IT guy would request the tapes come back from offsite and restore them to a folder you could access. Hopefully they were in a format you could read; generally that wasn’t too much of a problem, though there is a reason we kept a Sun workstation around in case we needed to restore data from ARC/INFO on Solaris. The good thing about this is that the data was always a copy. Sure, a tape could get damaged, but it was offsite and not prone to being messed with. If I needed data from October 2016, I could get it. Of course, old tapes were eventually destroyed because of space needs, but generally it was a great system.
I’m doing the math in my head as to the cost of those DLT tapes
Now I’m not thinking I need to get a DLT tape drive and pay Iron Mountain for the privilege, but I do need to get data offsite, and by offsite I mean off my easy-to-access cloud services (iCloud, Google Drive, AWS S3, etc.). I have been working with Amazon S3 Glacier and it has been a really great service. I’ve been moving a ton of data there, not only to clean up my local drives and iCloud storage, but to ensure that the data is backed up and stored in a way that makes it much safer than having it readily available. Now, Glacier is easy enough to use, especially if you are familiar with S3, but you don’t want to throw data in there that you need very often because of how it is priced. Uploading is free, and storage runs $0.004 per GB per month, which is insanely low. Retrieval is 3 cents per GB, which is reasonable, and after 90 days you can delete data for free.
Glacier isn’t new by any means; I had been using it to archive my hard drives with Arq, just not for projects like this. I’ve only started doing this over the weekend, so we’ll see how it goes, but I like that the data is in a deep freeze, ready to be retrieved if needed but not taking up space where it isn’t needed. I’ve also set a reminder for two years out to evaluate the storage formats and make sure they are still the best choice. If I do decide to change formats, I’ll keep the original files in there as well, in case the archival formats turn out to be a bad decision down the road. Storing it all in Glacier means that space is cheap, and I can keep two copies of the data without problems.
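If you’re scripting this, pushing a project into Glacier is basically a one-liner with boto3. A minimal sketch of the idea; the bucket name and file are hypothetical, and it assumes your AWS credentials are already configured:

```python
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    "parcels_2016.gpkg",                    # local project file (hypothetical)
    "my-archive-bucket",                    # hypothetical bucket name
    "projects/2016/parcels_2016.gpkg",      # key in the bucket
    ExtraArgs={"StorageClass": "GLACIER"},  # land directly in the Glacier tier
)
```

At $0.004 per GB per month, 500 GB of old project data runs about $2 a month, which is why the deep freeze is such an easy call.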
-
17:00
GIS Data Formats and My Stubborn Opinions
sur James Fee GIS BlogTaking this break, I’ve been looking over my spatial data and trying to figure out how best to organize it. The largest public project I manage is GeoJSON Ballparks, and it is easy to manage because it is just a Git repository of text files. GeoJSON makes sense here because it is a very simple dataset (x/y) and it has mostly been used for mapping projects, which makes the format perfect. I used to maintain a Shapefile version in that repository, but nobody ever downloaded it so I eventually killed it.
But my other data projects, things I’ve mapped or worked on the past are in a couple of formats:
Vector
- Shapefile
- File Geodatabase
- Personal Geodatabase
- GeoJSON
- KML
- SpatiaLite
Raster
- TIFF (mostly GeoTIFF)
- Esri Grid
Now you can tell from some of these formats that I haven’t touched these datasets in a long time. Being Mac centric, the Personal Geodatabase is dead to me, and given that the modification dates on that stuff are 2005-2007, I doubt I’ll need it anytime soon. But it does bring up the question of archival: clearly PGDB isn’t the best format for this, and I probably should convert it to some other format soon. Bill Dollins would tell me GeoPackage is the best choice, since Shapefile would cause me to lose data given the limits of DBF, but I’m not a big fan of the format, mostly because I’ve never needed to use it. Moving the data to GeoJSON would be good because who doesn’t like text formats, but GeoJSON doesn’t handle curves, and while that might be fine for the Personal Geodatabase data, it doesn’t make a ton of sense for more complex data.
This is as close to a shapefile icon as I could find, tells you everything doesn’t it?
I’ve thought about WKT as an archival format (or more specifically WKB), which might make sense for me given the great WKT/WKB support in databases. But again, could I be making my life harder than it needs to be just to avoid GeoPackage? Still, there is something about WKT/WKB that makes me comfortable storing data for the long term, given how long the standard has been supported across so many databases. The practical method might be everything in GeoJSON except curves, and those can go into WKT/WKB.
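For what it’s worth, the round trip is trivial with Shapely (one of the few libraries I still keep around). A quick sketch with a made-up geometry, just to show what the two encodings look like:

```python
from shapely import wkb, wkt

# A made-up geometry, round-tripped through both archival candidates.
geom = wkt.loads("LINESTRING (0 0, 1 1, 2 0)")

as_wkt = geom.wkt                   # human-readable text
as_wkb = wkb.dumps(geom, hex=True)  # compact hex-encoded binary

# Both encodings preserve the geometry exactly.
assert wkt.loads(as_wkt).equals(wkb.loads(as_wkb, hex=True))
print(as_wkt)
```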
Raster is much easier given that most of that data is in two fairly open formats. GeoTIFF, or TIFF, will probably be around longer than you or I, and the Esri grid formats have been well supported through the years, making both fairly safe. What are some limits to data formats that I do worry about?
- File size, do they have limits to how large they can be (e.g. TIFF and 32-bit limit)
- File structure, do they have limits to what can be stored (e.g. GeoJSON and curves)
- File format issues (e.g. everything about the Shapefile and dbf)
- OS centric formats (PGDB working only on Windows)
I think the last two are my biggest fears, because the first two can be mitigated fairly easily. My plan is the following: convert all vector data into GeoJSON, except where curves are required. I’m punting on curves right now because I only have 3 datasets that require them, and I’ll leave those in their native formats for now. The raster data is fine; TIFF and grid are perfect and I won’t be touching them at all. The other thing I’m doing is documenting the projects and data so that future James (or whomever gets this hard drive eventually) knows what the data is and how it was used. So little of what I have has any documentation; at least I’m lucky enough that the file names make sense and the PDFs help me understand what the layers are used for.
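The conversion itself is the easy part thanks to GDAL’s Python bindings. A minimal sketch with hypothetical file names; any OGR-readable vector source would work the same way:

```python
from osgeo import gdal

gdal.UseExceptions()

# Convert a legacy vector dataset to GeoJSON for archival.
# RFC 7946 GeoJSON expects WGS84 coordinates, so reproject on the way out.
gdal.VectorTranslate(
    "parks.geojson",   # hypothetical output
    "parks.shp",       # hypothetical input
    format="GeoJSON",
    dstSRS="EPSG:4326",
)
```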
One thing I’ve ignored through all this: what to do with those MXDs that I cannot open at all? While I do have PDF versions of them, I have no tool to open them on a Mac, and even if I could, the pathing is probably a mess anyway. It brings up the point that the hardest thing to archive is cartography, especially when it is locked in a binary file like an MXD. At least in that case it isn’t too hard to find someone with an ArcMap license to help me out. But boy, it would be nice to have a good cartography archival format that isn’t some CSS thing.
-
1:24
Esri Community Maps Data Sharing
sur James Fee GIS BlogI’ll be honest, I really don’t follow Esri as closely as I used to. It isn’t that I don’t care to learn about what they are working on, more that they do so many things these days. It’s honestly hard to follow along sometimes, but every once in a while I see something that catches my eye.
Esri is now offering a new option in our Community Maps Program for contributors to have Esri share their data with selected Esri partners and other organizations (e.g. OpenStreetMap) that maintain popular mapping platforms for businesses and consumers. If contributors choose to share their data with others, Esri will aggregate the data and make it available to these organizations in a standardized way to make the data more easily consumable by them and accessible to others. It will be up to those organizations whether they choose to include the data in their mapping platforms. Where the data is used, attribution will be provided back to Esri Community Maps Contributors and/or individual contributing organizations.
Community Maps Data Sharing
I have to admit this intrigues me. Not so much that Esri is trying to insert themselves into a process, but that it makes sharing data easier for users of Esri software. In the end, that’s probably more important than philosophical differences of opinion about closed fists and such. The data is shared under the CC BY 4.0 license that Esri uses for the Community Maps AOIs. I really like this; anything that makes sharing data easier is a good thing for everyone, including OpenStreetMap. I’m sure we’ll hear more about this during the Esri UC later this month, but it’s still a great announcement. I’ve always been a big user of OSM, and getting more organizations to update their data in OSM is a huge win in my book.
-
17:00
It is Different With COVID-19…
sur James Fee GIS BlogI started blogging in May of 2005, right before Katrina hit and everything we knew about GIS disaster response changed. Katrina was the moment the static PDF map gave way to map services that ran in almost any modern (at the time) web browser. Immediately, every GIS map server out there became irrelevant at best, dead to the world at worst. Remember, though, Google had bought Keyhole (which became Google Earth) almost a year before Katrina, and Google Maps didn’t launch until early 2005. The tools that created this disaster response revolution were in place, but not many people used them or had even heard of them. Yet less than 6 months after Google Maps hit the web, Katrina response was almost entirely driven by their tools.
Remember this? Don’t try and pan!
If you look at my blog entries from that September and October, you can see attempts by Esri, Microsoft, Yahoo! and others to address this new paradigm of mapping, but none of them stuck. Everyone, and I mean everyone, was using Google. Esri ArcScripts back then probably had 50 tools to convert SHP to KML or MXD to KML. We had tools like Arc2Earth that specialized in making maps easier with Google. And while Esri tools were still being used to generate the data, the display was happening on other platforms.
This of course gave rise to the Neogeography revolution. I’ll spare you the bare-breasted Andrew Turner graphic, but at this time we had so many people doing things with GIS who had no idea what GIS was, let alone what Esri was. The barriers to getting started with mapping dropped; all you needed was a computer and a text editor to make a map. My blog is littered with examples of Neogeography, from EVS Islands to all that great Flickr mapping that Dan Catt and crew did back then. People didn’t ask for permission, they just did it. It all culminated in what I consider the greatest crowdsourced disaster mapping effort, the San Diego wildfires back in 2007 (feel free to choose the Haiti response over this, that’s fine; I really like the example of using Google My Maps in your backyard).
In all fairness, Andrew wasn’t literally saying it killed GIS.
But something happened after this. It isn’t that people stopped mapping; look at OSM’s growth, the amount of crowdsourced data continues to grow exponentially. But responses to disasters seemed to be run by Google and Microsoft themselves. Tools like Google My Maps continue to exist, but I truly can’t recall using one in the past 10 years. Or, if the disaster was not interesting enough for Google, you’d see people using government websites to get that information. Esri’s mapping had finally caught up to the point that people would use the fire maps from the DOI and other three-letter agencies without complaining. The citizen effort moved to Twitter, where it continues to show great promise, just not as a Google My Map. Take a look at the Bush Fire here in Arizona on Twitter. So many great posts by people, but the maps are either static images or links to traditional InciWeb maps.
12 News viewer Timm Chapman shared this photo of the Four Peaks glowing from the growing #BushFire. MORE: [https:]] #BeOn12 pic.twitter.com/HFJox1Wwo0
— 12 News (@12News) June 18, 2020
This brings us full circle to COVID-19 mapping. Think of the best and most up-to-date COVID websites: they are built on Esri technology. Google has websites, Microsoft has them too, but the Esri dashboard has finally had its moment in the sun. I wonder if this is because the market has matured, the tools have matured, or the dataset lends itself to a more scientific approach to display rather than simple lines and points. The Johns Hopkins COVID-19 Maps & Trends website is the bible for this epidemic.
GIS is no longer a sideshow in this response. I’m guessing that because this is more structured government data, Esri is uniquely positioned to be in the middle of it, but even then, their tools have come a long way from the ArcIMS/ArcWeb madness that we dealt with during Katrina. The COVID-19 dashboard is the opposite of Neogeography, and that is OK. The influence of citizens on mapping clearly shows in the Esri tools we deal with today. They still drive me nuts from time to time, but let’s be honest, they really do work for this situation. As we close out the first half of 2020, hopefully we can keep the need for disaster response to a minimum.
-
19:13
Facebook Acquiring Mapillary is More Than You Think
sur James Fee GIS BlogI’ve been working on this blog post all weekend and I’ve rewritten it many times. It comes back to the confusion about why Mapillary and Facebook are now part of the same team. I wrote down about 10 guesses as to why Facebook decided it needed Mapillary, and needed them now, but Joe Morrison did such a good job outlining many of them that I’ll share it here. Go read it and come back when you’re done; I’ll wait.
Welcome back. Now, what do I think about this? Hard to say honestly; I can talk myself out of any idea. Get back at Google? I don’t think things are that emotional. Sure, they probably should own their own mapping solution, since sending all their users to another platform leaks their secret sauce and is probably a boon for Google. But this isn’t something they haven’t already been working on, and I can’t see how, as amazing as Mapillary is, it moves the needle on this at all. Any work toward a Facebook Maps platform has been done and is probably close to shipping. I could see the amazing Mapillary team being an acqui-hire that helps in the long term, given their expertise with OpenStreetMap.
Computer vision, AR/VR and the rest *could* be a reason, but remember that Facebook owns Oculus and has done so much in AR that Mapillary is a rounding error here. While Oculus has not paid out the way I’m sure Facebook hoped it would, the engineering and development teams there have clearly influenced Facebook. Mapillary, as amazing as those guys are, just doesn’t have the horsepower that the existing AR/VR/CV teams at Facebook do. Again, maybe an acqui-hire.
The place database is of course the holy grail of mapping. Maps are a commodity, but places are not. And let’s be honest, there are very few companies with better place data than Facebook. They might not have had street-level imagery, but they sure had more pictures of venues than almost anyone else. I get that people like street view data, but how often do people really say, let me see a street view image from 2011, when they are looking at directions? THEY DON’T. Street view is the coffee shop mapping example: it sounds interesting and looks great in demos, but in the end it is not as important as a 3D world built from satellite imagery and lidar. But wait, that’s where Mapillary does come in.
The most likely reason, I feel, that Facebook bought Mapillary is their expertise with OpenStreetMap and OpenSfM. Facebook is one of the largest users of OSM out there, so bringing in a group that is as experienced with OSM, if not more so, helps move the needle on their mapping efforts. The second thing Mapillary brings is their skill at making 3D worlds out of imagery. As I said, who has better pictures of venues than Facebook? Start stitching those together and you get an amazing 3D city that is updated faster than driving stupid cars down streets. Encourage people to take pictures and they update the 3D world for you. That, and they get some of the best OSM ninjas out there, all at once.
Now what happens to the crowdsourced data? Will people continue to participate, given there are few companies more reviled for data management than Facebook? That is what I’m most interested in: does Mapillary the product continue? Time will tell.
-
4:41
Shifting Gears
sur James Fee GIS BlogToday was my last day at Spatial Networks, which many of you know as the creator of Fulcrum. Back in early 2019 when I left Cityzenith, TQ asked me if I would join the team to help out with the Professional Services. I could list all the great people here I worked with but you all know them already so just take this as my thanks to them for the great time. I wish everyone there the best and hope they continue their journey toward something amazing.
Vacations aren’t what they used to be
Usually when I leave a company I take a vacation (Hawaii for WeoGeo, my honeymoon for AECOM, and snowboarding for Cityzenith), but between COVID and the weather, I’m staying home. My wife joked that I always try to go to Hawaii after a job, but alas, not this time. That’s OK, because I’m not interested in waiting around for my next job; I’m actively looking. Summer is here, and I’d rather be working on something amazing than sitting out by the pool. So if you’re looking for someone to help you, send me an email.
You can also sign up for my newsletter, I’ve got the next one coming out tomorrow morning!
-
16:25
Automation or Scripting
sur James Fee GIS BlogWhen I think back to my first exposure to GIS, it is through ARC/INFO. Just me and a command line. Everything was written in AML, which made everything I created a script, or even an app if you use the parlance that seems popular these days. I’ve beaten the drum about scripting and GIS so much on this blog that I don’t need to rehash it, except to say that if you ain’t scripting, you ain’t living.
But is scripting as important as it once was? I scripted AMLs because that was the only way, short of typing in commands one at a time, to build anything, and you sure as heck couldn’t visualize anything without AML (well, you could, but not in any way you’d share). Do we script as much anymore? I was looking at the automations in my life last night, and there is so much I use Zapier for that there really isn’t anything in my house that happens without a trigger. Today we use words like “automate your workflows” rather than scripting, but that is just the low-code ontology that has permeated our vocabulary.
Regardless, the future of GIS is not scripting, that is, writing Python or JavaScript and then running a file to see a result. It will be taking triggers and attaching them to actions to see results. The best part is that none of it is hard-coded to anything; triggers just wait for something to happen and then do something.
You just take a trigger and attach an action.
GIS really is set up for this; almost everything you do is an action. The trigger is your mouse button, but do you really want to be clicking your index finger all your life? Don’t be sad, though: this future doesn’t devalue your experience, it lets you bring it to where it is needed. The output of GIS is more likely to be Salesforce or a BI tool than a PDF moving forward. That’s the biggest win for everyone.
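To make that concrete, here’s a toy sketch of a trigger wired to an action: a loop that watches a hypothetical drop folder and fires on every new GeoJSON file. In real life the trigger would live in Zapier or a cloud function, but the shape is the same:

```python
import json
import time
from pathlib import Path

WATCH = Path("incoming")  # hypothetical drop folder

def action(path: Path) -> None:
    # The "action": report how many features just arrived.
    with path.open() as f:
        fc = json.load(f)
    print(f"{path.name}: {len(fc.get('features', []))} features")

# The "trigger": poll the folder and fire the action on anything new.
seen = set()
while True:
    for p in WATCH.glob("*.geojson"):
        if p.name not in seen:
            seen.add(p.name)
            action(p)
    time.sleep(5)
```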
-
23:41
15 Years of Spatially Adjusted
sur James Fee GIS BlogHard to believe Spatially Adjusted gets its learner’s permit next year, but it’s true. Hard for me to believe that I was sitting on a ranch outside Brownwood, TX (on AOL dialup, no less) thinking about how to learn more about open source GIS software. For reasons I cannot remember, I thought, why not blog about it? This blog has been in my life for so long I really can’t recall what I did before I had it. But hey, I’m happy to have written all these blog posts, even the bad ones, because I have learned so much.
Me taking the time to post only the best ideas…
I can’t even imagine what the next 15 years will be like, but we’ll leave that up to the future. While I don’t post here as much as I used to, feel free to subscribe to my weekly newsletter below, where I attempt to keep up with my off-base opinions.
Spatial Tau Signup
https://www.getrevue.co/profile/james@fee.fm -
21:02
PostGIS in Action – Third Edition
sur James Fee GIS BlogIf there is one book I’d recommend to anyone in our industry, this is it. Way back in 2009 I wrote about the first edition:
Looking at the table of contents reveals that this should be the book for learning how to use PostGIS in your GIS applications. I’m really interested in Chapter 13, “First look at WKT raster”.
Of course that book sat on my desk for years, and it was eventually updated in 2013. But that was over 7 years ago; technology changes and so has PostGIS. You can now get the long-awaited 3rd edition in the Early Access Program. I’ve started reviewing it, and there is so much in there that this is going to be a significant update. PostGIS 3.x should be a big deal in it, as well as PostgreSQL 12.
Buying this book is a no-brainer for anyone.
-
14:01
Italian Baseball Stadiums in GeoJSON
sur James Fee GIS BlogI’ve been cleaning up all the GeoJSON-Ballparks records this past month. While the MLB stadiums and many of the AAA minor league parks have been updated, the international and small market teams have not; some were out of date by almost 5 years. Long-tail baseball stadiums are what they are, and I’m working on automating much of this moving forward. The last two leagues I’m updating are the Italian Baseball League and the German Bundesliga. I hope to finish Germany tonight, but I did post Italy yesterday.
Ballparks of the Italian Baseball League in GeoJSON
While Italy can’t go out and enjoy baseball just yet, at least their top-tier baseball league has been mapped. If you’re looking for some live baseball, check out the streaming Korean KBO League (I’ve been watching the Giants, of course). The next live stream will be on March 25th at approximately 7:40pm PDT.
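If you’re curious what a record looks like, here’s a minimal feature in the GeoJSON-Ballparks style; the coordinates and property names are illustrative, not copied from the actual repository:

```python
import json

# Illustrative only: a made-up point feature for an Italian ballpark.
ballpark = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [11.3426, 44.4949]},
    "properties": {
        "name": "Example Ballpark",
        "team": "Bologna",
        "league": "Italian Baseball League",
    },
}
print(json.dumps(ballpark, indent=2))
```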
-
4:03
No More Boot
sur James Fee GIS BlogSo the doctor said the boot can come off my foot. Things are looking really good, and I can’t tell you how happy that makes me.
So I’m no longer clomping around all over the place, which should make my wife much happier.
No, this wasn’t me exactly, but imagine the sound and you’ve got it. Those hardwood floors I put in have an amazing echo…
Well, I’m not out of the woods yet. I still can’t do much beyond walking, and even that still hurts. But at least I don’t wake the baby girl as I walk around the house. I see the doctor in 3 more weeks (assuming the world doesn’t come to an end) and then maybe I start therapy.
-
4:37
Bing Maps and COVID-19
sur James Fee GIS BlogUpdate (03/23/2020): It looks like Microsoft has made some good changes. They now tell you when the data was updated and the news links work much better. I can see myself using this over other maps because it is so simple. Simple is quick.
There is no shortage of COVID-19 maps online. EVERYONE has one, so why even bother posting another? Well, I looked back on my blog posts and the last time I posted anything about a Microsoft map was back in 2006, when it was called Virtual Earth. So here it goes, the Bing COVID Tracker.
We finally Binged COVID-19
According to the info section, the data comes from the CDC, WHO, ECDC and Wikipedia. It is a pretty bare map, and if you didn’t see the Bing logo in the upper left or the Microsoft credits in the lower right, you might not even think it was a Microsoft product. There is no indication of how often the data is updated, but as I check at the time of this post, it appears current within the past hour. If you click on a state you get information about the cases and links to news articles about COVID-19 in that state. Pretty basic, as you can see from the Arizona view below.
COVID and Biden…
This feels more like a mashup than a multi-billion-dollar company’s attempt to educate the public about the threat of COVID-19. A real shame, as events such as this usually bring out the best in technology; this attempt by Microsoft feels very dated. At least I got to post my first Bing map….
-
15:00
Self-quarantine Blog Cleaning
sur James Fee GIS BlogI’ve spent the last week cleaning up old blog posts. You won’t see the changes live just yet; I have a dev version of this blog that I’ve been playing around with. What I’ve done is search all posts for any links that no longer return an HTTP 200 status and then either find a corresponding version in the Wayback Machine or, if no source exists, mark up the post (using CSS) to indicate that the link is no longer valid.
Touching 2400 blog posts is dangerous stuff…
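The check itself is easy to script. Here’s a rough sketch of the idea using the Internet Archive’s public availability API; the function name is mine and the error handling is minimal:

```python
import requests

def fix_link(url: str) -> str | None:
    """Return url if it still resolves, else the closest Wayback snapshot."""
    try:
        if requests.head(url, allow_redirects=True, timeout=10).status_code == 200:
            return url
    except requests.RequestException:
        pass
    # Ask the Internet Archive's availability API for the closest snapshot.
    r = requests.get(
        "https://archive.org/wayback/available", params={"url": url}, timeout=10
    )
    snap = r.json().get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None
```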
I’m also cleaning up the categories and tags, which I think have little value anymore. At one point people subscribed to tags or categories in a blog, but search has really taken that over, so the complexity of tagging and categorizing posts doesn’t make sense anymore. I mean, I like to think someone is coming to my blog and saying, “Hey look there, he’s got a category for Virtual Earth”, but I seriously doubt it.
I’m also playing around with AWS Lightsail. I’ve been using it for this dev version of the blog and I might try it with production. Linode seems to be cheaper, but I like having more of my things on AWS rather than a little bit all over the place.
I hope to push the changes to the blog up this week, given that I have almost 2,400 blog entries with over 400,000 words. Checking old blog posts has really shown me how much we’ve lost: so many people, blogs, and so much information are gone forever. This is a shame, because I learned so much by reading what others had written. While not every one of those 400,000 words has value, I hope the majority of this blog helps people, and in turn, preserving what I’ve written will always be a priority.
I had to have written something at some point worth saving…
-
22:49
Surreal Saturday
sur James Fee GIS BlogIt’s hard not to read about all the doom and gloom (rightly so) that is about to come down on this great country. I’m expecting next week to make last week look boring. I’m still stuck in the house with my foot, but the weather finally cleared today, and I’ve been sitting on the patio looking at the golfers on the 9th green. Whatever madness is going on elsewhere, right here the world is normal and beautiful. Boy, do I hope it stays that way.
9th green on the Arroyo Course at Gainey Ranch GC
-
17:00
Friday the 13th
sur James Fee GIS BlogSo what a week, am I right? I’m trying to put everything into perspective but I can’t. I just went down to the Safeway to get some beer and it was like the end of days. Judging by the aisles, apparently I should be stocking up on beans and tuna fish. To top it all off, we’ve had more rain this past week than I can recall. Thanks, California.
Since there is nothing to do anymore, and I probably shouldn’t be around people right now, I think I’m going to spend this weekend cleaning up my old blog entries and fixing dead Google links. I’ve moved this blog across so many hosts and blogging engines that many entries return 404s. I’m also going to try to relink dead links using Archive.org so the context of what I was linking to still works. This is a daunting task, of course; I have almost 2,400 blog posts to go through.
Stay safe everyone!
-
15:28
Foot Surgery Self-quarantine
sur James Fee GIS BlogI felt sorry for myself with my achilles surgery. That I couldn’t go out, see spring training games, have brunch in crowded restaurants with all the spring tourists.
No, I picked the best time to get my foot operated on. I’m able to give it the rest it needs because there is no where to go. I feel lucky that I was able to time this so perfectly for myself and my family.
Please, everyone. Stay safe!
-
3:46
The Story Behind Earthgoogle
sur James Fee GIS BlogIf you search my blog you’ll find an interesting post titled earthgoogle. Well, it really isn’t that interesting; it just has a link to download Google Earth and a link to my blog. So what is this thing, and why does it have such a weird title?
For those that might not remember, 2005 was a crazy time for GIS blogs. Katrina brought satellite imagery to everyone and people searched the internet for ways to find out more. Google Earth was probably the easiest and best way for the average person to learn more about satellite imagery and get some really helpful tools to mark up the area.
This was about as amazing as anything anyone had seen outside of our industry.
Around September 2005, I noticed a lot of people arriving at my blog via the search term “earthgoogle”. So, like most people who blogged back then, I loved to talk about blogging, and I created a simple blog post asking what this was all about.
To all those reaching this site using MSN search with the term “earthgoogle” hello. You’ve been filling up my server logs with this request. I’m curious why you’ve typed this in to only MSN search and not Google/Yahoo/other search engines.
So obvious, right? MSN users, not typing a URL correctly. Anyway, what this blog post of mine actually did was make that page the number one result in Google for the search term “earthgoogle”. I got so much traffic by being the way most people who didn’t understand how URLs work found Google Earth. Eventually I changed the page to what you see now.
I put Google AdSense on that page too. I mean everywhere (I really wish I had taken a screenshot, because it was so tacky). The result of all that tackiness was that I was making over $1,000 a month in ad revenue from that blog post alone. People who wanted to find “earthgoogle” apparently also like to click on ads.
Eventually the traffic died down; people stopped being directed to my blog via searches for “earthgoogle”. I probably pulled the ads off the blog in 2006 and couldn’t have cared less. But the page remains, a reminder of how crazy Google Earth was back in 2005.
-
6:22
Software That Changed Your Life – 2020 Edition
sur James Fee GIS BlogWay back in 2006, I wrote a blog post called Software that Changed Your Life.
Well that might be a big title for this post, but I was talking with some folks over the weekend about software you’ve used or software that has really influenced your life. I think many people say Google Earth has changed how they view data, but for me it really wasn’t that impressive since Google Earth is more of a validation of what we’ve done over the years than a life changer
I thought it would be fun to look at how things have changed since then. My job is very different; I can’t remember the last time I created a map or changed cartography in a mapping product. I think one can look at that 2006 list as how I got to the point where I’ve lived the rest of my life. So here is the updated list:
- HyperCard – I just can’t stress enough how much this changed my life. The concept of a database and visualization, the scripting language on the back end, and everything that eventually became the web (buttons, forms, etc.) on the front. I’d like to think I would have learned to program some other way, but teaching myself HyperCard is exactly how I got to where I am today.
- BBEdit – to this day I still use BBEdit. I think I purchased my first copy back in about 1994 and I’ve used it probably every day since. I’m sure I’ve tried every text editor; today I use BBEdit, VS Code and of course Nano, yet I find myself in BBEdit more than anything else. I taught myself grep using BBEdit, and probably after hypertext markup languages, grep has done more for me than just about anything. From JSON to Python, from CSS to GeoJSON, from JavaScript to Perl, I write it all right here.
- Perl – I was going to put JavaScript here. I probably should have put JavaScript here. But I have to be honest, the language that got me thinking about scripting was Perl. I rarely use it anymore, other than pulling some script out of a folder and running it as a one-off; I use Python or JavaScript for my scripting now. But from the time I bought the first edition of Programming Perl, I was hooked.
- PostGIS – another one I thought about. Elastic? MS Access? DBF? SQL Server? I mean, which database should be the one that changed my life? It has to be PostGIS. Without it I would probably have put MySQL right here. But no, it’s PostGIS. The reason this blog was created was to learn more about PostGIS and how to get that damn thing installed on Windows Server. Some day on my newsletter I’ll write about the impact of Simple Features for SQL. From the moment in 2005 when I got PostGIS working until today, I’ve always had PostGIS running somewhere near me.
- Safe FME – sadly, I don’t use FME anymore. But let me be crystal clear: there is no better tool out there to help you manage data. I probably should find myself a copy and run it again. At WeoGeo we used it for everything, and I’ve used it at architecture firms, engineering firms, startups and everywhere in between. Data is agnostic, and using a tool that is agnostic too helps keep the integrity of that data. Before FME I spent so much time trying to keep all the data in one format and one projection (I was young, let me be), but once I could drag a reader onto a workspace, throw in a transformer and connect that to a writer, I was hooked. FME should be standard issue for any true geospatial data user.
Some other software that didn’t make the list but could have? GDAL/OGR, Tippecanoe, ArcGIS, Excel, Google Earth and Photoshop. Such a personal list, and one I think changes over time. I think the core of what makes me who I am is up there, but it is also in that 2006 list. For fun, you can look at the Way Back Machine and see the comments on that blog post: Sean Gillies, Morten Nielsen, Brian Timoney, Steve Pousty, Bill Dollins, and others.
Don’t forget to subscribe to my newsletter SpatialTau; the first edition goes out tomorrow morning, and every week on Wednesdays moving forward!
-
3:18
Back in the Day
sur James Fee GIS BlogThis is probably 80% of what I remember about ArcView 3.x. I also remember VTab and FTab.
At least we could open the APR in Notepad….
-
20:50
PHXGeo
sur James Fee GIS BlogI started the PHXGeo Meetup group way back in March 2013. It actually wasn’t the first version of PHXGeo; that one started in November 2010, and that group put on the first and only WhereCampPHX. You can see what it looked like back then. When I left WeoGeo in 2013, I decided to move from that WordPress site to Meetup to better help run the group. Eventually I turned over management of PHXGeo to Ryan Arp sometime in 2016 and set off to the land of CAD and BIM. He’s done a great job keeping it running, along with the help of some great volunteers, which warms my heart.
There was a meetup last week, but alas, with my foot I was going nowhere. But I am excited to engage with the local geography group again and hopefully help them continue to grow it. If you live in the Phoenix metro area, make sure you join the meetup group so you can learn more about what they are up to and when the next meetup will be. Don’t forget, State of the Map US 2020 is in Tucson this year, so there is even more reason to get excited about geo in Phoenix.
I still control the PHXGeo Twitter account and domain, so if you aren’t a meetup kind of person, keep an eye on those; they’ll be updated with the meetups.
-
21:07
Walking Boot
sur James Fee GIS BlogSo the good news is I’m walking after my surgery. The doctor said things are looking good, but I’m still very sore. He bent my Achilles forward to test its strength and it hurt so badly I almost passed out. I remembered my g-force training and… well, I didn’t remember anything, because whatever he told me to do I forgot by the time I got home, and I had to call him back.
Anyway, I’m klunking around in a boot, and I finally got to take a real shower. Small victories, I suppose. Maybe this was perfect timing for my surgery; I’m in my own little quarantine since I can’t drive and can’t walk more than 100 yards without getting sore.
Spring training continues without me…
-
0:40
The Esri 2020 Dev Summit Has Gone Virtual
sur James Fee GIS BlogLet’s be honest, there is a bit of love for the Dev Summit. Those of us who attended the first one look each other in the eye and do that subtle nod, knowing we were part of something amazing.
Now, the funny thing about the Dev Summit: I don’t think I’ve been back since 2009. I went a different direction with my career after WeoGeo, and while I don’t miss the Web ADF, I do miss the Dev Summit. Well, the in-person 2020 Dev Summit has been canceled.
Due to the continuously evolving circumstances surrounding the coronavirus, the Developer Summit will be a virtual event and not a live, in-person conference this year. This was a difficult decision, made after careful consideration for all registered attendees and Esri staff.
Makes total sense. The Business Partner Conference is still going to happen, and they are taking precautions:
The events team is working directly with all of our venues to provide readily available hand-sanitizing stations. Alcohol wipes will also be distributed at various locations throughout the event. The custodial personnel will be regularly disinfecting all common surfaces. Information regarding basic health practices will be displayed on signage that recommends how to avoid the flu and other illnesses.
Boy, I can’t imagine going, given the BPC is going to be such a small thing, but maybe that is what makes it manageable. I know a couple of people who told me they weren’t going to attend the BPC this year even before the Dev Summit was canceled, because of COVID-19.
But let’s not focus on the bad; let’s focus on the great outcome of this. They are still going to livestream the plenary as usual and make the sessions virtual. I’m still waiting to see what this looks like, but it really could be useful. I’m not saying that conferences don’t have a place in today’s workplace, but having a virtual option helps immensely for those who just can’t break away to learn the latest technology from Esri.
-
5:09
Microsoft Geo
sur James Fee GIS BlogI still see projects now and then that are spatial. I think of the US Building Footprints project and how they had to give the data away, unable to monetize such a project. Bing Maps went through so many name changes that we can’t even recall them all. Heck, Microsoft bought Nokia, but only the phones; they didn’t buy HERE (Navteq), which could have been a great coup for them.
Visual Studio Code… Someone needs to check some files into Git.
I have to admit, I’ve been a BBEdit user since about 1994, but I’ve found myself using Visual Studio Code much more. If I search my blog posts over the years, they are littered with Visual Studio hate. But now I find VS Code to be my go-to editor, not only for programming but also for editing GeoJSON files.
But this really has me thinking: Microsoft and geo really has died. I’m not saying that SQL Server isn’t used for spatial queries, or that I don’t occasionally see Bing Maps used in apps. But they have become such an also-ran that I couldn’t even recall the last time I used the Bing Maps API, let alone SQL Server (I actually do recall, and it was SQL Azure back in 2016). For a company that has truly reinvented itself, they have fumbled away what little they had in spatial.
I’m sure I have a screenshot of Bing Maps, but I didn’t search for very long.
While at Cityzenith, we dealt with CityNext, which had much sway in the Smart City space but little depth. I think it was just an excuse to get their name on Smart City conferences.
I have to tip my hat to Microsoft for many things, but in our space they have become at best a follower, at worst an also-ran.
At least old Gil is trying…
-
4:58
The Newsletter is Back
sur James Fee GIS BlogAs I announced on SpatialTau, the newsletter is back. Before I go on, click and subscribe.
So as I mentioned on that first edition (well I guess it is the 3rd first edition of the newsletter), I’m going to do my long form writing on the newsletter. It will come out every week on Wednesday. The blog isn’t going away. As you’ve seen, I’ve started blogging again. Think of Spatially Adjusted as my relief valve. The place where I let go the thoughts without spell check or figuring out if the Simpsons GIF makes any sense. I’ve been blogging daily, but I suspect it will devolve into couple times a week.
The newsletter is where I want to talk about the industry more, getting at the why. I look at it as the book I’ll never write. I hope to get more into the why rather than the how; I figure there are so many people these days who do the how better than I do, so I’ll leave that to them. I know that’s not always what you see here on this blog, but that’s what I want to write. Maybe someone will print some PDFs and throw them in a 3-ring binder one day.
I won’t be linking to the newsletter much here, so if you want to follow along, please subscribe. I’ll be cleaning up the format a little this month and hope to settle into a nice rhythm. I think I’ve got my bases covered: this blog, a podcast and a newsletter. I’m just a Renaissance man….
-
0:39
Toolkits
sur James Fee GIS BlogBill and I have a podcast that we do almost once a month. Podcasts are a lot of fun because you can talk about things more easily than you can write about them; there is a free flow of ideas (or maybe garbage) out of your head and onto an mp3. One topic we talked about months ago was GIS clients and the tools we use. I just happened to be listening to that episode last night, and I realized maybe I wasn’t as honest with myself as I should have been.
GIS users, if you need a friend, get a dog.
I’m not a GIS user…
Fair point though, what is a GIS user? I think of it as someone who uses GIS software. But even that is somewhat of a mess, because one person’s GIS software is another person’s toolkit. Ignoring that issue for a second, what do I use for GIS?
- GDAL/OGR
- Turf.js
- Elastic
- PostGIS
I think that pretty much covers it. There is some Shapely and a few other libraries, but that short list is all I use anymore. That of course has a lot to do with my job; if I were GIS Manager at the City of Townsville I might need other tools, but that list above is pretty much it. I can’t help but think of these things as toolkits rather than GIS software. They are all part of a deeper workflow that I use when I need it. The end result is never QGIS, ArcGIS, uDig or whatever madness you use in your daily life. It is either GeoJSON or “database” (where database could mean a lot of things).
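As an example of what “toolkit that ends in a database” looks like in practice, here’s a minimal sketch that asks PostGIS for everything within 10 km of a point and gets GeoJSON straight back. The connection string, table and column names are hypothetical:

```python
import psycopg2

# Hypothetical database, table and column names.
conn = psycopg2.connect("dbname=gis")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT name, ST_AsGeoJSON(geom)
        FROM ballparks
        WHERE ST_DWithin(geom::geography,
                         ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                         10000)  -- within 10 km
        """,
        (-112.07, 33.45),
    )
    for name, geojson in cur.fetchall():
        print(name, geojson)
```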
God made men. Men made proprietary software systems
This blog is about to have its 15th anniversary, and I can’t think of a better example of how things have changed since that moment. I also think GIS lends itself to this workflow-oriented environment anyway. Ignoring the crazy ArcGIS Desktop years of wizard-based GIS, GIS has mostly been about scripting workflows to accomplish your needs: Fortran, AML, Python, you name it. We use these methods not only to get results but to document them. In the end I think all the tools we use for GIS are toolkits, not software. Yes, one must put a name on something, but GIS has always been about toolkits, even in proprietary workflows, and always will be. Maybe when we check in right before I retire in 2035 we can see how we are doing with this.
My guess? Still using toolkits.
Toolkits are a “real genius” move…
-
1:00
A Calendar is the Last Great Frontier
sur James Fee GIS BlogWorkflow-wise, I’m pretty set on many things. I love the Studio Neat Panobook and Totebook for note taking; both are amazing and I can’t see myself ever leaving them. For task management, I used to live in Things, then Todoist, then Things again. But that process taught me one thing about todo lists: they are the worst way to measure your life. They create anxiety and unrealistic goals because they are islands unto themselves. The fix for me was creating a daily goal list the night before and putting it in my Panobook. Then I can check those off with a pen and life feels wonderful. For those reminders that need a little reminding, I just use Apple’s Reminders app, which does all I need.
Email is awful, but there is no help in sight (maybe Hey will solve this). I suppose Slack owns the chat space. But the calendar is what fails me every time. I’ve tried all the products: Fantastical, Google Calendar, Timepage, you name it. The simple fact that I can’t list more than a couple here really shows what the problem is. There is no innovation happening.
I feel like I use a calendar in a few ways. I put up events such as “Meeting with Bill” or “Connor’s Lacrosse Game”. I put up blocks of time: “Work on proposal for the Queen”. I put up out-of-office notifications: “Vacation in Hawaii”. The problem with calendars as they are used today is that they treat all three of these the same way: start time, end time, notification. Then we throw up some blocks like a Kanban board and try to find time to get it all done.
I feel like an optimal calendar would work the other way around. Rather than having a blank board to put things on, you should be carving out time to work on the things that matter to you. My best work is done in the white space of a calendar and rarely gets recorded (unless I’m charging time to a client).
I probably don’t need a new app for this; I could obviously start putting time in the calendar to capture these moments of focus. I wonder if treating a digital calendar like an old-fashioned paper calendar is where we should be. As I said, it’s a shame this space hasn’t been explored as well as todo and email apps have.
-
15:00
Spring Cleaning During Spring Training
sur James Fee GIS BlogGeoJSON-Ballparks is my favorite data project I’ve been part of, probably because not only is baseball the best sport ever, it is great keeping track of all the changes at ballparks through the years. MLB teams have mostly stopped building new ballparks, so the changes are generally just name updates. This year the only new name was Truist Park. Oakland Coliseum reverted from RingCentral, which it was never able to become because of shenanigans. We do bring on a new ballpark in Arlington, named almost the same as the old one (Globe Life Field vs. the old Globe Life Park in Arlington). Apparently the old stadium has been renovated to XFL standards, so we should probably not call it a ballpark anymore; I just removed it since it is no longer a baseball stadium. I did the same thing with Turner Field.
I plan to review all the Spring Training facilities of the Cactus League and the Grapefruit League and then review the AAA stadiums. We’ll have to see what happens with the MLB/MiLB negotiations. While they don’t affect the actual stadium points (at least in the short term; some of the fields could go away because of lack of support), the alignment of teams in leagues could change. So stay tuned, and if you want to help out with the AAA stadiums, just create a pull request; it would be greatly appreciated!
-
19:50
Where I Have my First Surgery…
sur James Fee GIS BlogSo Friday I had my first operation ever. I had never even had an IV put in me, but eventually life catches up with you. I’m going to be 48 this year, which is still young, but of that age when things start breaking. All those years spraining my ankles playing sports resulted in bone spurs in my ankle, which kept me not only from my daily running but from bowling, and even walking without pain. The lucky part is that this was caught early enough that they were able to try a less invasive surgery called a right gastrocnemius recession. My non-doctor explanation is that they elongate the tendon at the back of my calf, which in turn lessens pressure on my Achilles, which should stop it rubbing against the bone, which would then mean I can run again.
At least I can “step” outside and enjoy the weather.
We’ll have to see if this actually works. I really hope so, because the other surgery (the one that removes the bone) is very invasive and the recovery is long. I’m just hopeful that I can start running again and get back into shape. There are many reasons in 2020 that I need to get back on track.
-
17:00
Google Maps at 15 Years
sur James Fee GIS BlogSo hearing that Google Maps is now 15, you have one of two thoughts: “Boy, that’s a long time” or “Boy, that’s a long time”. It really is a long time; this blog isn’t 15 years old yet (but we’re getting close). I thought it would be fun to look back at my first mention of Google Maps:
… ESRI does include metadata with their ArcWeb Services datasets. Take a look at the U.S Street Map Service metadata page. This information is available for every ArcMap service. But it isn’t just ESRI. Geodata.gov has extensive metadata as well as other providers of data (when you get satellite imagery from DigitalGlobe, they give it to you).
About Google Maps hackers just don’t get it – “It’s all moot”
So of course my first mention of Google Maps had everything that made 2005 amazing.
- Mention of Esri – yea I used to be “the Esri blogger”
- Mention of ArcWeb – boy I think I was the only one who tried to use that madness
- Metadata – what argument in 2005 didn’t have some amazing metadata reference
The funny thing is that nobody cares about metadata in Google Maps anymore. It was a fake issue back then, but in the end, anyone who needs detailed metadata about imagery uses a service that includes that information. The rest of us just use Google Maps.
-
18:18
Cageyjames & Geobabbler on Elasticsearch
sur James Fee GIS BlogBill and I finally were able to sit down and record another podcast. This one was our white whale; we have probably been trying to do this episode since early last summer. But it is done, and I think it is a great introduction to Elastic for those who are interested in learning more.
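If you want a taste of why Elastic keeps showing up in my toolkit, here’s a minimal geo_distance query sketch against a hypothetical local node with an index of places; nothing from the episode, just an illustration:

```python
import requests

# Hypothetical local node, index "places", geo_point field "location".
query = {
    "query": {
        "bool": {
            "filter": {
                "geo_distance": {
                    "distance": "10km",
                    "location": {"lat": 33.45, "lon": -112.07},
                }
            }
        }
    }
}
r = requests.get("http://localhost:9200/places/_search", json=query)
print(r.json()["hits"]["total"])
```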
-
23:41
Revisiting Twitter Lists
sur James Fee GIS BlogBack in May I tore up my Twitter and put everything into neat lists. When all was said and done, I had 10 lists with everyone in a neat little bucket. It was beautiful; I could turn to any list and have the twitter hose give me exactly what I was looking for. But 6 months later I have immense regrets, and not for the reason I thought I might.
When I did this, my big fear was losing connections with people and topics. I moved all my college sports accounts into one list, and then I noticed I wasn’t always on top of the news because there wasn’t any cross-pollination. That is, I would swipe between lists, but there were days when I didn’t review a list and would miss important things. It was clear that walling everything off was a bad idea, because I no longer had one feed that rolled up everything I was interested in. A weird thing happened during this experiment: my desire to be free of the noise meant I wasn’t exposed to any noise. A quiet room is comforting until you realize you are not part of the conversation.
Now, the big thing that got me thinking in a new direction was Coleman’s “bestof” list. The best-of idea is perfect because I can bring the best people into one list and let it quickly keep me abreast of the topics I care about. Then I can still drop into my baseball or BIM lists when I feel like a deeper dive. So while flying back to Spatial Networks HQ, I created my “what matters” list, which does basically exactly what Coleman did. I still have my niche lists, but now I have what I was missing, and for all the reasons Coleman liked it too.
But I also realized there was something else I was missing. I had gotten my Twitter follow count below 100. Initially I liked this; it meant I was only following those I really felt mattered, and the rest got put in lists based on their topics (or even in the “what matters” list). BUT this basically broke a part of Twitter that I didn’t think I cared about. The part of Twitter that forces content on you was something I actually used from time to time to find new voices. By giving the beast nothing to churn on, it in turn gave me junk back. So I went ahead and followed 500 people, and what do you know, things are back to normal. While I’m not using the main feed as my way to read Twitter, I can always go there, or to the “For You” section, and see things I might have missed.
In the end, the change was simple: a new best-of list, and following the people that matter back. I’ve enjoyed working with Twitter again, and I can still limit any list I wish when I don’t want noise. I may unfollow some people and add some more, but this seems to be the best compromise. I’m no longer bankrupt as I was with Twitter in May 2019, but I’m also not on my own island. Time will tell if this was a good idea…
-
22:04
GIS for Math
sur James Fee GIS BlogThere was great reflection over Thanksgiving at my house.
I got into GIS partly because I love maps but mostly because I love using math to solve problems. Nothing makes me happier than running some analysis and seeing the results. pic.twitter.com/cnOW0Bk1Ah
— James Fee (@jamesmfee) December 2, 2019
Well, maybe that is hyperbole, but I was asked how the heck I got where I am today. I think I’ve told this story many times before on this blog, but one more time won’t hurt. I was working toward a degree in Economics when statistics classes hit my schedule. I really took to them and tried to take as many as I could before I graduated. One of them was given by the Geography Department at Arizona State University. The name of the course has been lost to time, but I do recall they used SPSS, which I despised. The kicker, though, was that the TA for that class introduced me to Perl, and that was my introduction to the freedom that open scripting tools can give you.
Maps have been something I loved as a kid; like you, I read the atlas and the Thomas Brothers Guide, but math and statistics are what drew me to GIS. SPSS and Perl are no longer part of my toolset (thank god, honestly), but the skills I learned back then still make calculations in GIS analysis much easier for me. Cartography is the tip of the iceberg with GIS; the math is what makes it sing. Don’t forget that.
-
23:32
103 Days Until Spring Training
sur James Fee GIS BlogA good friend of mine texted me this morning with literally this…
So… What do we do now?
Exactly, 103 days until pitchers and catchers show up at Spring Training.
-
20:59
Game 7
sur James Fee GIS BlogWhile it isn’t the Giants and it isn’t an even year, there is always something about a game 7. All these games, all these days and nights, down to at least 9 innings. Unless of course you’ve got something else going on…
-
2:38
CnG Podcast Episode 6
sur James Fee GIS BlogSo Bill and I put out another podcast. This one gets a bit retro when we start talking about the shapefile going away and BlackBerry Maps.
-
19:49
Sidewalk Labs’ Replica Has Spun Out
sur James Fee GIS BlogSome really interesting news in the digital twin planning space from last week:
The newly formed company, which is headed by Nick Bowden, also announced Thursday it has raised $11 million in a Series A funding round from investors Innovation Endeavors, Firebrand Ventures and Revolution’s Rise of the Rest Seed Fund. The capital will be used to accelerate Replica’s growth through new hires beyond its existing 13-person staff, expansion to new cities and investment in its technology.
What makes this interesting is what Replica is:
The Replica modeling tool uses de-identified mobile location data to give public agencies a comprehensive portrait of how, when and why people travel. Movement models are matched to a synthetic population, which has been created using samples of census demographic data to create a broad new data set that is statistically representative of the actual population.
How, when and why people move around a city.
As a planner, investor, or developer, you can imagine how this is really interesting. As the TechCrunch article points out, there are privacy implications, but if this model works and can help plan cities better, we’ll all be better off. Cities are growing at exponential rates and new ones are being built every day. Helping planners make better initial decisions about where and how things should go, or helping them make changes as the city develops, will only improve life for all.
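To make the “synthetic population” idea concrete, here is a minimal sketch in Python. This is entirely my own illustration, not Replica’s actual pipeline, and the categories and shares are invented: you draw individuals from census-style demographic shares so the aggregate matches the real distribution while no record corresponds to a real person.

import random

# Invented census-style shares for a hypothetical city. Replica's real
# inputs and method are far more involved; this only shows the idea.
AGE_SHARES = {"0-17": 0.22, "18-44": 0.36, "45-64": 0.26, "65+": 0.16}
MODE_SHARES = {"car": 0.76, "transit": 0.11, "walk": 0.09, "bike": 0.04}

def synthetic_person(rng: random.Random) -> dict:
    """One synthetic resident: statistically plausible, not real."""
    return {
        "age_band": rng.choices(list(AGE_SHARES), list(AGE_SHARES.values()))[0],
        "commute_mode": rng.choices(list(MODE_SHARES), list(MODE_SHARES.values()))[0],
    }

rng = random.Random(42)
population = [synthetic_person(rng) for _ in range(100_000)]
# As n grows, the aggregate shares converge on the census inputs,
# which is what "statistically representative" means here.

Match observed movement models against people like these and you get travel behavior you can analyze without tracing anything back to an actual resident.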
-
15:30
The iPhone U1 UWB Chip, Digital Twins and Data Collection
sur James Fee GIS BlogOddly enough the biggest news this week from the iPhone 11 introduction by Apple barely got any play. In fact, on the iPhone 11 Pro website, you have to scroll past Dog Portrait mode to get any information about it. Apple describes the U1 chip thusly:
The new Apple-designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 Pro to understand its precise location relative to other nearby U1-equipped Apple devices. It’s like adding another sense to iPhone, and it’s going to lead to amazing new capabilities. With U1 and iOS 13, you can point your iPhone toward someone else’s, and AirDrop will prioritize that device so you can share files faster. And that’s just the beginning.
[https:]]
Makes sense, right? A better way to AirDrop. But there is so much more there: “precise location relative to other nearby equipped Apple devices.” But what is UWB and why does it matter? The UWB Alliance says:
UWB is a unique radio technology that can use extremely low energy levels for short-range, high-bandwidth communications over a large portion of the radio spectrum. Devices powered by a coin cell can operate for a period of years without recharge or replacement. UWB technology enables a broad range of applications, from real-time locating and tracking, to sensing and radar, to secure wireless access, and short message communication. The flexibility, precision and low-power characteristics of UWB give it a unique set of capabilities unlike any other wireless technology.
So that’s really interesting: low energy use, high bandwidth, and very secure. I thought Jason Snell did a great job looking into the U1 on Six Colors:
From raw data alone, UWB devices can detect locations within 10 centimeters (4 inches), but depending on implementation that accuracy can be lowered to as much as 5 millimeters, according to Mickael Viot, VP of marketing at UWB chipmaker Decawave.
That’s pretty amazing. Basically it takes what makes Bluetooth LE great for discovery, secures it, and then makes it faster and more accurate. So we can see the consumer use cases for UWB: sharing files and finding those tiles we’ve heard so much about. But where this gets very interesting for our space is data collection and working inside digital twins. You can already see the augmented reality use case here. A sensor has gone bad in a building? I can now find it with millimeter accuracy. And it’s not just which direction, it’s how far. UWB uses “time of flight” to pinpoint location (measuring the travel time of the signal to gauge distance), enabling it to know how far away something is. Just knowing a sensor is ahead of you is one thing, but knowing it is 20 feet away, that’s really a game changer.
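To put rough numbers on the “time of flight” idea, here is a minimal sketch of the arithmetic in Python. This is my own illustration, not Apple’s or Decawave’s implementation; real UWB ranging happens in silicon with calibrated, sub-nanosecond timestamps.

# Two-way ranging sketch: estimate distance from UWB time of flight.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_m(t_round: float, t_reply: float) -> float:
    """Estimate one-way distance in meters.

    t_round: seconds from sending a ping to hearing the reply
    t_reply: the responder's known processing delay in seconds
    """
    time_of_flight = (t_round - t_reply) / 2  # one-way travel time
    return SPEED_OF_LIGHT * time_of_flight

# A reply arriving ~66.7 nanoseconds later (ignoring processing delay)
# puts the other device about 10 meters away.
print(f"{distance_m(66.7e-9, 0.0):.2f} m")  # -> ~10.00 m

The 10-centimeter figure quoted above works out to timing precision on the order of a third of a nanosecond, which is exactly why UWB’s wide bandwidth, and the sharp pulses it allows, matters.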
You can see this through a little-known app Apple makes called Indoor Survey. Small side note: back in late 2015 I blogged about Apple’s Indoor Positioning App, which ties into all this. Where you really see this use is when you go to the signup page and see how data is brought into the app using a standard called Indoor Mapping Data Format. Indoor Mapping Data Format (IMDF) provides a generalized, yet comprehensive data model for any indoor location, creating a basis for orientation, navigation and discovery. IMDF is output as an archive of GeoJSON files. Going to the IMDF Sandbox really shows you what this format is about.
Apple’s IMDF Sandbox
Basically you see a map editor that lets you really get into how interiors are mapped and used. So Apple iPhone 11 UWB devices can help place themselves more accurately on maps and route users around building interiors. Smart buildings get smarter by the devices talking to each other. Oh, and about IMDF, Apple says, “For GIS and BIM specialists, there is support for IMDF in many of your favorite tools.” I will need to spend a bit more time with IMDF, but it’s basically GeoJSON objects, so we already know how to use it.
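Since it is just GeoJSON, exploring an IMDF archive takes nothing more than the Python standard library. Here is a minimal sketch, assuming a typical unit.geojson file from an archive; the property names follow my reading of the spec and may vary.

import json

# Summarize the units (rooms, corridors, elevators) in an IMDF-style
# unit.geojson. Each IMDF file is a GeoJSON FeatureCollection, so no
# GIS libraries are required.
with open("unit.geojson") as f:
    collection = json.load(f)

for feature in collection["features"]:
    props = feature["properties"]
    print(
        props.get("category"),       # e.g. "room", "walkway", "elevator"
        props.get("level_id"),       # which floor the unit sits on
        feature["geometry"]["type"]  # typically Polygon for units
    )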
The thing about GPS data collection is that it works great outdoors, but indoors it is much harder to get accuracy, especially when you need it. With Indoor Survey, devices can collect data much more accurately indoors because they know exactly where they are. If you’ve ever used Apple Maps in an airport and seen how it routes you from gate to gate, you get an idea of how this works. But with UWB, you go from foot-level accuracy to sub-centimeter. That’s a big difference.
Now we’re a long way from UWB being ubiquitous like Bluetooth LE. Right now, as far as I can tell, only Apple has UWB chips in its devices, and we don’t know how compatible this all is yet. But you can see how the roadmap is laid out: UWB, GeoJSON, and an iPhone 11. Devices help each other get better locations and in turn make working with digital twins and data collection so much easier.
-
0:00
Bad Esri Products are Good
sur James Fee GIS BlogI was having drinks the other day with an ex-Esri employee and we were talking about what Esri products I liked to work with. The short list is right below:
- ArcView 3.x
- MapObjects
- ArcIMS
Arc/INFO might be on that list, but let’s cap it at three. None of them were products that Esri wanted to keep around. All of them were thrust into the marketplace and then poorly supported. I get that Esri wanted everything on the ArcGIS platform (Server being a joke for so many years is proof of this), but being a developer on those platforms was really hard. The transition from Avenue to VB/VBA was particularly brutal. There were books written to help with this transition, but none by Esri.
My trajectory was shaped by the products above being abandoned by Esri. I went another direction after being burnt by proprietary products that, when abandoned, cause huge problems. I think you have two choices: either double down or hit the eject button. I’m so glad I ejected…
-
23:13
Download Your Fusion Tables Data
sur James Fee GIS BlogI first wrote about Fusion Tables back in 2010.
Google Fusion Tables – Are you kidding me? These stuff is “teh awesome”. Fusion tables are going to be more “killer” than Google Maps was. Yup, pay attention.
cageyjames
“teh awesome”? Seriously, who says that? Well, I guess I did, and that’s OK. Was it more “killer” than Google Maps? Obviously not. It’s not that Fusion Tables was wrong; it’s just that there are now so many alternatives that it no longer matters the way it did when it first arrived.
Well if you’re like me, you probably have a lot of data in Fusion Tables and Google just sent out an email explaining how to get it out.
If you created many tables over the years, we’ve made it easy to download all your data in one step with a new dedicated Fusion Tables option in Google Takeout. You can save the rows, metadata and geometries of any base tables that you own, and export this data in the following formats: JSON, CSV and KML.
It’s a really nice tool; I just tried it myself on some baseball data I had in there. Google explains the tool as such:
The data for each table is saved to its own “archive”. The data will be saved in a Google Sheet; for datasets beyond the size limits of Sheets, you’ll get a CSV. This archive is stored in a top level folder called “ft-archive” in your Drive.
A Google Maps visualization is automatically created with the archived data. This map preserves many of the original Fusion Tables styling configurations. Any changes you make to the Sheet or CSV will appear in the map visualization.
A listing of all archived tables is stored in a Sheet. This handy Sheet is called “ft-archive-index” and lives within the “ft-archive” folder. The index Sheet summarizes each run of the archive tool and preserves the visualization URLs with encoded styles. Each time you run the archive tool, you will get additional archives based on the current data in your tables along with corresponding new rows in the archive directory.
You have until December 3, 2019 to get your data out. Google Takeout makes it easy, which is really nice.
-
17:09
Moving the Home Office
sur James Fee GIS BlogMoving the home office is always interesting: you find so much that you’ve done over the past years and just stuck in a drawer or on a shelf. Companies you worked for, Raspberry Pis that were never used, keys to a safety deposit box whose location you don’t recall. But that is what makes moving therapeutic: cleaning out the old, unused parts of your life and focusing on the ones that make you happy. Do I need a puppet of Andrew Turner [1] on my desk? Nope. But I do need the things that make me happy. So now that I’ve boxed up everything but the work MacBook Pro, I feel strangely at rest [2] .
-
11:09
CnG Podcast Episode 5 – Expectations
sur James Fee GIS BlogBill Dollins and I took the show on the road to Spatial Networks HQ after dark to record this one. Lots of thoughts on the expectations users and developers should have for open source projects, as well as the big news that Cesium has spun out on its own. We also talk about machine learning and Google Photos’ ability to automatically create a panorama image.
-
16:09
The GIS Database
sur James Fee GIS BlogI’ve been thinking about GIS data a bit lately, mostly because I’m cleaning off old hard drives I’ve had in my possession to try and consolidate my data (or at least not lose it). Typically GIS data was accessed one of two ways: either from a server through some endpoint, or via a local file store. I can’t open these old ArcGIS Desktop MXDs anymore, but I recall most of the work we did was against the local file store. You know, sitting on the “P drive” and referenced via a file path. We all remember opening up projects and seeing those red exclamation points telling us that data was moved (or the project file was).
It is very easy in retrospect to go back and call yourself batshit crazy for storing data this way (backed up, hopefully, every night to a DLT tape). I mean, think about this for a minute: nothing was versioned. We live in this world of git, where everything I do (including this blog) is stored in a repository where I can track changes and revert if need be. Now I’m not using this post to talk about the need for GeoGig or whatever that project is called these days (I’m not even sure it still exists), but about the realization that GIS over the years has been such a workgroup discipline.
I worked for AECOM, the largest AEC firm in the world. We did some amazing enterprise projects, but GIS was never one of them. It was a small group of GIS “pros” “doing” GIS to support some enterprise project that changed the world. Tacked on, if you will, and it’s not just AECOM that worked that way. Every organization views GIS this way, like “graphics”. Why is this? Because GIS “pros” have let it be this way.
I’m not trying to come up with a solution here because I don’t think there is one. GIS is just very small-minded compared to other professions in the tech space. Even the word “enterprise” has been appropriated to mean something totally different. Just having a web map does not make GIS “enterprise”; in fact, all you’re doing is taking workgroup GIS and making it worse. It is easy to pick on Esri (as I did above), but they’re not the big problem. It’s the implementations that push Esri toward such terminology. That is, it is the GIS “pros” who bring these problems on themselves. Who is to fault Esri for trying to make a buck?
I have made it my professional career to fix broken GIS systems. People always ask me, “What madness you must see trying to undo broken GIS systems,” but the reality is I see some amazing work. Just small-minded implementations. It is easy to make fun of ArcObjects or GML, but they are just libraries that people use to create tools.
This isn’t a call to arms or a reminder that you’re doing GIS wrong; it’s just thoughts on a plane headed across the country while I look at data I created as a workgroup project. I’m sure there are people cleaning up work I implemented in the past; I can tell you there are some bad choices in that work. Technology has caused many of us to lose our humility. And that results in only one thing: bad choices. In the end, this is my reminder to be humble. The good thing is I have no shapefiles anywhere on this laptop. That’s a start.
-
19:09
Mr. Magoo Does GIS
sur James Fee GIS BlogI’m 46. It is weird even typing that. I’ll be 47 later this year, which is even weirder. In my mind I think I’m still thirty-something, but age is starting to creep up on me. I’ve noticed that I need reading glasses to see my iPhone.
The days of small text on small screens so I don’t have to scroll are over. Out go the 11pt and 12pt fonts in my text editors and terminal windows, and in comes 14pt. Fixed-width fonts such as SF Mono and Roboto Mono seem to be easier on my eyes too. Originally I thought the dark mode in many terminal apps and text editors was going to be hard on my eyes, but the fonts above really pop for me on retina screens. That said, dropping down to a non-retina monitor, I have a very hard time reading things. So the quality of the screen and fonts seems to matter more to me than the color of the screen. Right now this is my environment:
For text editing I mostly use BBEdit. I’ve hacked the SF Mono font so it is available for BBEdit to use, and it is set at 14pt. For my theme, I’m using Xcode Dark, which attempts to recreate Xcode’s dark mode in BBEdit.
But I use VSCode as well. There I’m using Roboto Mono and the Dark+ Material theme. It is different from the look of BBEdit, but it works for me in VSCode.
I’ve replaced Terminal.app with Hyper.is, which I’m in love with. I use Roboto Mono and the Hyper-Clean theme. Again, a little bit different from the above, but it just works. 14pt font, of course.
The fact that I have three different themes going on here is proof that I haven’t settled on what looks best for me. I think eventually I’ll pick a common theme and go with it for all three products. A 14pt font works well for me: big enough that my eyes don’t strain, but small enough that I can fit plenty on my screen. If Apple releases SF Mono as a proper system font, rather than a hack, I’ll go with it over Roboto Mono, but honestly Roboto Mono is a great font for me too. We’ll just have to see what happens.
For most other apps I use on a regular basis, such as Evernote, Safari, Chrome, and Slack, I just go with the defaults. Many have a dark mode that mimics the Mac OS X dark mode, and that’s fine with me. If Apple were to allow customization of that dark mode I’d probably be happy, but we all know that will never happen. The last year has been hard on my eyes; it was the first time I found myself holding my iPhone out at arm’s length to read it. What I’ve learned is to embrace a larger font and not strain my eyes. Pride is not suffering because you can’t read like you did 20 years ago; it’s having the will to make the choices you need to continue to be successful. For me, the above works.
-
10:09
Cageyjames and Geobabbler Episode 3
sur James Fee GIS BlogI’m behind but we did another podcast last month.
We focused on the FOSS4G-NA 2019 conference and my choice in surge protectors. Yes, the next episode will be GDAL, so do your homework!
-
14:30
Twitter Lists and Tweet Bankruptcy
sur James Fee GIS BlogWith politics and hatred all over social media these days, it’s hard not to be nostalgic for the Twitter we all enjoyed between 2008 and 2012. I look at my Twitter feed these days and it isn’t focused. It’s probably just like yours: full of bots, yahoos, idiots, and morons. I started looking through who I’ve followed in the past few years and it’s not pretty. I really miss interacting with people on Twitter rather than just posting memes.
So I thought about declaring tweet bankruptcy: just delete the account and start fresh. But that’s not helpful. Sure, I did it with my Facebook account, but let’s be honest, that’s just good practice. With Twitter, though, I don’t want to blow away all my tweets (looks like over 41,000 of them), just reduce the noise. Looking at my follows, there are some basic groups:
- Spatial Networks
- Baseball
- College Sports
- Spatial/GIS
- Programming
- Humor
- News
So then all I need to do is put everyone in lists (or multiple lists), and then I can segregate my Twitter experience to my needs. I spent the weekend going through every follow I had (over 1,200) and moving them into lists. At the same time I culled my follows. I wanted to reduce it down to 200-400 follows. This way my main feed is what I consider valuable, but I can still enjoy conversations with people who aren’t follows.
It really has helped me get more value out of Twitter. When I open Twitter on my phone I get only those accounts that I feel are important enough that I should always see them. They all come out of the lists above. But then on my computer, I can use TweetDeck to have my lists always available, and I can follow work-related news or anything else with ease. The other nice thing is I can follow/unfollow people without worrying that I’ll lose them. They will always be in my lists.
I can’t remember when Twitter introduced lists, but it was a very long time ago. I resisted them because I thought the firehose was the best way to read tweets, but I’m enjoying this much more because I now see tweets, especially in my Spatial/GIS list, that I missed before because there was too much noise.
-
16:09
Friday Links
sur James Fee GIS BlogBack in the day I used to always have a Friday link blog post, and I’ve noticed I’ve been doing a lot more reading lately, so it just feels right to bring this back.
- Apple owes everyone an apology and it should start with me, specifically – You can’t own one of the latest Apple MacBook Pros and not hate the keyboard. I’ve been “lucky” enough to experience all three versions of it. The latest is on my new laptop from Spatial Networks and, I have to admit, it feels the best of any of them, but I’m just waiting for the “f” key to stop working like it has on all my other ones. I used to enjoy typing on MacBooks, but not anymore. The thing is, they keep trying to fix what’s broken instead of just going back to something that worked.
- Electric scooters have zipped by docked bikes in popularity – Here in Tempe, AZ we get to see all of them: Bird, Uber/JUMP, Lime, Razor, and various ones whose brand I can’t even tell. They’re like lice on every corner, fallen over and broken. I noted in St. Pete that they didn’t have any scooters, and it was surreal walking around on the sidewalk without jumping out of the way of some idiot on a scooter. I don’t understand the business model, but I hate to say they are here to stay.
- A look at IBM S/360 core memory: In the 1960s, 128 kilobytes weighed 610 pounds – I mean, the title says it all. These things were HUGE! Bits addressed by X and Y wires threaded through tiny magnetic cores. It’s madness, but apparently it worked!
- Notre Dame Cathedral will never be the same, but it can be rebuilt – Thanks to all the pictures and lidar imagery, the cathedral will be rebuilt and be very close to the original. But…
While architects have enough detailed information about the cathedral to pull off a technically very precise reconstruction, the craftsmanship is unlikely to be the same. Today, the stone that makes up the cathedral would be cut using machinery, not by hand by small armies of stonemasons as in the 12th century. “Nineteenth-century and 20th-century Gothic buildings always look a little dead, because the stone doesn’t bear the same marks of the mason’s hand,” Murray told Ars Technica.
Still I look forward to watching this happen.
-
22:30
Baseball is Back
sur James Fee GIS BlogNothing makes me happier than when baseball is back. Nothing feels like spring more than baseball. I’ve been reflecting on the San Francisco Giants’ chances this year, and there will be no even-year magic. I’ve gotten to the point of acceptance with this team. The infield is basically the same as it was for the last World Series title, but their WAR isn’t close to what it was back then. Any dynasty (except maybe those in the NFL) eventually has to pay for decisions made to get those titles. And the Giants are knee-deep in this problem. Back in 2016, they sacrificed young players to keep a good core together for one more run. Well, that team collapsed, and here we are in 2019.
So with that in place, I can enjoy this team, whatever they do. I’m no longer worried about each series and whether it might result in hosting postseason games. In 2 years this team will look nothing like it does now, and that’s OK. Baseball is back!
-
16:30
Latest Podcast Episode: PostGIS
sur James Fee GIS BlogBill Dollins and I have our latest podcast out today. It is a day late because of some work-related stuff, but it’s the best news this week for me. This one was a lot of fun: we dive into PostGIS with some suggestions on how to get started and the tools we use to get the best value out of everything we do in PostGIS. Please enjoy, and rate us on iTunes or Google Play if you have time.
-
1:10
The Scooter Infestation
sur James Fee GIS BlogAs Spatial Networks is in St. Petersburg, I’m spending my first week at the company here for orientation and meeting the team. If you’ve never been to St. Pete, it’s a great small city that isn’t that complicated. The one thing I’ve noticed walking around the city is the lack of electric scooters. There are no Bird, Lime, Razor, Uber, or whatever-latest-company scooters thrown around. In Tempe, there are so many brands that I can’t even keep track of them anymore. But not here in St. Pete. It makes the sidewalks less crowded, you don’t see scooters knocked over on every street corner, and honestly, seeing everyone actually walking around feels relaxing. You just don’t notice it until you don’t see them. I miss those days…