
Listen Here
I recently met David Hansen, the CEO of GEO Jobe, while coordinating for this podcast episode. I didn't know much about him or the company at the time, but I knew they had a long-standing influence in the GIS field that I suspected would translate into an interesting conversation. I was excited that he was willing to be a guest, and after speaking with David a few times I am thrilled to be able to put a spotlight on him, his leadership practices, and the company as a whole. I learned so much during my conversations with David, and I truly hope that his messaging resonates with our listeners. He is inspirational and sets a standard for how a good company culture can and should be established. David has overcome many challenges during his life; however, those experiences have left him with an insatiable desire to give back to others. He especially loves providing opportunities to support and develop people from his local coastal communities of Gulfport and Biloxi, as well as the rest of the state of Mississippi. David is proud to have been born and raised in Mississippi, and he was excited to discuss the cultural aspects of the region. David truly taps into the geospatial richness of his area by bringing out the best in others.
GEO Jobe is a Platinum Esri Partner and has been an instrumental player in the GIS community since being founded in 1999. They have a global footprint, with clients ranging from Fortune 500 companies to local governments, utilities, and municipalities. As a consulting firm, GEO Jobe focuses on a wide range of geospatial solutions for their clients, including custom software development, ArcGIS Online and Enterprise extensions, and their ABCs of GIS (admin tools, backup and cleaning processes). They offer just about anything a customer could need in the geospatial space, and they happily partner with other compatible companies to provide the best possible services for the geospatial community. David says that he and his team at GEO Jobe try to offer a caring, customer-centric approach to the business.
As I sit down to write this blog, having recorded the episode only a few hours earlier, I find myself feeling inspired and motivated to help others more and to do challenging things. David emphasized the importance of believing in others and believing in ourselves. Circumventing the negative messaging that so often plagues our society, turning the "can't" into "can," and simply digging in to learn something new can often lead us down incredible new paths and open doors we never could have imagined. I think the messaging that stuck with me the most from my time with David centered around people: being a good person, taking care of those around you, and doing good things for others. It is amazing how much good one person, driven by an innate need to make the world around him a better place, can do for his community.
Hello, I am using the notebook in ArcGIS Pro 2.6 and have updated the Python API to 1.8.2. The first two steps work fine and create a Data Store (Folder) in my Portal and Server, but the third step always comes back with ERROR 999999, with the full error below:
Exception                                 Traceback (most recent call last)
In [3], line 2:
    desc = ds.describe(item=ds_item.id, server_id=server_id, path='/', store_type='datastore').result()
File C:\arcpy\arcgispro-py3-clone26\lib\site-packages\arcgis\gis\_impl\_jb.py, in result, line 202:
    res = self._future.result()
File C:\arcpy\arcgispro-py3-clone26\lib\concurrent\futures\_base.py, in result, line 432:
    return self.__get_result()
File C:\arcpy\arcgispro-py3-clone26\lib\concurrent\futures\_base.py, in __get_result, line 384:
    raise self._exception
File C:\arcpy\arcgispro-py3-clone26\lib\concurrent\futures\thread.py, in run, line 56:
    result = self.fn(*self.args, **self.kwargs)
File …
Hi Karl,
Please ensure the data store path is the folder where the eslpk/i3srest content resides. Do not provide the path to the eslpk itself in the data store. Please feel free to reach out to me via the "connect" option.
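For reference, a rough sketch of what that registration might look like is below. The method and arguments are my assumption of how the folder data store was created in the earlier steps, and the name and paths are placeholders only:
# Minimal sketch, assuming `ds` is the DatastoreManager and `server_id` the
# federated server id from the earlier steps; names and paths are placeholders.
ds_item = ds.add_folder(
    name="scene_cache_folder",                    # placeholder data store name
    server_id=server_id,
    path=r"\\fileserver\data\scene_caches"        # folder that CONTAINS the eslpk/i3srest content
)
# Not this: path=r"\\fileserver\data\scene_caches\qingxie.eslpk"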
Best wishes,
Garima
Hello, in Step 3, when
ds_name = 'qingxie.eslpk'
for child in desc['result']['children']:
    if child['name'].lower().find(ds_name.lower()) > -1:
        break
it returns an error:
KeyError                                  Traceback (most recent call last)
in
      1 ds_name = 'qingxie.eslpk'
----> 2 for child in desc['result']['children']:
      3     if child['name'].lower().find(ds_name.lower()) > -1:
      4         break
KeyError: 'result'
my desc['definition'] is
{'operation': 'describe',
 'datastoreId': 'f288a585b3224e6091c8707e31c5b764',
 'serverId': 'EOI1WUNp6w8OfMu7',
 'path': '/',
 'type': 'datastore'}
Can you help me?
Hello Kang,
You may choose to skip step 3 if you know the path to your scene content folder inside the data store.
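If you would rather keep step 3, a rough sketch of guarding against the missing 'result' key is below. It reuses the describe() call and variable names from the earlier steps; the fallback folder path is only a placeholder for whatever folder inside your data store actually holds the scene cache:
# Sketch only: guard against an incomplete describe() listing before iterating.
ds_name = 'qingxie.eslpk'
desc = ds.describe(item=ds_item.id, server_id=server_id, path='/',
                   store_type='datastore').result()
children = desc.get('result', {}).get('children', [])
match = next((c for c in children
              if ds_name.lower() in c['name'].lower()), None)
if match is None:
    # describe() returned no listing; fall back to the folder you already know
    scene_path = '/qingxie'   # placeholder: folder inside the data store holding the scene cache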
Best Wishes,
Garima
Starting with this release, for a scene service, the i3s content can be stored in a tile cache store (in a primary/standby mode or a cluster containing multiple machines), in a folder store, or in a cloud store. What should I choose when I publish: the tile cache store, the folder store, or the cloud store? What are the advantages and disadvantages of each? Also, for the tile cache store, the documentation below says: https://enterprise.arcgis.com/en/portal/latest/administer/windows/whats-new-data-store.htm However, accessing scene data in this mode can be slow when you use fewer than five machines…
Sai, the ArcGIS Data Store based scene tile cache store is used for: 1. hosted scene layers with associated feature services, and 2. hosted scene layers published from a scene layer package. It supports the following options:
• Single machine or primary/standby mode with 2 machines for maximum throughput: recommended if you expect your scene cache content can be accommodated on a single machine
• Cluster mode (an odd number of machines): recommended if you anticipate your scene cache content will grow; the additional machines in the cluster help re-balance your scene caches to scale out across multiple machines
Please see the following link for detailed…
Great new option!
After i3Converter explodes the SLPK into the ESLPK shared folder, publishing takes just a few seconds!
And it gives even more control over what is being published than the default DataStore publishing process.
I successfully published using an administrative account, but I found an issue using a Publisher account: the Python script ends with an exception "Error Code: 400".
The lucky part is that the publishing process seems to complete correctly, and the service and item are up and running!
Is that a Python API issue only? Has anyone else experienced this behavior?
Hello Alessandro,
This seems to be a bug in the ArcGIS API for Python. We are working to resolve it in the next release. In the meantime, the only workaround would be to use administrative credentials to describe the contents of the data store or to use the REST endpoint of the sharing API described at https://developers.arcgis.com/rest/users-groups-and-items/describe-datastore.htm
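For anyone trying the REST workaround, a rough sketch is below. The endpoint path and parameters reflect my reading of the linked documentation; the portal URL, ids, and token are placeholders, and the response may be a job reference that still has to be polled:
import requests

# Sketch only: call the describe data store endpoint of the sharing API directly.
portal_url = "https://myportal.domain.com/portal"     # placeholder portal URL
token = "<admin token>"                               # placeholder, e.g. from generateToken
params = {
    "datastoreId": "<data store item id>",            # id of the registered data store item
    "serverId": "<federated server id>",
    "path": "/",
    "type": "datastore",
    "f": "json",
    "token": token,
}
resp = requests.post(portal_url + "/sharing/rest/portals/self/datastores/describe",
                     data=params)
print(resp.json())   # inspect the response; it may be a job you need to poll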
We are working to provide a user interface for this workflow in the upcoming release of ArcGIS Enterprise.
Best Wishes,
Garima