All Posts By

Christian Larsen

For technical software: is the PC really the fastest?



There is a paradigm in technical/scientific computing in mining that is so fundamental it is rarely questioned, even though the industry spends large sums of money dealing with its consequences.

This post asks: What if technology is now sufficient to turn this paradigm on its head? What are the consequences? How would such a change affect the design of enterprise solutions built around technical/scientific algorithms?

First, I had better name the paradigm.

“Technical/scientific software works best when the application and its data are on a PC, and is slow or unusable if you run the same application over a network.”

This “truth” has led most mining software providers to write applications for the PC.

This approach has led to a proliferation of proprietary data formats, as each vendor optimises the performance of their application on the PC.

The big problems arise when you want to collaborate within teams and across disciplines, and/or provide governance over the technical data. This has created a fundamental tension in the design of enterprise systems for technical applications. The technical applications tend to want the data on the PC, pushing it to the edge of the network, while the collaboration and governance demands tend to pull the data towards the centre. A number of companies have developed solutions to manage this tension, for example Datamine’s Summit or MineRP’s SpacialDB.

At GlassTerra we have been investigating technologies that allow companies to deal with big geospatial data on the cloud. Our stated stretch target was to handle Petabyte-scale data through a low-bandwidth browser interface. Recently our experiments have led us in a direction that challenges the fundamental truth of the mining paradigm above. We now believe the technology exists to make the following statement potentially true.

“Technical/scientific software works best when the application and its data are centralised. PC software is typically at least 1,000 times slower than this alternative.”

When we started GlassTerra early last year, the first technical challenge we took on was the delivery of geospatial data (meshes, voxels, point clouds etc.) to a browser needing only a low-bandwidth connection. Last year we achieved this milestone. We did it using standard graphics packages that had their origins in PC software. We got it working and it was usable, but the user experience was not as good as if the same software and its data were installed on a PC. Nonetheless, it was good enough to let us explore a wide array of applications around publishing and collaborating on geospatial data, applications that had gone unexplored because data was tethered to the PC.
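One common way to serve big point clouds over a thin pipe is a level-of-detail pyramid: the browser fetches a coarse sample first and refines only where the user zooms. The sketch below is purely illustrative (it is not how GlassTerra's delivery actually works, and the function names are my own), using voxel-grid subsampling to build coarser levels:

```python
import numpy as np

def lod_downsample(points, cell_size):
    """Keep one representative point per occupied voxel cell.

    points: (N, 3) array of XYZ coordinates.
    cell_size: edge length of the voxel grid; larger cells
    give a coarser (and much smaller) level of detail.
    """
    # Snap each point to the integer index of the voxel it falls in.
    keys = np.floor(points / cell_size).astype(np.int64)
    # Keep the first point encountered in each occupied voxel.
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

# Build a pyramid of levels: coarse levels are tiny and fast to send,
# so the browser can show something immediately and refine on demand.
rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 100.0, size=(100_000, 3))
pyramid = [lod_downsample(cloud, cell) for cell in (10.0, 2.0, 0.5)]
```

The design point is that the client never needs the full-resolution data to render a useful picture; bandwidth is spent only on the regions being inspected.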

The technical challenge then in front of us was that the data sets we could handle were similar in size to those a PC can handle. We were being challenged with ever larger datasets, and we wanted to make the performance as good as a PC's. We set ourselves the challenge of working with Petabyte-scale geospatial data with a user experience similar to PC-based applications working with Gigabyte-scale data sets.

We explored a number of different methods to achieve this goal, but finally settled on parallel computing exploiting the scalable computing infrastructure inherent in AWS, Azure or SoftLayer. (You may see this approach called High Performance Computing, or HPC.)

In February 2016, we got our first lab experiments working on mining data, and the results rocked me back on my heels. We took a gold model and ran grade shells on it. The parallel computing techniques were more than 1,000 times faster than standard PC software.
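At heart, a grade shell query is a cutoff filter over the block model, which is exactly the kind of work that splits cleanly across parallel workers. A toy sketch of the idea (assuming a flat array of block grades; this is my own illustration, not GlassTerra's implementation) looks like this:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def shell_chunk(grades, offset, cutoff):
    # NumPy releases the GIL inside the comparison, so threads
    # genuinely run their chunks in parallel on multiple cores.
    return np.flatnonzero(grades >= cutoff) + offset

def grade_shell(grades, cutoff, workers=4):
    """Indices of blocks at or above cutoff, computed chunk-parallel."""
    chunks = np.array_split(grades, workers)
    offsets = np.cumsum([0] + [len(c) for c in chunks[:-1]])
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(shell_chunk, chunks, offsets, [cutoff] * workers)
    return np.concatenate(list(parts))

rng = np.random.default_rng(1)
model = rng.gamma(shape=0.5, scale=2.0, size=1_000_000)  # toy gold grades, g/t
shell = grade_shell(model, cutoff=5.0)
```

Because each chunk is independent, the same pattern scales from a handful of threads on one machine to hundreds of cloud workers, which is where the order-of-magnitude speedups come from.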

I was shocked. This result challenged my more than 30 years of experience in mining software: the paradigm that the network is too slow for technical computing is deeply entrenched. We had to test this at a bigger scale. Fortunately, a large mining company heard about our results and wanted to see if we could get it working at scale. We engaged with the mining company and took on the challenge.

The result of our experiment is that we can now work with Terabyte-scale data sets containing mixed geospatial data (data sets of a size a PC can’t handle) and run simple queries on the data in fractions of a second. We think that most algorithms used by mining software could be rewritten to use these techniques and obtain similar speed improvements. If so, almost any of the applications currently sold to mining companies could work with datasets orders of magnitude larger than they handle today, and with faster response times.


Here are my thoughts on some of the implications.

  • This will be a strong incentive to adopt a standard data format. Geospatial data with all the attributes needed for mining applications can be hosted on cloud platforms, and most scientific functions used by the various mining packages can be made to work faster, potentially over 1,000 times faster. Data structures optimised for a specific use case or task would no longer be a valid excuse.
  • Technical software will no longer be limited by data set size. It would be possible for geological modelling, mine design, mineral processing etc. to work on regional-scale datasets and incorporate big sensor datasets such as lidar and hyperspectral imagery.
  • Because the database is structured for scalable parallel processing, it is also structured for distributed storage. The data can be stored in multiple locations yet have global query capability. Mines with limited bandwidth can therefore store large datasets collected at the mine on local servers, while the corporation can centralise, and even manage, that data where that adds value. Technical professionals could operate on the combined datasets regardless of where they are located.
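The distributed-storage point above is essentially a federated query: a coordinator fans the same query out to each site's local store and merges the answers, so callers see one logical dataset. A minimal sketch (the site names and record layout are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-ins for per-site stores; in reality each would be a server
# at a mine or a regional data centre holding its own blocks.
SITES = {
    "mine_a": [{"id": 1, "grade": 3.2}, {"id": 2, "grade": 0.4}],
    "mine_b": [{"id": 3, "grade": 5.1}],
    "hq":     [{"id": 4, "grade": 1.9}, {"id": 5, "grade": 2.7}],
}

def query_site(site, cutoff):
    # Each site answers only from its local storage...
    return [row for row in SITES[site] if row["grade"] >= cutoff]

def global_query(cutoff):
    # ...while the coordinator queries every site concurrently and
    # merges the partial results into one globally ordered answer.
    with ThreadPoolExecutor() as pool:
        parts = pool.map(query_site, SITES, [cutoff] * len(SITES))
    return sorted((r for part in parts for r in part), key=lambda r: r["id"])

hits = global_query(2.0)
```

Only the (usually small) query results cross the network, not the raw datasets, which is why this pattern suits bandwidth-limited mine sites.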

A demo of the technology described above will be available on our website soon, so please come back later to check it out if you are interested. In the meantime, I would like to hear your thoughts in this area.

Till next time,


Robots Mining Future



For the last few years there has been a quiet revolution going on in agriculture. One that may give clues as to what is about to happen in mining. Robots have arrived and they are changing everything!

Since the mechanisation brought on by the industrial revolution, we have seen bigger fields handled by bigger machines in agriculture. It is all about economies of scale – a continuous march to dilute labour. Sound familiar? In the same period, mines have grown in size and the equipment we use to extract resources has gotten bigger too.

Take a look at some Australian robot technology being developed for agriculture:

Robots are changing the rules of that game. There is no labour to dilute: if the numerator (labour cost) goes to zero, it doesn’t matter how big the denominator (output) becomes – the cost per unit is still zero.

So what do the rules in mining look like in the robot world? Well instead of big, fast and few, it will be small, slow and many (swarming).

Small allows the automated machine to be highly selective and react to small changes, instead of dealing with averages. This maximises the grade quality and minimises wasted time and energy.

Slow means that the robot can be low cost in terms of upfront capex and ongoing maintenance.

Many follows because each unit is far more affordable, and we compensate for the slower speed by working in parallel. We might see a similar output volume as a few large machines, but with much higher grade and lower costs, therefore improving profitability.

This brave new future for mining is coming sooner than we think. In fact, we have already seen precursors to the revolution in the form of drones, automated trains and self-driven trucks on mine sites. The real question is, how will the truly enormous amounts of data created to fuel these systems be managed by mere humans? That’s what we’re working on here at GlassTerra.

Till next time,


Learnings from running an online competition to solve a mining problem



I’ve been asked on many occasions what we learned from running an online competition last year. Today I finally got around to writing this blog post, so the learnings are written down and passed on.

For those who have not come across it, late last year GlassTerra teamed up with Unearthed to run an online competition for Goldfields. The goal of the competition was to find a way of turning photographs of the face in a drive into a grade estimate. For Goldfields, the competition was a chance to solve a small problem which was not economically viable to solve in traditional ways but which would drive an incremental productivity gain if solved. For Unearthed, it was a chance to extend their product offering; and for GlassTerra, it was a chance to show the capabilities of our new technology to the world.

The competition was a success for everyone, but as with all things you do for the first time, we made mistakes and learned a few things along the way. So here they are:

1. When is an online challenge a good mechanism to procure innovation?

When you are trying to find the how but not the what.

  • The answer is an algorithm.
  • The algorithm, should it be found, will deliver a short payback period.
  • The real world data upon which the algorithm is to operate already exists.
  • End users want the result right away.
  • It is easy to determine if the algorithm is giving a good answer.

2. What are the must haves in terms of running the competition?

Say it loud, say it simple, say it often.

  • Partner with industry and get them to shout out on their email lists and use publicity if you can.
  • Make the reward exciting – money, reputation, ability to create a new start-up, mastery, bragging rights.
  • Don’t ask to own the IP as this will turn away many potential contestants.
  • Explain the problem “like I’m five” if you want to get more people involved from outside the industry. No one else really understands our jargon.  
  • Show pictures and visuals. The majority of mining problems have a spatial component, so show it.
  • When you have a winner, deploy the result in some minimal way to the end users straight away. Capitalise on the high.

It is great to see that Unearthed now has another competition underway, this time for BHP Billiton. You will find GlassTerra there, presenting the input data in full 3D. Good luck to everyone who has a go at it.

Till next time,



NSW Launches View of State Geology using GlassTerra


The New South Wales State Government now presents its geology to the world through GlassShowcase.

The purpose of this is to promote the State’s mineral potential in a highly visual and interactive way to potential investors and miners.

The Geological Survey of NSW’s geospatial data is publicly shared using GlassShowcase’s 3D visualisation platform, with various layers which can be explored and downloaded by anyone.

To give this information the broadest possible exposure, casual users can access and understand the data directly in their web browsers, without expensive software or years of training.

This project is part of GlassTerra’s vision to provide everyone in the mining industry with easy access to geospatial data, in any location, at any data size.

Explore the Geological Survey of NSW for yourself at

Till next time,


Announcing 3 new Expert Apps


GlassTerra enters an exciting new year, and we are starting to streamline our services with the release of 3 new Expert Apps.

GlassTerra Expert Apps

The three Expert Apps are:

  • GlassShowcase – Show the world your geological models; Governments, mine asset owners and consultants can share their models with online visitors.
  • GlassData – Secure sharing & collaboration for mining professionals working with geospatial data.
  • GlassPlugin – Add GlassTerra’s 3D geospatial data viewing functionality into your own software or web application.

The first part of last year was about perfecting our basic tech. We wanted to present large amounts of geospatial information in full 3D via the web. Because we knew that many of our potential users are often in locations with limited bandwidth, we wanted the tech to work over 3G connections. That way our technology also works on mobile devices without killing our customers’ data plans. By May 2015, we finally got a combination that worked.

We then spent the rest of the year finding product-market fit, acquiring early adopters and raising funding. In the end, we successfully completed our first few projects, won a government grant and raised some money from investors.

We entered 2016 configuring ourselves to focus on the initial propositions that our customers found valuable. Hence the 3 apps we are launching.

Next on our agenda is extending our scope by judiciously adding new apps. But we are most thrilled to have started work on tech that will allow us to host and visualise Petabyte-scale data sets. More on that another time, when we reveal some results from our labs.

For now, please explore our revised capability and have a look at some of the organisations that are already using GlassTerra.

Till next time,