A few days ago, I published an example of drawing a finite-element grid in Tableau. I figured it would be just another post with very little readership, destined to be buried in my archives until my blog goes away. For some funny reason, however, this post created some dialog and ignited a spark of creativity in my brain that I need to put onto paper. Hence, this blog post was born, and now I have to say what I need to say before my brain moves on to something else.
One of my favorite phrases, which I heard many years ago as a geology student, is "ontogeny recapitulates phylogeny" (ORP). That phrase just sounds cool. As a young man, I remember using it at strategic times just to make it look like I knew something, when in reality I was clueless. Back then, I believed it meant that children are likely to be just like their parents.
The theory is much deeper than that and involves embryonic development stages and blah, blah, blah. In biology, the theory has largely been discredited. However, recapitulation theories still exist in other fields, like language and cognitive development. These theories suggest that as new things develop, they carry with them signatures or aspects of their past. Software is a lot like that: products get built to replace other products, and the cycle goes on over time.
As I think about the future of software products, I can see, looking backward, that new products sometimes emerge as enhanced versions of existing products. New products replace old products, but in many ways they are built upon the ideas and concepts of the older products they replace.
When Samsung releases a new smartphone, they call it the "next big thing". Their new smartphone is built upon the lessons learned from their previous phones. In software, updates are released to enhance capability and user experience, but every once in a while an entirely new product emerges that changes the playing field entirely. For me, that product is Tableau, and here are my thoughts on what this product could become in the future.
Reflections of the Past
Looking back through the years, it is amazing to me what we can now do with software and computers. The first finite-element grids I drew were done with pencil and graph paper. I remember designing a grid while traveling through airports in the mid-1980s and having to tape together sheets of paper to complete the job.
I recently found a hand-drawn 2-D grid I used to solve one of my first groundwater flow problems at a nuclear site. That problem had hundreds of nodes and elements. When I finished the grid, I had to manually type in the x and y coordinates and then the element and nodal connections. ASCII editors were used for this work back in the days of MS-DOS, before Windows burst onto the scene. We got the job done, but it wasn't fast and it wasn't much fun to work that way.
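To make concrete what "typing in the grid" meant, here is a minimal sketch, in Python for readability, of the kind of data a hand-built 2-D mesh contains: node coordinates plus element connectivity. The file name and the fixed-column layout are illustrative assumptions, not the actual format of any particular code I used.

```python
# Node table: node id -> (x, y) coordinates. Every one of these numbers
# was read off graph paper and typed by hand in an ASCII editor.
nodes = {
    1: (0.0, 0.0),
    2: (1.0, 0.0),
    3: (1.0, 1.0),
    4: (0.0, 1.0),
    5: (0.5, 0.5),
}

# Element table: element id -> tuple of node ids (triangles, counterclockwise).
elements = {
    1: (1, 2, 5),
    2: (2, 3, 5),
    3: (3, 4, 5),
    4: (4, 1, 5),
}

# Write the mesh out in a fixed-column ASCII layout, the style of input
# file that MS-DOS-era model codes typically expected.
with open("mesh.dat", "w") as f:
    f.write(f"{len(nodes)} {len(elements)}\n")
    for nid, (x, y) in sorted(nodes.items()):
        f.write(f"{nid:6d} {x:12.4f} {y:12.4f}\n")
    for eid, conn in sorted(elements.items()):
        f.write(f"{eid:6d} " + " ".join(f"{n:6d}" for n in conn) + "\n")
```

For a real problem with hundreds of nodes, multiply this hand entry by a few hundred lines, with a transcription error lurking in any one of them.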
Fast-forward 20 years, and I was building 3-D finite-element and finite-difference grids with many millions of nodes and elements (or cells), aided by some great software tools. During those 20 years, I helped develop boundary element methods for the automatic generation of finite-element grids. I developed my own codes for creating site-specific grids with irregular domains, like the example shown in the finite-element blog post.
I developed grids and numerical models for everything from helping to design and build locks and dams on the Mississippi River to simulating groundwater flow around the subsurface linear accelerator (linac) of the Spallation Neutron Source in Oak Ridge, TN (Figures 1 – 3). This work would never have been possible without the development of the grid generators. Great software was built to replace a previously manual process; it advanced the science and allowed for the development of creative numerical model simulations.
Figure 1 – The Spallation Neutron Source in Oak Ridge, TN.
Detailed finite-element and finite-difference grids were only part of the story. Once you built the model and simulated the groundwater system, you had to have a way to visualize the results. Since software tools didn’t exist for the numerical model codes I was using, I had to write my own and this took a lot of effort.
I developed contouring algorithms for the rapid processing of the model results. I built general graphical post-processors that could be used with multiple computational models to display contours, flow vectors, and time series of simulated variables. At the time, these codes were built with compilers, linkers, and external math and graphics libraries.
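For readers who have never written one, here is a hedged sketch of one standard contouring technique (marching squares) over a regular grid. My actual algorithms differed, and this simplified version ignores ambiguous saddle cells, but it shows the kind of geometry a post-processor has to compute for every contour level on every output frame.

```python
def contour_segments(grid, level):
    """Return line segments where `level` crosses a 2-D grid of values.

    grid[j][i] is the value at integer coordinates (i, j); linear
    interpolation locates the crossing point on each cell edge.
    """
    def crossing(p0, v0, p1, v1):
        # Interpolate the point on edge p0-p1 where the value equals `level`.
        t = (level - v0) / (v1 - v0)
        return (p0[0] + t * (p1[0] - p0[0]), p0[1] + t * (p1[1] - p0[1]))

    segments = []
    for j in range(len(grid) - 1):
        for i in range(len(grid[0]) - 1):
            # The four corners of this cell and their values, in order.
            corners = [(i, j), (i + 1, j), (i + 1, j + 1), (i, j + 1)]
            values = [grid[j][i], grid[j][i + 1],
                      grid[j + 1][i + 1], grid[j + 1][i]]
            # Find the edges where the contour level crosses between corners.
            points = []
            for k in range(4):
                v0, v1 = values[k], values[(k + 1) % 4]
                if (v0 < level) != (v1 < level):
                    points.append(crossing(corners[k], v0,
                                           corners[(k + 1) % 4], v1))
            # Simple cells yield exactly one segment; saddle cells (4 points)
            # need a tie-breaking rule that this sketch omits.
            if len(points) == 2:
                segments.append((points[0], points[1]))
    return segments

# Example: trace the 0.5 contour through a tiny 3x3 grid of simulated heads.
heads = [[0.0, 0.2, 0.4],
         [0.3, 0.6, 0.7],
         [0.5, 0.8, 1.0]]
print(contour_segments(heads, 0.5))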
Since what I was doing was at the forefront of development, I had to work closely with the software teams that were building the graphical libraries I was using. We had to do optimization work to get things running fast enough because the computers were limited in memory and computational speed. These tools had to be created because no general software tools were able to do what we wanted to do.
We had to have vision and creativity to solve problems by writing custom codes for the models we used. We tried to generalize them to work with many different models, but we quickly found that this was hard to do because each model had its own output data types and formats, and these were changing all the time.
Therefore, to maintain a general software tool like this, you were in a constant state of development, which would slowly but surely grind you to a halt. We realized there had to be a better way than writing a tool that had to maintain a connection to every program that created the output for analysis.
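The generalization we were attempting is essentially the adapter idea: hide each model's output format behind one common reader interface, so the graphics code only ever sees a single neutral in-memory form. Here is a minimal Python sketch of that pattern; the class names, file format, and dictionary layout are hypothetical, not the design we actually used.

```python
from abc import ABC, abstractmethod

class ResultsReader(ABC):
    """The one interface the post-processor programs against."""

    @abstractmethod
    def read(self, path: str) -> dict:
        """Return a neutral form: {'nodes': [(x, y), ...], 'values': [v, ...]}."""

class AsciiTableReader(ResultsReader):
    """Reads a whitespace-delimited 'x y value' text file (hypothetical format)."""

    def read(self, path: str) -> dict:
        nodes, values = [], []
        with open(path) as f:
            for line in f:
                x, y, v = map(float, line.split())
                nodes.append((x, y))
                values.append(v)
        return {"nodes": nodes, "values": values}

# Supporting a new model means adding one reader class here; the plotting
# code never changes because it only consumes the neutral dictionary.
READERS = {".txt": AsciiTableReader}
```

The weakness we kept running into is exactly what this sketch hides: every time a model's output format changed, its reader had to change with it, and with dozens of models that maintenance burden never ended.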
As the years ticked by, a few products for the graphical analysis of model results began to grow their user bases. Tecplot, a graphical processing engine born out of the need to visualize computational fluid dynamics results, began to be used for groundwater and surface-water models (Figure 4).
Visual-Modflow, Groundwater Vistas, GMS, and other programs were developed over many years to give us the ability to visualize our scientific data. Certain groundwater models gained traction, and their accompanying graphical post-processors grew in complexity. For example, Modflow-Surfact emerged as a very capable simulation tool; it was initially compatible with Tecplot and Groundwater Vistas for graphical processing of model results, and several years later it became compatible with other graphical post-processing engines, or Graphical User Interfaces (GUIs), as shown in Figure 5.
The primary problem with this approach is that all these GUIs have their own workflow paradigms and require insider knowledge to work well. There are no standards for visual best practices, nor for how the graphics are produced. Significant learning curves exist for these tools, and there isn't much consistency in how things are done. Since these GUIs handle 3D and 4D (time) data, you have to work with them frequently to stay in a good workflow, because the way they handle the third and fourth dimensions varies so much. In other words, these tools are not intuitively obvious to use.
I recently went back to the GMS GUI, a tool for which I was a technical reviewer years ago and which I once used extensively. When I tried to use it again, I was lost; it took me hours to do anything useful. This tells me that the design of the software is flawed: too much insider knowledge is needed, and not enough thought went into making the tool easy to use. Eventually this product will have to change, or it will be supplanted by another, easier-to-use tool.
With all that said, when I first saw Tableau back in 2008, I was stunned by its intuitive design. After years of grinding through the process of writing custom codes and building time-consuming Excel graphics, I was entranced by Tableau. I immediately knew which path my software future would take. Hence, I have used Tableau nearly every workday since that time, and many days on the weekends.
Thoughts of a Tableau Future
I wish the Tableau software company would realize that their framework, with some minor modifications and an expanded set of graphical offerings, could explode onto the scientific visualization scene. I can clearly see the day when "Tableau Scientific" is released (Figure 6). Such a product would instantly gain traction because of its intuitive design and its flexibility in connecting to data sources. If this tool were created, Tableau wouldn't be known only as a business tool; it would be known as a problem-solving tool with a much larger user base than it currently has, and it would become the de facto tool of choice for producing graphical results.
Consider MS Excel: it was initially created to do financial calculations and was thought of primarily as a business tool. By the mid-1980s, Microsoft saw the dominance that Lotus 1-2-3 and Quattro had in the spreadsheet marketplace, and they decided they could build a better product. I remember the day I decided to switch from Quattro to MS Excel because it was bursting onto the scene with better graphics and overall capabilities. A friend in graduate school was using Excel on a Mac, and I was really impressed with what he could do. When the Windows version came out in 1987 or so, I resisted the change at first, but then I realized it was time to switch. Looking back, it was the right decision. Now, some 25 years later, Excel is a standard tool used for everything, in every business and every discipline in a company.
When I first saw Tableau in 2008, I immediately recognized its abilities and started using it 100% for the production of graphics and 80% for computations. Now it is nearly 100% for both, because it is painful for me to do graphical operations in Excel. Sure, I still use Excel for a few things, but Tableau has supplanted it as my primary analysis platform. Tableau could become just as ubiquitous as Excel if Tableau management expanded their vision of its future. The details of how this could happen are stored in my brain, and one day I'll have the energy and determination to write that post. For now, thanks for reading.