Tuesday, December 9, 2008

Models and Simulations as Data

A colleague, Anthony Watkins, and I recently conducted a feasibility assessment of ETL-V applied to a DoD tactical simulation called JCATS: the Joint Conflict and Tactical Simulation.  JCATS was created by Lawrence Livermore National Laboratory in 1997 and has been supported in some fashion or another ever since.  JCATS is not unlike other military sims I've encountered over the years, such as OTB and TDF.  In fact, JCATS is generally not unlike all the other models and sims I have worked with over the past 11 years - aerospace apps, power/energy sims, custom apps.

I AM NOT A MODELING AND SIMULATION EXPERT.  

I took a class in graduate school, did pretty well.  But I studied Computer Science and Applications.  That's my slant.

When I reflect on my scant 11 years of experience designing, developing, implementing, and integrating business processes and software - a fair amount of it in the M&S domain - I come to one conclusion.

Interoperability stinks.

A bit of personal history on why I think that - three words:  separation of concerns.  As in, we haven't maintained it.  I observed the problem in my first job at the ODU Research Foundation's Center for Coastal Physical Oceanography:  oceanographers were writing code...bad code.  Eventually the computer scientists got involved, but all we did was write occasionally elegant code wrapped tightly around a domain application.

Then, in September of 2001, I found Object Serialization, Reflection, and XML all at the same time (XML, casually, in '99).  My colleagues and I were using these to create 3D visualization, behavior, and human-computer interaction for the Web.  With the introduction of XML-RPC, we were communicating with "back end" data services to drive client-side 3D.  The 3D applications were entirely dynamic, with no compilation, delivered over HTTP to a standard browser the same way all other content was.  It was cooler than dirt.
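(For flavor, here is roughly what that pattern looks like as a minimal sketch in today's Python; the endpoint URL and the get_entity_positions method are hypothetical stand-ins, not anything we actually built.)

# Minimal sketch of the pattern described above: a thin client asks a
# "back end" data service for scene data over XML-RPC and hands it to
# whatever renders the 3D. Endpoint and method name are hypothetical.
import xmlrpc.client

def fetch_scene(server_url="http://localhost:8080/RPC2"):
    proxy = xmlrpc.client.ServerProxy(server_url)
    # The service returns plain lists/dicts, which XML-RPC serializes as XML
    # on the wire; the client needs no compilation step to consume them.
    return proxy.get_entity_positions()

if __name__ == "__main__":
    for entity in fetch_scene():
        print(entity["id"], entity["x"], entity["y"], entity["z"])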

Along came Web Services, circa 2003-4 for us. Finally, everything was starting to click.

That brings me back to models and simulations as data and where I am in 2009.  

Simulations aren't applications, they're data sources.

I've written previously about how we use enterprise architecture models as data sources for analytics.  It's basically the same thing for sim integration, with one additional wrinkle: simulations often stream data in real time, and the transactional model of the Web and relational databases is not always sufficient.  Regardless, the methodology does not depend on transfer rate or protocol; it is based on separation of concerns.  The difference between yesterday and today is that we figured out how to take what we were doing above the level of a single application.
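Here is a minimal sketch, in Python, of the separation I mean. The record layout and names are made up for illustration; the point is that extraction, transformation, and loading stay independent, so swapping the source (a batch file yesterday, a live feed today) never touches the rest of the pipeline.

# Sketch of the separation of concerns behind the methodology: extraction,
# transformation, and loading are independent steps composed into a pipeline.
# Record layout and names are illustrative only.

def extract(source):
    """Yield raw records from any iterable source: a file, a socket reader, a queue."""
    for raw in source:
        yield raw

def transform(records):
    """Normalize raw records into a neutral shape downstream consumers agree on."""
    for rec in records:
        yield {"entity": rec[0], "time": float(rec[1]), "x": float(rec[2]), "y": float(rec[3])}

def load(records, sink):
    """Hand normalized records to whatever consumes them: a database, a chart, a 3D view."""
    for rec in records:
        sink(rec)

# The pipeline does not care whether `source` came off disk or off the wire.
source = [("tank-01", "0.0", "100.0", "250.0"), ("tank-01", "1.0", "101.5", "249.0")]
load(transform(extract(source)), sink=print)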

Most of the applications I have experience with are heavy...very heavy.  Whether we are talking about footprint, documentation, training, number of participants, facilities, or duration, most simulation apps require a lot of it.  Technology insertion and training are costly.  Learning curves are steep.  Exposure and knowledge are limited to very small groups.

It doesn't have to be that way, either.

[Update:  12/10/08 - Well, maybe it does.  I appreciate that M&S is hard and hard takes time.  The point I really want to make is that we can and should reuse the data generated by M&S to increase their ROI and the overall value of their insights as information assets.  We do this by decoupling their data so it can be more easily integrated with other things - things that people who aren't modelers or enterprise architects care about.]

If I could wave a magic wand, I'd strip every simulation down to its algorithms and databases.  To be sure, there are some sweet front-ends out there, but they aren't maintaining separation of concerns.

JCATS, for example, is a powerful tactical simulation.  There are good smarts in there.  But JCATS has a limited (2D) graphical user interface and is strictly designed for tactical operational scenarios on maps.  While the designers of JCATS may have thought about 3D and some statistical diagnostics, they certainly could not have anticipated or accommodated all the many ways we can use the output of JCATS simulations.

The good news is that JCATS saves huge DAT files, i.e., data files, which means JCATS data is portable.  It produces ordinary delimited text files (comma-separated) and puts similar data on the wire in real time (over TCP/IP or UDP, I think).
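A hedged sketch of what pulling that output into a pipeline might look like, in Python. I am assuming the column names and a simple comma-separated layout, so treat FIELDS below as a placeholder rather than the actual JCATS schema.

# Two extractors for JCATS-style output: one for saved DAT files, one for the
# real-time feed (UDP assumed here). Column names and order are assumptions,
# not the official JCATS record format.
import csv
import socket

FIELDS = ["entity_id", "sim_time", "lat", "lon", "status"]  # assumed, not official

def records_from_dat(path):
    """Batch: read a saved comma-separated DAT file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f, fieldnames=FIELDS)

def records_from_udp(host="0.0.0.0", port=9999):
    """Real time: read the same comma-separated records off the wire."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        datagram, _ = sock.recvfrom(4096)
        for line in datagram.decode().splitlines():
            yield dict(zip(FIELDS, line.strip().split(",")))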

From here it's easy:

[Screenshots omitted: interactive views generated from JCATS data]
All of these interactive views are transformations of JCATS data using ETL-V.
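To make "view as a transformation" concrete, here is one illustrative example in Python: folding normalized records into a GeoJSON track per entity, something a map or 3D globe can display directly. It is a sketch of the idea, not the ETL-V implementation, and the field names match the assumed layout above.

# One "view" built purely as a transformation of the data: group positions by
# entity and emit a GeoJSON FeatureCollection of tracks for any map/3D viewer.
import json
from collections import defaultdict

def records_to_geojson(records):
    tracks = defaultdict(list)
    for rec in records:
        tracks[rec["entity_id"]].append([float(rec["lon"]), float(rec["lat"])])
    features = [
        {
            "type": "Feature",
            "properties": {"entity": entity},
            "geometry": {"type": "LineString", "coordinates": coords},
        }
        for entity, coords in tracks.items()
    ]
    return json.dumps({"type": "FeatureCollection", "features": features}, indent=2)

if __name__ == "__main__":
    sample = [
        {"entity_id": "tank-01", "lat": "36.85", "lon": "-76.29"},
        {"entity_id": "tank-01", "lat": "36.86", "lon": "-76.28"},
    ]
    print(records_to_geojson(sample))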



2 comments:

Anonymous said...

Thought-provoking. I "think" that I've been thinking this way all along (the supremacy of the data over the "app-ness") but it's better to have it stated openly and definitively.

Lorem Ipsum said...

Hi Kevin
I found a new method for creating maps for JCATS and I solved the problem with 2D maps, I informed about this, USJFCOM and other simulation centers, but it's amazing, they don't want to take into account this thing, they don't want answer to my emails. I put a few samples and a few words in my blog page, http://ugabriel.blogspot.com/ , please watch this and tell me your oppinion.