
Big Data and Analysis in the Maritime Battlespace

By Bob Gourley

Yesterday I joined 200 other DoD and industry big data experts in Charleston, South Carolina, at the AFCEA Big Data in the Maritime Battlespace Symposium as we wrestled with how to manage big data in a maritime battlespace environment. As hard as that is with plenty of bandwidth, storage and computing power, the problem becomes infinitely more difficult in the disconnected, intermittent, low-bandwidth world of ships at sea. This AFCEA Low Country chapter event was aimed at connecting government and industry strategists and technicians so they could tackle some of these issues.

One of the more interesting discussions revolved around the idea that before you can assemble the technologies and procedures to analyze the data, you need to determine exactly who is the consumer and who is the producer. In a battlespace environment, those roles can change very quickly. Lt Rollie Wicks of the Naval Postgraduate School shared his research on the Naval Tactical Cloud. His overview was a thoughtful analysis of the options available and a careful review of the trade-offs between cloud types and their tactical uses.

As Navy sensors evolve and more unmanned vehicles dump valuable data into the battle group, they can be expected to generate 100 terabytes per day within the next few years. The Navy needs big data cloud technologies to provide capability and agility while keeping down life cycle costs. In the next few years, the Office of Naval Research (ONR) will conduct a series of Limited Technology Experiments (LTEs) to explore how a tactical cloud might support (or impair) data processing at sea. Issues to be resolved include securing the information, ensuring the information remains available even when the clouds are not, and determining how well proven big data and cloud solutions work in limited-bandwidth environments. Most of the government speakers expressed some form of concern that the new technologies are ahead of their existing architectures; identifying that gap early is essential to success.

One of the more interesting displays was by Terracotta, which demonstrated its in-memory solution featuring complex event processing on streaming big data. In the next few years, these types of cutting-edge capabilities will be necessary to manage the data deluge described above.
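To make the idea of complex event processing concrete: rather than querying data at rest, a CEP engine evaluates pattern rules over events as they arrive, keeping only a small in-memory window of recent state. The following is a minimal, hypothetical sketch of one such rule (a sliding-window threshold alert); it is an illustration of the general technique only, and does not represent Terracotta's product or any Navy system.

```python
from collections import deque

class ThresholdDetector:
    """Toy CEP rule: fire an alert when more than `limit` contact
    reports arrive within a sliding time window of `window` seconds."""

    def __init__(self, window: float, limit: int):
        self.window = window
        self.limit = limit
        self.times = deque()  # timestamps of recent events, oldest first

    def on_event(self, timestamp: float) -> bool:
        # Evict events that have aged out of the sliding window.
        while self.times and timestamp - self.times[0] > self.window:
            self.times.popleft()
        self.times.append(timestamp)
        # The "complex event" is derived from the pattern across
        # many events, not from any single event in isolation.
        return len(self.times) > self.limit

detector = ThresholdDetector(window=60, limit=3)
alerts = [detector.on_event(t) for t in [0, 10, 20, 30, 95]]
# Only the fourth event (four reports inside 60 s) raises an alert;
# by t=95 the earlier events have aged out of the window.
```

Because the engine holds only the current window in memory, this style of processing suits exactly the low-bandwidth, resource-constrained shipboard environments discussed at the symposium: raw events can be reduced to alerts locally instead of being shipped ashore.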

Another highlight came from Chris Biow, public sector CTO for MarkLogic, who always provides fantastic context on modern analytical tools. Chris added valuable insights into providing analytical tools and decision support methods that work in a bandwidth-constrained environment.

The speaker list included:

  • Tony Orlando, AFCEA President
  • Captain Glover, CO, SSC LANT
  • Marv Langston, Langston Associates
  • Terry Simpson, HQ USMC
  • Dr. Howard, DISA
  • Chuck Gassert, DCGS N
  • LCDR Jeff Kenney, NCWDG
  • Sandra L. Smith, USMC
  • Brian Freeman, MITRE
  • Tom Plunkett, Oracle
  • Kirk Kern, NetApp
  • Shaun Connolly, Hortonworks
  • Basil Decina, NRL
  • John Easton, METOC
  • Michele Weslander Quaid, Google
  • Jim Wakefield, Teradata
  • Fabien Sanglier, Terracotta
  • Rich Campbell, EMC
  • Ted Malone, Microsoft
  • Chris Biow, MarkLogic
  • Mike MacDonald, OPNAV/TENCAP


Read the original blog entry...

More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity and emerging technology firms. He has extensive industry experience in intelligence and security, was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has also been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.