At the National Geospatial-Intelligence Agency, we deliver a broad spectrum of services to a wide variety of stakeholders. We analyze geospatial data, and provide associated intelligence for policymakers and military commanders. We also help solve economic and energy security problems, and play an important role in disaster relief and responding to humanitarian issues around the world.
In an era when big data provided by a multitude of traditional sources, social media, and the Internet of Things is ever-expanding, continuing to meet these obligations meant transforming the way NGA does business. We had to find new ways to grow our workforce and to stay abreast of technological developments that would streamline our operations and provide the best possible consumer experience.
Providing exemplary service means NGA has to use a lot of GIS software: in fact, NGA is probably the largest user of GIS-related software. To offer the best possible services, we have to strike the most effective balance of open source and commercial software, while monitoring cutting-edge geospatial technology and data sources emerging from Silicon Valley.
Finding this balance was more difficult than it sounds. As recently as a decade ago, almost all the software NGA used was proprietary, and our primary sources of data were classified—even though there was a massive amount of unclassified, commercial, and open data to use.
Even though we were already using some of the best commercial software available—built by some of the biggest and most experienced companies—it quickly became apparent that the world was moving at a much faster pace: one measured in GitHub commits by the thousands, even millions, each week.
The practice of waiting years for new versions of software—and then waiting additional time for security approvals—was leaving NGA behind. This was despite the fact that, over the last decade, open source software was yielding more technological advancements than ever before.