Unifying Modern Technology With Data and Computing

The Defense Information Systems Agency’s new Hosting and Compute Center is pushing to advance innovation, provide modern-day “products” to warfighters and, ultimately, deliver a global fabric of computing power and data storage. With a 2,000-person workforce, the Hosting and Compute Center has a heavy agenda, including managing the U.S. Defense Department’s large-scale commercial cloud endeavor, the Joint Warfighting Cloud Capability; overseeing the department’s data centers; advancing agile software development; and providing the hosting and compute for Joint All-Domain Command and Control, among other efforts.

“The mission of the HaCC is to provide the warfighter with best-value hosting and compute solutions to achieve mission success,” explains Sharon Woods, director of the Hosting and Compute Center, or the HaCC. “Data is foundational to the mission of national defense. Doing interesting things with data, whether it is at the edge or whether it is here in the continental United States, you have to have the underlying foundation of the basic hosting and compute for that to work. And so, the mission of the HaCC is to create the global fabric that is multilayered, that is resilient, that works in completely disconnected environments, as well as environments that have persistent, predictable communications. Our job is to put that in place so that our mission partners in the department can do really important, interesting things with this data that we have.”

Created last October, the HaCC is a conglomeration of the Defense Information Systems Agency’s (DISA’s) former Services Enterprise (SE) Directorate Front Office, the SE Cloud Services Division and the SE Ecosystem. In addition, the Defense Department transferred its Cloud Computing Program Office to DISA, placing that organization within the center. Evolved from those divisions, the HaCC now has three offices: the Office of Operations Support, the Office of Product Management and the Office of Compute Operations.

“We’ve done a lot in a really short period of time,” the director states. “One of the things culturally that we are trying to embrace with HaCC is that push for agile methodology with cross-organizational, cross-functional teams. And even though we’re a 2,000-person workforce, a lot of that mentality really is making it into the organization at all levels.”

In providing hosting, compute and data storage resources, the director notes that the HaCC is coming in with the mindset of a contested communications environment. “The approach that the HaCC is taking is that we really need to start with our most austere conditions, the tactical edge, the U.S. combatant commands, like INDOPACOM [the U.S. Indo-Pacific Command], and really think through how the hosting and compute technology works there, and then build back to the more predictable stable environments,” Woods says. “Because if we start with the very predictable, stable environments and we don’t have the warfighter requirements front and center, we may never get there. And with the current near-peer adversarial threats, there is no time to realize you’re not going to get there.”

The HaCC is responsible for DISA’s 11 core data centers, five of which are in the continental United States, six of which are outside of the continental United States (OCONUS) and all of which are increasing in importance, Woods notes. The center is working to meet increasing demand, especially in the Indo-Pacific region.

“The demand for OCONUS hosting and compute increases by the hour,” Woods offers. “There is a lot of demand from both the military services and the U.S. combatant command for more readily accessible, high-quality hosting and compute. The HaCC is tackling that in a few ways, and the data centers are key to that. We have a data center in Yokota, Japan, and we’re looking at modernizing it.

“That’s a new data center for us, and we need to bring it up to the same industry standards as our other data centers. That is going to be a key linchpin because of its proximity to some of our near-peer adversaries. The other piece of it is layering on-premise cloud into all of our OCONUS data centers, whether in Hawaii, Germany or Yokota. On-premise cloud is critical because of the unique mission of the Department of Defense. That differentiates us a little bit from industry because our mission is global, and there are data sovereignty rules requiring that U.S. data be hosted on U.S. soil. And the on-premise cloud capability is closer to the point of collection so that we’re not a single point of failure, only relying on the traditional transport networks. We can still operate the mission even if all the cables get cut.

“The need is increasing, and frankly, we’re just trying to move as quickly as we can to match the demand signal.”

Naturally, the decision to end milCloud 2.0, the agency’s second-generation suite of on-premise cloud-based services, which evolved from its first effort begun in 2013, will affect existing on-premise cloud users. But Woods assures users that the HaCC team will meet the growing demand while helping them transition to other platforms, such as commercial cloud or the HaCC’s Stratus on-premise cloud. “We looked at the requirement for on-premise cloud and whether our current offerings were best value to meet that requirement, and ultimately, we did make a choice to allow the milCloud 2.0 contracts to sunset in June,” she clarifies.

Meanwhile, the HaCC is executing the department’s largest commercial cloud venture to date, the Joint Warfighting Cloud Capability (JWCC). “The JWCC program office is in the HaCC, and we are responsible for all of it,” Woods notes. The JWCC cloud service provider contracts, which are expected to be awarded this month, will supply warfighters with globally accessible cloud services that include centralized management and distributed control, ease of use, commercial parity, elastic infrastructure, advanced data analytics, fortified security, integrated cross-domain solutions and tactical edge device connection. The JWCC cloud platforms will also have to support Joint All-Domain Command and Control (JADC2), artificial intelligence (AI) and data acceleration.

The HaCC anticipates awarding at least two indefinite-delivery, indefinite-quantity (IDIQ) contracts in the JWCC effort. At the time of the November down-select to Amazon Web Services Inc. (AWS), Google LLC, Microsoft Corporation and Oracle Corporation, only AWS and Microsoft “appeared to be capable of meeting all of the Defense Department’s requirements”—such as providing cloud services at all national security classification levels. However, the HaCC could issue contract awards to the other two cloud service providers if they demonstrate that capability, according to the solicitation. Each IDIQ contract will have specific task orders and a base performance period of 36 months with two 12-month options.

“JWCC is a massive step forward for the department,” Woods stresses. “Strategically, having commercial cloud with multiple vendors on a global scale at all classification levels, where it can work in both connected and disconnected environments out to the tactical edge, is so critical to have in place as a key component of the global fabric. Whether it is JADC2, whether it is just a collection of data in theater and wanting to process that at the point of collection, whether it is business systems that we use here, all of it means commercial cloud needs to be available so that mission partners can make choices about whether commercial cloud with any particular vendor makes the most sense for their mission.”

The JWCC cloud providers are required to supply at least three data centers at each classification level, located at least 150 miles apart. In addition, the companies must have a presence on all continents except Antarctica; provide at least 40 gigabits per second of peering with the government at each company’s global network locations; and, if the military adds new cloud locations, ramp up associated service within 12 months of notification.

The director now expects a decision to be made about the JWCC contract awards by December 1. 

Mainframe computers also remain an important and economical solution for the HaCC, especially given that type of system’s “powerful ability to store and process massive amounts of information,” Woods shares. And because mainframes are able to process a lot of information quickly, the HaCC can apply solutions such as blockchain capabilities or special encryption applications. “You can have a hybrid cloud environment that consists of a mainframe and commercial cloud, and that is a very modern use case,” she states. “We are taking advantage of all those options so that we can meet requirements in the best value way.”

In addition, the HaCC director identifies the Container-as-a-Service (CaaS) effort as one of the center’s early successes. The CaaS tool is a hybrid integration of several different technologies and is considered one of the HaCC’s first products.

The CaaS platform packages application code so software runs reliably in any environment and lets users lift and shift applications from one environment to another, ultimately driving more capability, automation and self-service. Since November, the HaCC has worked to create a minimum viable product of this web server container hosting platform and is exploring two CaaS containerization use case prototypes to further validate the concept, Woods offers. She expects the CaaS platform to increase software portability and provide faster access to container platforms. In addition, the HaCC will provide direct support to CaaS users so they can focus primarily on application development. And because containers are smaller—megabytes instead of gigabytes—and share the host operating system kernel, they can start in a few seconds versus the several minutes needed to boot a virtual machine, according to the HaCC.
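The portability described above comes from the fact that a container is fully described by a small, declarative specification that any conformant platform can run. As a rough illustration only (the image name, sizes and field values below are hypothetical, not HaCC specifics), a Kubernetes-style deployment manifest for a containerized web server might be assembled like this:

```python
import json

def container_spec(name, image, memory_mb=128):
    """Build a minimal Kubernetes-style Deployment manifest.

    Because a container packages only the application and its
    dependencies (megabytes, not gigabytes), the same manifest can be
    lifted and shifted between hosting environments unchanged.
    """
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": 2,
            "template": {
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,  # hypothetical image reference
                        "resources": {
                            # Small footprint: limits expressed in mebibytes
                            "limits": {"memory": f"{memory_mb}Mi"},
                        },
                    }]
                }
            },
        },
    }

# The same manifest deploys to any conformant container platform,
# on premise or in commercial cloud.
spec = container_spec("web-server", "registry.example.mil/web:1.0")
print(json.dumps(spec, indent=2))
```

The point of the sketch is that the hosting platform, not the application team, supplies the kernel and infrastructure underneath this specification, which is what makes lift and shift between environments practical.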

“One of the things that is so exciting about that CaaS project is that it exemplifies the unification of modern technology and the data center hosting and compute,” Woods offers. “Modern data centers are critical to be able to execute global operations so that you have jumping-off points across the world. The OpenShift containers are a hypermodern technology, and we are now running them in the data center. It really shows the nexus of cloud technology and data center technology and how those are not in competition with one another. We can offer hosting and compute, and that’s an example of how.”

The HaCC has already tied the developed container platform into its existing application-centric infrastructure and, by the second quarter of fiscal year 2022, will complete end-to-end testing and deploy it to compute operations and DISA’s web server and mission partner server applications, Woods states.

“We went from ideation, just thinking of the idea, to delivering the minimum viable product in less than six months,” she says. “That’s lightspeed for the Department of Defense when it comes to technology. It’s an example of the HaCC and where we are coming together with different cultures. We’re really integrating our identity, and we are already moving out on modern technology and prototypes using agile methodology.”

Infrastructure as code (IaC), another venture at the center, is aimed at providing pre-configured, pre-authorized code baselines, the director says. The HaCC has been conducting IaC pilot prototypes to expand the user base to more mission partners. “Now that we have this broader HaCC center and IaC falls under the HaCC umbrella, we can really commoditize that and start offering it at scale rather than anecdotally,” she offers.

The center announced on March 31 that DISA’s Risk Management Executive had approved a three-year authorization for AWS to operate its DoD Cloud Infrastructure as Code (IaC). “The IaC for AWS baseline is a collection of templates to build out underlying cloud environments leveraging the latest managed services from AWS, including Elastic Kubernetes Service, SageMaker, Aurora and others,” the HaCC announced. “These templates help mission owners rapidly adopt cloud and focus on what really matters, their applications and data, not the management and continued maintenance of underlying infrastructure. The DoD Cloud IaC baselines are a cost-effective way to adopt cloud capabilities without a significant upfront investment of time and other resources.”
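The “templates” in an IaC baseline are declarative descriptions of infrastructure that can be deployed repeatedly and identically. As a minimal sketch only (the resource names and property values below are illustrative assumptions, not the contents of the actual DoD baselines), a CloudFormation-style template might be assembled programmatically like this:

```python
import json

def iac_baseline(env_name):
    """Assemble a minimal CloudFormation-style template.

    An IaC baseline bundles pre-configured, pre-authorized
    infrastructure so mission owners start from a vetted environment
    rather than hand-building one. Resource properties here are
    illustrative only.
    """
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": f"Baseline environment for {env_name}",
        "Resources": {
            # A private network for the mission application
            "AppVpc": {
                "Type": "AWS::EC2::VPC",
                "Properties": {"CidrBlock": "10.0.0.0/16"},
            },
            # Centralized logging with a fixed retention policy
            "AppLogs": {
                "Type": "AWS::Logs::LogGroup",
                "Properties": {"RetentionInDays": 90},
            },
        },
    }

template = iac_baseline("mission-app")
print(json.dumps(template, indent=2))
```

Because the whole environment is expressed as data, security settings can be reviewed and authorized once and then stamped out at scale, which is what enables “commoditizing” IaC rather than offering it anecdotally.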

Furthermore, the CaaS and IaC platforms exemplify how the HaCC wants to position itself as a provider of products akin to the commercial world, driving technology advances as it supplies more modern compute and storage solutions for warfighters, Woods emphasizes. “[It] really starts looking at the full life cycle of innovation to commoditization and deployment at scale,” she says.

Lastly, the director adds that demand for the center’s products is not limited to Earth-based capabilities, particularly since the establishment of U.S. Space Command in August 2019. “The tactical edge is not just terrestrial,” she acknowledges. “It’s being able to unify and deliver hosting and compute that is on premise, commercial cloud and all the different variants in between. The intent here is to lay down a global fabric that takes advantage of all the different ways that we can achieve that.”