Episode 76
The Future of Government Technology: FedRAMP, AI and Compliance in Focus with Ross Nodurft
December 6th, 2023
41 mins 53 secs
About this Episode
As technology rapidly evolves, it is essential that we talk about technology policy. What better way to get in the know than to have an expert break it down for us? Meet Ross Nodurft, the Executive Director of the Alliance for Digital Innovation. Ross dives in, explaining the evolution of FedRAMP controls and the recent, sweeping AI Executive Order (EO) from the White House. Listen in to find out what this EO means for the government, the industry and the workforce as the U.S. attempts to put policy in place ahead of AI innovation.
Key Topics
- 04:25 Increasing security controls for cloud migration
- 07:51 Customer feedback and cloud migration
- 12:17 Bringing commercial solutions into the federal government securely
- 15:39 Artificial intelligence shaping policy for future technology
- 16:54 What the AI EO covers: critical infrastructure, generative AI, data and immigration
- 22:34 Guidance on AI impact assessment and testing
- 27:02 Why adoption of AI tools must not be delayed
- 30:03 Ensuring AI technologies have fail-safe mechanisms
- 32:08 Concerns over the rapid pace of technological advances
- 34:29 AI and technology keep advancing; policy aims to keep control
- 39:37 A fascinating book on the history of technology and chips
The Future of Government Technology: Shifting to FedRAMP High and Accelerating Cloud Adoption
Shift from FedRAMP Moderate to High for Sensitive Workloads
When FedRAMP was established over a decade ago, the focus was on managing the accreditation of emerging cloud infrastructure providers to support the initial migration of workloads. The baseline standard was FedRAMP Moderate, which addressed a "good amount" of security controls for less risky systems. However, Ross explains that increasing volumes of more sensitive workloads have moved to the cloud over time, including mission-critical systems and personal data. Consequently, agencies want to step up from Moderate to the more stringent requirements of FedRAMP High to protect higher-risk systems. This includes allowing FedRAMP High cloud services to interact only with other FedRAMP High applications.
The Evolution of Cloud Computing: "So right now, we're at the point where people are existing in thin clients that have access to targeted applications, but the back end compute power is kept somewhere else. It's just a completely different world that we're in architecturally." — Ross Nodurft
The Future of Government Technology: Streamlining FedRAMP for the SaaS-Powered Enterprise
According to Ross, the COVID-19 pandemic massively accelerated enterprise cloud adoption and consumption of SaaS applications. With the abrupt shift to remote work, organizations rapidly deployed commercial solutions to meet new demands. In the federal government, this hastened the transition from an earlier focus on cloud platforms to widespread use of SaaS. Ross argues that FedRAMP has not evolved at pace to address the volume and type of SaaS solutions now prevalent across agencies. There is a need to streamline authorization pathways to match this expanding ecosystem of applications, which rely on standardized baseline security controls.
High-level Security Controls for Sensitive Data in the Cloud
Addressing Data Related to Students and Constituents
Ross states that as agencies move more sensitive workloads to the cloud, they are stepping up security controls from FedRAMP Moderate to FedRAMP High. Sensitive data includes things like personal HR data or data that could impact markets, as with some of the work USDA does. Willie gives the example of the Department of Education or Federal Student Aid, which may have sensitive data on students that could warrant higher security controls when moved to the cloud.
Ross confirms that is absolutely the case: the trend is for agencies to increase security as they shift more sensitive systems and data to the cloud, especially with remote work enabled by the pandemic. Agencies with data related to students, constituents, healthcare, financial transactions and the like are therefore choosing to use FedRAMP High, or to tailor Moderate with additional controls, when migrating such workloads to ensure proper security and rights protections.
The Future of Government Technology: Navigating the Tradeoffs Between Cloud Innovation and Data Security
As Ross explains, FedRAMP High means you can only interact with other cloud applications that are also FedRAMP High, so segmentation is occurring, with more sensitive data and workloads being isolated via stricter security controls. However, he notes it is not a "bull rush" to FedRAMP High; rather, agencies are moving steadily in cases where the sensitivity of the data warrants it.
Willie then asks about the costs associated with these stricter cloud security authorizations, given that even Moderate is expensive. Ross explains there are currently policy discussions underway about making FedRAMP more streamlined and cost-effective so that innovative commercial solutions can still be sold to the government without vendors having to completely re-architect their offerings just for these processes. The goal is balancing the accessibility of cloud solutions with security appropriate to the sensitivity of the data.
Modernizing Federal Government IT: "We need to stop requiring companies to have their own completely separate, over-architected environment. We want commercial entities to sell commercially built and designed solutions into the federal government." — Ross Nodurft
Laying the Groundwork: The AI Executive Order and the Future of Government Technology
Robust Framework for Future Policy and Legal Development
Ross states that the AI Executive Order is the biggest and most robust executive order he has seen. He explains that it attempts to get ahead of AI technology development by establishing a framework for future policy and legal development related to AI. Ross elaborates that there will need to be additional regulatory and legal work done, and the order aims to "wrap its arms around" AI enough to build further policy on the initial framework provided.
According to Ross, the order covers a wide range of topics including AI in critical infrastructure, generative AI, immigration reform to support the AI workforce, and government use of AI. He mentions the order addresses critical infrastructure like pipelines, hospitals, transportation systems and more. It also covers immigration policy changes needed to ensure the U.S. has the talent to advance AI. Additionally, it focuses heavily on government consumption and deployment of AI.
Mapping the Future of Government Technology
Navigating the Future of Government Technology
The AI executive order tasks the Office of Management and Budget (OMB) with developing guidance for federal agencies on the safe and secure adoption of AI. Specifically, Ross states that the order directs the Federal CIO and other administration officials to establish rules that allow government consumption of AI in a way that protects safety and rights. Before writing this guidance, the order specifies that OMB must consider the impacts of AI on safety-critical infrastructure as well as rights like privacy and fairness.
Ross explains that OMB recently released draft guidance for public comment. He says this draft guidance contains several key components. First, it establishes AI governance requirements, directing every major federal agency to appoint a Chief AI Officer and create an AI council with agency leadership that will oversee adoption. Second, it mandates that agencies take inventory of existing AI use and develop plans detailing how they intend to utilize AI going forward.
Requirements for Agencies to Appoint a Chief AI Officer
According to Ross, a primary governance requirement in the OMB draft guidance is that all major agencies assign a Chief AI Officer to spearhead their efforts. Additionally, he notes that the guidance orders agencies to construct AI councils with membership spanning functions like IT, finance, HR and acquisition. Ross specifies that these councils will be led by the Deputy Secretary and Chief AI Officer of each department.
The Uncertain Future of Government Technology
Collaboration, Prioritization of Assessments, Compliance, Monitoring and Validation
Ross highlights the need for collaboration between industry and agencies on issues like prioritization, timing, the specifics of compliance and attestation, and who pays for and validates assessments. The order pushes the use of AI but lacks these specifics, which could slow the adoption of widely used technologies that include AI. Ross notes this could introduce friction, holding back productive technologies at a time when faster digital services are in demand. Compliance pathways need to be better defined so that agencies are not nervous about using AI.
AI Ethics and Regulation: "You've got to run as close to live testing as possible, you've got to have human people factored into the decision-making engines." — Ross Nodurft
While the order embraces AI, it does not detail how to facilitate adoption, which Ross says could cause confusion across agencies. His trade association, ADI, sees a need to add specifics around governance mechanisms to avoid inconsistencies. The lack of clarity risks friction and could slow the incorporation of AI, which Ross believes is imperative.
Balancing Innovation and Responsibility in Emerging Technologies
Demand for a Digital Environment and the Importance of Observability
Ross states that there is a quick move towards a digital environment across all services, driven by demand from millennials, Gen X and Gen Z. He emphasizes that everything needs to have an app or digital access now to engage users. Ross then highlights how Dynatrace provides important observability of these new cloud-based architectures, allowing agencies to understand usage, interactions and performance. He argues this is essential to properly managing digital services.
Ross worries that the new AI executive order guidance lacks specifics around compliance, which risks creating friction in adopting widely-used technologies like Dynatrace that have AI components. He states there is uncertainty whether tools like Dynatrace must be inventoried and assessed under the new policy. If so, there are many open questions around prioritization, timing, specific compliance activities, and who pays associated costs. Ross emphasizes that this uncertainty could hinder cloud adoption without more clarity.
Responsibility and Control Over the Use of AI Technology
Ross stresses that while AI technology enables incredible things, we have full control and responsibility over its uses. He states we must consider processes and safeguards that provide oversight and allow intervention over AI systems. Ross argues we cannot afford to deploy AI blindly, but highlights it is in our power to leverage these technologies to benefit humanity with appropriate guardrails.
Shaping the Future of Government Technology
The Future of Government Technology and Managing Change for Emerging Fields
Ross asserts that today there is greater intentionality around anticipating risks from emerging technology compared to past eras. He advocates for building in off switches and review processes that allow understanding and course correction around new innovations like AI. Ross states this considered approach is essential for nanotechnology, quantum computing and other exponentially advancing fields.
The Influence of Artificial Intelligence in Policy and Legal Development: "But artificial intelligence is now more than ever being built into everything that we do technologically." — Ross Nodurft
Ross disputes the concern that AI will replace jobs, arguing instead it will shift skills required by humans. He provides examples of comparable historical technology shifts requiring new expertise, like transitioning from horses to locomotives. Ross states AI moves job responsibilities in different directions rather than eliminating careers, necessitating learning new tools and approaches.
Establishing Processes and Organizational Structures for the Future of Government Technology
Ross highlights how the AI executive order establishes agency governance bodies to oversee adoption. He details required personnel like Chief AI Officers that must review and approve AI use. Ross states these processes aim to identify risks in using innovations like AI while still encouraging adoption. He argues this organizational oversight is a new paradigm essential for emerging technologies.
About Our Guest
Ross Nodurft is the Executive Director of the Alliance for Digital Innovation (ADI), a coalition of technology companies focused on bringing commercial, cloud-based solutions to the public sector. ADI focuses on promoting policies that enable IT modernization, cybersecurity, smarter acquisition and workforce development. Prior to joining ADI, Ross spent several years working with industry partners on technology and cybersecurity policy and several years in government, in both the executive and legislative branches, including serving as Chief of the Office of Management and Budget's cyber team in the White House.
Episode Links
- Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence
- FedRAMP
- Turkey Gumbo Recipe
- Chip War by Chris Miller