Increasing Audit Readiness
October 5, 2018

A key component of audit readiness is investment in human capital through workforce training. There were lessons learned from the observations and experiences of a Naval Air Systems Command (NAVAIR) National Functional Training Lead (NFTL) who developed and managed related training, both during the conversion of Operating Materials and Supplies (OM&S) from legacy material management systems to an approved Accountable Property System of Record (APSR) and into sustainment. Those lessons are examples of best business practices that can be leveraged by other Department of Defense (DoD) components working toward audit readiness.

NAVAIR's best practices include the creation of role-based curricula, use of gap analysis, compilation of student reference materials and collaboration with other subject-matter experts (SMEs). Both online and offline resources were developed to deliver ongoing support to end users after training.

Requirement for Audit Readiness

In response to Section 1003 of the National Defense Authorization Act (NDAA) of 2010, the Secretary of Defense issued a memo titled "Improving Financial Information and Achieving Audit Readiness," stating that auditable financial statements are necessary to facilitate decision making and ensure NDAA compliance, while informing the public that DoD is a good steward of taxpayer dollars. NAVAIR designated Navy Enterprise Resource Planning (ERP) as the primary APSR for management of NAVAIR's OM&S in October 2014 and directed program managers and accountable property officers (APOs) to ensure compliance. Senior leadership then stood up the Audit Ready Inventory Team (ARIT) to facilitate and standardize APSR conversion throughout the Command. NFTLs supporting ARIT evaluated existing training constructs and developed new materials and processes to improve training efficacy.

Instructor-Led Training

The first step in the NFTL evaluation process was to review existing instructor-led training (ILT) materials. These consisted of eight role-based training programs covering warehouse management, materiel management, and two methods of conducting physical inventories—each from a supervisory and a non-supervisory perspective. Upon completing a role-based ILT course, end users are granted access to job-specific transaction codes and authorized to operate in Navy ERP. The NFTL merged the supervisory and non-supervisory trainings for both physical inventory curricula. Teaching supervisors and non-supervisors together increases students' understanding of their roles and responsibilities in the overall physical inventory process. Condensing the training from eight classes to six reduced end-user certification time by 2 days. These training packages were compared with those from other Commands to ensure consistency throughout the Department of the Navy.
Incorporating audit requirements and new policies produced the first authentic NAVAIR-specific curriculum for Navy ERP material management.

A gap analysis compiled and analyzed a list of more than 180 transaction codes used for inventory and warehouse management in Navy ERP. The analysis was used to determine which transaction codes lacked end-user training material, such as videos or desk guides, and to prioritize the creation of supplemental materials for each transaction code. The gap analysis also identified the appropriate training materials to include within the ILT curriculum.

Videos and Desk Guides

NFTLs collaborated with a contractor to develop more than 100 videos and desk guides tailored specifically for the NAVAIR material management user community. The videos include open captioning and are intended to be played while end users conduct transactions in Navy ERP. Videos and desk guides were reviewed and revised by the NAVAIR ERP Business Office and the OM&S Branch (AIR 6.8.3.3) to ensure consistency and policy compliance.

NAVAIR Guidebook

In addition to the videos and desk guides, a guidebook was created to provide the end user with overarching, standardized processes across the NAVAIR enterprise. The NAVAIR Property Guidebook (NPG) provides material managers with a user-friendly reference document for the procurement and management of OM&S. The NFTLs, in collaboration with AIR 6.8.3.3 SMEs, developed and co-authored the "Manage-It-Right" section, which includes scenarios covering the management of OM&S materiel from initial receipt to disposal. The NPG, representing current best practices for NAVAIR, is undergoing the final internal review prior to publication. Once approved for dissemination, the NPG will be made available across the NAVAIR ERP community via a Web-based platform—SharePoint.

SharePoint

SharePoint is the common document management and storage system used by NAVAIR. The NFTL collaborated with the SharePoint administrator to create the AIR 6.0 Navy ERP Training page, which permits users to access training material as needed. It visually incorporates the overarching business process concepts of the NPG to provide cross-platform consistency. SharePoint gives users access, via interactive links, to ILT course materials, desk guides and unique how-to videos, standard work packages, DoD instructions and NAVAIR policy. SharePoint also will be used to gather usage data and provide a user feedback mechanism regarding the perceived value of the training materials to facilitate future improvements.

Warehouse Management Community Forum

The NFTL created a monthly forum to communicate pertinent issues directly to the material managers. This recurring meeting provides a means to socialize applicable information and create a collaborative network across the enterprise. The monthly forum is a continuous venue for sharing roadblocks and success stories from the various sites. Additionally, training updates and products have resulted from recommendations made by the warehouse management community.

Summary

NAVAIR NFTLs are successfully mitigating the risks of the significant structural and cultural changes caused by becoming audit ready. Although much work remains to be accomplished, NAVAIR has made considerable progress toward improving audit readiness by investing in the training and preparation of its workforce.
Just as NAVAIR built on lessons learned from previous conversion training, these best practices can be copied and modified for use throughout the DoD to ensure that holistic, student-focused training is the standard, not the exception, and that Congress' audit readiness goals are fully achieved. CAPT Timothy Pfannenstein, AIR 6.0B, put it in perspective when he said, "We are expected to be good stewards of our tax dollars. Achieving audit readiness is not only our obligation to the taxpayer; it is simply the right thing to do."

Vancelette has been the Naval Air Systems Command National Functional Training Lead for Inventory and Warehouse Management for more than 2 years. Since 2010, she has served in several Navy Commands as a Navy Enterprise Resource Planning Subject-Matter Expert in Procurement and Material Management. Conroy has been assigned to Defense Acquisition University as a professor of Life Cycle Logistics Management and of Production, Quality and Manufacturing since 2005.

The authors may be contacted at laura.vancelette@navy.mil and William.conroy@dau.mil.
Financial Management's Key Role in Cybersecurity
September 1, 2018

Cybersecurity and its associated threats are increasing at the speed of light. The Department of Defense (DoD) is no exception. Since the DoD is responsible for our nation's defense, cybersecurity will remain a top priority.

How do we effectively deter and defeat cyber threats? We must understand the cybersecurity requirements and risks to our systems and utilize the expertise from all acquisition functional areas. Successful cybersecurity risk management necessitates involvement and contributions from all functional areas, not just those within information technology (IT) and engineering.

An article—"Including Cybersecurity in the Contract Mix," by Kimberly Kendall and William Long—in the March–April 2018 issue of Defense AT&L magazine outlined the importance of contracting personnel and processes for sound cybersecurity management. Let us here focus on the importance of the financial management (FM) community and its associated functions throughout the acquisition process to ensure that we have accounted for cybersecurity. For the purposes of this article, the FM community includes those in cost estimating, budget formulation, budget execution and earned value management (EVM).

Importance of the FM Community

Everything needed to support DoD acquisition programs requires funding—including personnel, materials, systems and facilities. All requirements have a cost. This includes cybersecurity and its associated cost drivers. DoD's process to determine and allocate funding for requirements is the Planning, Programming, Budgeting and Execution (PPBE) process. The DoD Program Manager's (PM) Guidebook for Integrating the Cybersecurity Risk Management Framework into the System Acquisition Lifecycle (September 2015)—herein referred to as the "DoD PM's Guidebook"—states that "cybersecurity resources will require funding through various types of appropriations, since cybersecurity is considered throughout the full life cycle of the program." Acquisition teams must fully utilize the FM community to ensure that programs effectively identify and utilize funding based on cybersecurity requirements and associated cost drivers. Figure 1 outlines the major FM functions throughout the acquisition process where cybersecurity must be considered.

Cost Estimating. Cost estimates link cybersecurity requirements to costs. Estimates are vital not only at program initiation but also for each fiscal year (FY) and for major program milestones throughout the program's life. Cost estimators utilize the program's Cost Analysis Requirements Description (CARD), or equivalent document, to recognize requirements and develop costs using appropriate estimation models and methods. Such estimates are integrated into a program's Acquisition Program Baseline, used to develop program life-cycle cost estimates, and serve as the basis for the budget requests programs construct for inclusion in the President's Budget submissions to Congress.
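To make the link between cybersecurity requirements and budget requests concrete, the short sketch below rolls up a handful of hypothetical cybersecurity cost elements into totals by fiscal year and appropriation, the kind of summary a cost estimate feeds into budget formulation. It is a minimal illustration only; the element names, years, appropriations and dollar figures are invented and are not drawn from the article or from any DoD estimating model.

```python
# Illustrative sketch only: a toy roll-up of hypothetical cybersecurity cost
# elements into totals by fiscal year and appropriation, the kind of summary a
# cost estimate feeds into a budget request. All names and numbers are invented.
from collections import defaultdict

# Each entry: (cost element, fiscal year, appropriation, estimate in $K)
estimate = [
    ("Software assurance",           2019, "RDT&E",       1_200),
    ("Cyber test and evaluation",    2019, "RDT&E",         800),
    ("Supply chain risk management", 2019, "Procurement",    450),
    ("Cryptographic hardware",       2020, "Procurement",  1_500),
    ("Vulnerability patching",       2020, "O&M",            300),
]

# Sum the elements into one total per (fiscal year, appropriation) pair.
totals = defaultdict(float)
for element, fy, appropriation, cost_k in estimate:
    totals[(fy, appropriation)] += cost_k

for (fy, appropriation), cost_k in sorted(totals.items()):
    print(f"FY{fy} {appropriation}: ${cost_k:,.0f}K")
```

In practice the roll-up would follow the program's work breakdown structure and estimating methodology; the point of the sketch is simply that each cybersecurity cost driver must land in a specific fiscal year and appropriation before it can appear in a budget request.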
Program documents containing cybersecurity requirements and associated risk factors are particularly important for cost estimators constructing estimates. To address the affordability of cybersecurity, cost estimators should have a broad understanding of the unique cybersecurity cost drivers to ensure that applicable elements are identified and included within a program's budget. For instance, if a program has cybersecurity requirements involving rigorous software development, software testing and supply chain risk management activities, cost estimators must understand the scope and duration of each requirement to develop realistic estimates. Accurate cost estimates form the basis for all other FM functions.

Budget Formulation. These efforts involve transforming program cost estimates into actual budget requests within budget documents. Why are budget documents so important? Programs cannot exist without funding appropriated by Congress, and budget documents are how programs request their needed funds. Budget formulation, with the assistance of acquisition team members, requires identifying cybersecurity requirements and the associated amounts needed each FY by appropriation. Those budget documents are reviewed by the Office of the Under Secretary of Defense (Comptroller) (OUSD[C]) and Congress. Whereas the OUSD(C) supports the DoD and its programs, Congress maintains responsibility for appropriating funds and providing DoD's programs budget authority. The DoD PM's Guidebook states that programs should include cybersecurity requirements as an identifiable line within a program's budget. That requirement is critical because budgets must be defendable and written clearly enough that the requirements can be understood by stakeholders outside the program office. If cybersecurity requirements are not properly projected by cost estimators, a program's budget documents likely will not reflect the appropriate requirements or associated funding.

Budget Execution. These efforts revolve around executing funding, once programs receive budget authority, on contracts or other vehicles as specified in program budget documents. They include the creation and maintenance of spend plans for each FY and appropriation to demonstrate how the program will use funding appropriated by Congress. Input from other acquisition team members is required to ensure that the plans are accurate, realistic and incorporate all planned program requirements (including cybersecurity). Spend plans may encompass obligations or expenditures and are tracked against actual execution rates. Actual execution rates and comparisons to spend plans are of significant importance for programs since they are a key measurement for evaluating program performance. They are not only tracked by OUSD(C) but also used by Congress when considering future program budget requests. Execution personnel are key contributors for completing program Select and Native Programming Data Input System (SNaP-IT) reports on IT/cybersecurity budgets. The SNaP-IT reports are another requirement for programs to justify their cybersecurity activities and funding amounts, since programs must report actual spending and future planned spending. A simplified illustration of tracking execution against a spend plan follows.
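The sketch below compares hypothetical cumulative planned obligations with actuals month by month and flags when execution falls behind plan by more than a chosen threshold. The figures, threshold and variable names are invented for illustration and do not represent OUSD(C) benchmarks or any particular program.

```python
# Illustrative sketch only: comparing a hypothetical monthly obligation spend
# plan against actual execution and flagging months where the program falls
# behind plan by more than a chosen threshold. Figures are invented.
THRESHOLD = 0.10  # flag when actuals trail the plan by more than 10%

plan_by_month   = [100, 250, 450, 700, 1000]   # cumulative planned obligations ($K)
actual_by_month = [ 90, 240, 380, 600,  950]   # cumulative actual obligations ($K)

for month, (plan, actual) in enumerate(zip(plan_by_month, actual_by_month), start=1):
    variance = (actual - plan) / plan
    status = "BEHIND PLAN" if variance < -THRESHOLD else "on track"
    print(f"Month {month}: plan ${plan}K, actual ${actual}K, "
          f"variance {variance:+.1%} ({status})")
```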
Finally, execution personnel can initiate or complete actions (such as submitting unfunded requirements or reprogramming requests) for programs if urgent needs or shortfalls arise—for example, due to emerging cybersecurity threats or vulnerabilities.

[Figure 1. Major FM functions throughout the acquisition process]

EVM. This is a valuable program management tool for evaluating cost, schedule and technical performance on contracts, including cybersecurity. EVM measures past performance, forecasts future performance and incorporates risk factors to support program decisions. Military Standard (MIL-STD) 881D, Work Breakdown Structures for Defense Materiel Items (April 9, 2018), emphasizes the importance of cybersecurity and the actions programs should take to better manage cybersecurity requirements. It states that, "Attention must be paid to cybersecurity at all acquisition category levels and all classification levels, including unclassified, throughout the entire life cycle…." MIL-STD-881D provides the structure for programs to identify, measure and report crucial cybersecurity-related costs. It instructs programs to break out specific cybersecurity elements (hardware or software) within the work breakdown structure (WBS) where those costs can be easily accounted for. If elements are separated within the WBS, as opposed to being commingled with other program requirements, programs will have an enhanced ability to measure actual performance against planned expectations.

Cybersecurity Best Practices—FM Perspective

Involve the FM Community. Acquisition programs can better manage resourcing for cybersecurity requirements if they involve the FM community early and often. Not only should programs proactively evaluate cybersecurity requirements throughout the entire acquisition life cycle, they should consistently leverage the FM community because of the critical functions its members complete. If FM personnel have no active role in, or understanding of, the requirements and cost drivers, programs risk not having appropriate cost estimates, budgets or effective evaluation capabilities.

Involve All Functions to Identify Cybersecurity Costs. Cybersecurity cost drivers span all acquisition functional areas. Since cybersecurity requirements and risk factors are unique to each program, acquisition teams should consider all potential requirements at program initiation and at each milestone with respect to FY and appropriation. FM functions can be executed accurately only through direct interaction with the other acquisition functions. Table 1 outlines the major acquisition functions and potential cybersecurity cost drivers. Many of these cost drivers are derived directly from the DoD PM's Guidebook and MIL-STD-881D. This list is provided for illustrative purposes only, as several activities may be shared between acquisition functions.

Effective communication and coordination are required for a successful team-based approach to resourcing cybersecurity requirements. Figure 2 depicts the relationship between the acquisition process (milestone/event driven) and the PPBE process (calendar driven). Cybersecurity requirements and associated cost drivers, like other system requirements, must be included in the CARD and Program Office Estimate and considered throughout the program life cycle.
They also must be reflected in the PPBE process through budget documents, funds execution and reporting, and evaluation. The durations of life-cycle phases are unique to each acquisition program and determine the number of PPBE cycles executed.

[Table 1. Acquisition functions and potential cybersecurity cost drivers]

Validate Financial Reporting. Accurate financial reporting is critical for program success and for supporting and defending current and future budgets. In an environment of elevated accountability for taxpayer resources and increased congressional interest in the cybersecurity threat, programs must accurately report cybersecurity budgets and requirements (such as spend plans and SNaP-IT reports). All acquisition functional areas play a key role in ensuring that cybersecurity is accurately represented in financial reporting activities.

Leverage DAU Resources. DAU continues to assist DoD's acquisition community with integrating cybersecurity into existing processes across the DoD acquisition life cycle. Resources include online tools, courses, articles and specialized training or workshops. DAU's specialized training has helped numerous programs better understand concepts critical to designing and maintaining cyber-resilient systems. Also, DAU's Cybersecurity and Acquisition Lifecycle Integration Tool outlines the major cybersecurity activities and their interaction with existing processes at each phase of the acquisition life cycle, in accordance with DoD Instruction 5000.02, "Operation of the Defense Acquisition System."

[Figure 2. Relationship between the acquisition process and the PPBE process]

FM Community's Cybersecurity Challenges

Several cybersecurity-related challenges exist for the FM community within programs. First, cybersecurity requirements are relatively new and cost drivers are unique to each program. As a result, cybersecurity cost estimates can vary widely, and minimal historical data increases reliance on assumptions and risk factors. Varying degrees of program complexity, with their associated cost drivers, further complicate cost estimating. For example, cybersecurity cost estimates for a legacy defense business system will differ vastly from those for a new missile program. One program may have more critical hardware components or unique software algorithms, require more testing and have a longer program life cycle. Unique cost drivers, risk factors and durations can have potentially large impacts on program cost estimates for individual requirements and on life-cycle costs.

Second, emerging or changing cybersecurity threats can drive unexpected requirements changes for programs. Combined with the federal government's calendar-driven process for submitting budget requests and receiving appropriated funding from Congress, programs may encounter undesirable challenges. It can be difficult to define requirements for the current year, let alone several years in the future, as budget requests may no longer be appropriate by the time the funds are actually used. These challenges become even harder to manage should emerging cybersecurity threats delay schedules due to technical risk mitigation. That also could wreak havoc on program spend plans, since they are a key performance-tracking mechanism.
For those reasons, it is increasingly vital that programs involve FM personnel as they initiate or complete various courses of action to adjust program funding when requirements change.

Third, it is difficult to measure cybersecurity performance. Whereas typical contract requirements have independent WBS elements, cybersecurity requirements are not always independent elements and instead are often embedded within other WBS elements (such as systems engineering, system test and evaluation, and program management). The lack of direct traceability to cybersecurity requirements makes oversight and evaluation difficult for programs, particularly for the FM community. MIL-STD-881D provides additional guidance to help programs better measure cybersecurity performance.

Conclusion

The impact and dynamic nature of current and future cybersecurity-related threats on our personnel, systems and facilities cannot be overstated. A proactive and flexible approach to deterring and defending against cybersecurity threats must involve all appropriate stakeholders; responsibilities extend to all members of the acquisition workforce, not just IT and engineering. Successful integration of cybersecurity into existing acquisition processes, including FM, is critical to the success of DoD programs. FM community personnel, like those of the contracting community, are critical members of the acquisition team and perform vital functions to ensure program success. DoD will not be able to deliver effective capabilities to the warfighter for defending our homeland and allied nations against threats if we do not adequately fund cybersecurity requirements.

Speciale is a professor of Financial Management at the Defense Acquisition University (DAU) South Region in Huntsville, Alabama. Kendall is a professor of Cybersecurity at DAU-South.

The authors can be contacted at stephen.speciale@dau.mil and kim.kendall@dau.mil.
Feedback, Follow-up and Accountability
September 1, 2018

The Department of Defense (DoD) needs feedback, follow-up and accountability in its acquisition programs so that it can effectively execute its strategic plans. But what do those terms mean, how is each attained and how do we know when they are?

Most people already know The Three Musketeers, and everybody knows the Three Stooges. Both have been the source of unique management approaches for many years. Or at least, it sure looks that way sometimes.

Below are some thoughts and excerpts from a book I wrote titled Fixes That Last—The Executive's Guide to Fix It or Lose It Management. It was written for the private sector but has the same applicability to program managers and other DoD acquisition management professionals.

Feedback

Feedback is communication (in whatever form) that you receive regarding something that your organization plans to do or already has done. It is an indispensable part of the decision-making process—whether in strategic planning or in day-to-day operations. Ideally, feedback means continuous information on performance against accepted standards.

However, before you can expect meaningful feedback, think about the following:

- Your feedback requirements should be clearly stated, in writing. Whatever you want done may not get done if there is no feedback system. Ensure, therefore, that feedback mechanisms exist. If there is no established feedback system, you will need to create something, even if it is only temporary.
- Feedback is a two-way street; all stakeholders need to know the findings of the feedback process as much as you do, so make sure that they stay informed. Be alert for unexpected obstacles or surprises.

Additionally, feedback, because of its content, may soon become obsolete. You need to get it quickly, and you may have to go out and collect it in person. Personal observation has a history of thousands of years in both management and the military and is always a good choice. It has been said that "the greatest fertilizer is the footprint of the overseer."

Meaningful feedback, in whatever form, must flow unimpeded in both directions. Feedback can be formal or informal, written or unwritten. It also may be a combination of all four, since it is usually best to document communication about important subjects.

Whenever possible, written feedback should have its own internal documentation and reference authority, to be both credible and useful. Long-term planning suffers when feedback becomes a deluge of unsubstantiated opinions, beliefs and prejudices.

Meaningless feedback includes statements such as "I don't like it," "it's all messed up," "we've never tried it before," or "we have tried it (or something like it) before and it didn't work." Meaningless (and sometimes nasty) feedback with no basis in evidence squanders time, depletes enthusiasm, and can really sabotage your program. You should encourage open (but finite) discussion and dissension.
Dissenters should be welcomed, but they need to provide solid, replicable evidence to support their dissent; the more quantifiable, the better.

Similarly, positive feedback should also include hard facts and objectivity. Whenever possible, metrics that characterize both the old and new processes (e.g., gallons of water saved) should be included to better communicate the new process' impact on the organization.

Informal, oral or otherwise unsubstantiated communication and feedback also opens the door to "perception" and causes us to screen or filter what we thought was said, ending up with something partly or entirely wrong.

Requiring a formal and quantifiable feedback process denies people the pleasure of suffering in silence, complaining, or just going with the flow. Mandated feedback also is a guaranteed source of two-way communication. All stakeholders need to know and understand what is going on—and, if they don't, they should ask. If they have it wrong, you need to straighten them out.

Knowing how to communicate, and even being able to do it eloquently, does not guarantee that your message will get across to everyone you intend it to reach. Obtain feedback in more than one form and from more than one source. Establish multiple communications channels. Personally observe or talk to people to see how the message was received. Determine the sensitivity of those who will receive your message. Lastly, reinforce words with actions or presentations, if necessary.

Measurable Feedback

Decision makers, both military and civilian, need to apply metrics and measures of effectiveness to all areas of their operations in order to meaningfully quantify:

- Information collection and dissemination
- Risk, vulnerability and the allocation of limited resources
- Optimal data collection and reporting procedures
- Implementation status of goals and objectives
- Alternative courses of action
- Situational awareness (internal and external)

Subjective and Objective Metrics

Metrics can be either subjective (i.e., conclusions based on observations, experience and judgment) or objective (based on collected data). The tables that follow describe core subjective and objective metrics used to measure the potential effectiveness of operations.

Benchmarking and Gap Analysis

Originally, benchmarking meant finding a "best practice" in another organization and comparing it with the same process in your organization. That is not incorrect, and certainly not unnecessary. However, the intent of this article is to help you get focused within your organization. This is internal benchmarking and is discussed below.

Industrial engineers often conduct variance analyses, or tests for significant differences between several mean values. This is (happily) not what we are talking about. For our purposes, gap analysis is the product of auditing the organization or specific processes (perhaps with the checklists). The analysis provides general indicators and not hard figures. You will be measuring gaps between what is expected and the actual conditions.

The next step after identifying the vulnerabilities is to measure their magnitude against an accepted standard or reference, then determine the "gap" between the desired and the existing.
This is gap analysis (Figure 1).

Having identified the gap (and the associated metrics), we can then determine whether there is a difference between what could reasonably be expected and what actually occurs; where, specifically, the gap is (i.e., what area or process); what can be done to close it; and whether the corrective action is reasonable, cost effective, appropriate, legal and ethical.

[Figure 1. Gap analysis]

Internal Benchmarking

Internal benchmarking examines your own activities, taking place inside your walls. Areas always ready for (and in need of) internal benchmarking include but are not limited to:

- Facilities
- Manufacturing and material handling processes
- Administration
- Training and qualification
- Costs of operations
- Inventory levels and stock turnover
- Waste, work in progress, reject rates
- Other work sites in the same organization (as applicable)
- Purchasing/procurement
- Contracting

[Tables 1 and 2. Core subjective and objective metrics]

External Benchmarking

External benchmarking can include (among other things):

- Customer satisfaction (on-time delivery, reliability/defect reports, etc.)
- Competitors' products
- Recommendations from external consultants and auditors
- Public databases
- Annual reports of other companies
- Government agencies
- ISO (International Organization for Standardization) 9000, ISO 14000 and other international standards
- Tabletop exercises, seminars and workshops

Follow-up

Like feedback, follow-up is vital to an organization's survival and success. It also takes some explaining. The first thing to explain is the difference between feedback and follow-up, and why the two terms are not interchangeable. For our purposes, feedback is needed before and during implementation, whereas follow-up comes afterward. Both are essential, and a valiant attempt to do one does not obviate the need for the other. It is very annoying to find something in the follow-up stage that could have (and should have) been found during the feedback stage.

Follow-up means (among other things) checking on the success or failure of a process implemented, a process changed, an order given, or some other modification made with thought to making something better or in some way adding value. For instance, a process modification developed to solve a quality issue on the factory floor could result in one of the following:

- No measurable change to the product
- Further product degradation (you made it worse, you dope!)
- Improvement, but not enough to justify the added effort or expense
- Significant (measurable) product improvement, in accordance with the modification strategy, and worth the added effort or expense

Some authors discuss follow-up as a one-time process, with an outsider (or team of outsiders) selected to conduct it, and then only after a rigorous selection process. I would suggest that this is not what you are interested in.
You most likely will need to do your follow-up internally. Besides being faster and cheaper, it will serve all your purposes and show the stakeholders that you are capable of identifying and correcting your own problems.

You do need, however, to ensure that your follow-up processes include (at a minimum) the following:

- Comparing the actual performance of the implemented system or process not only with the past but with what was expected
- Verifying that specifications, designs, etc., were fully implemented as planned
- Assessing the possibility of further (i.e., continuous) improvement. The process, however successful or efficient, should never be thought of as "frozen." Rather, it should remain subject to continuous review and improvement.
- Documenting fully the follow-up process, both thus far and for the future

You also need to know whether your people are resisting the changes implemented. Their good reasons can be reflected in subsequent changes to the process.

ISO 9000 requires that changes to processes be fully documented, including documentation that follow-up was scheduled and conducted and that the findings were compared with process findings prior to the change. In other words, "Did it do what we wanted, or do we need to try something else?"

You change a process to add value to it. It is only by following up that you find out whether the change did any measurable good or in some way added value.

Accountability

Webster defines "accountability" as "having to report to, explain, justify; being responsible, answerable." Early management textbooks and courses routinely linked authority with responsibility, stating that one cannot exist without the other. Unfortunately, accountability was not always included. Here are basic definitions to help explain the three relative to each other:

- Authority is the right of an individual to make the necessary decisions or take the necessary actions required to achieve the objectives.
- Responsibility is the obligation for completion of the objectives.
- Accountability is the acceptance of the success or failure to achieve the objectives. It often carries with it positive (e.g., promotion) or negative (e.g., termination) recognition for the success or failure.

Heavy thinkers believe that responsibility and accountability are synonymous, and that separating the two causes unnecessary formality and confusion. They state (not inappropriately) that accepting responsibility for a project creates an obligation to perform and therefore an implicit answerability or accountability. Another thought is that responsibility does not automatically include accountability, because accountability involves a third party—someone above the responsible manager to whom he or she must give an accounting. Both positions are simple, obvious and too often ignored. I recommend that accountability be considered separately, ensuring that when an objective is assigned, authority, responsibility and accountability are obvious, stated, realistic and measurable.

Accountability, as a separate but related concept, can't be talked about too much, especially in these days of unbridled greed and arrogance seemingly running amok. Enron, Fannie Mae, Volkswagen, the Department of Veterans Affairs, and Bernie Madoff are names that have scandalized 200 years of responsible management, not to mention 2,000 years of the philosophy of ethics.
They underscore the need to hold decision makers and their superiors accountable when predictable catastrophes take place and/or innocent people are hurt.

Most (but not all) people in positions of leadership and responsibility understand their attendant accountability. Ships' captains understand that they are in command but that accountability to higher authority comes with the job and is, as it should be, inescapable.

Accountability of "Teams"

Can the same be said about the accountability of teams and the individual members of those teams? Are they accountable as individuals, or are they safe to do (or not do) whatever they want?

When I was a management student in the early 1960s, "middle management" was both the focus of the training and the prize most sought after. Now, it seems that you have to aim to get on a team, much as a congressman aims to get a good committee assignment.

Middle management, once the backbone of (and launching pad for) an industrious America, is being replaced by teams of every shape and description. Middle managers once were held accountable for the success or failure of their organizations for two excellent reasons: that was how the work got done, and that was also how middle managers got to be top managers.

Today, teams populate the landscape as middle managers are downsized or marginalized. Consultants come out of the woodwork, cover conference room walls with butcher paper, and "empower" teams by eviscerating managers and management. If you are going this route, good luck. Just be sure that the teams have the same authority, responsibility and accountability that the middle managers had.

I once attended a "cross-functional team" meeting with a (Navy) client, where Churchill-like oratory flowed but little got done because the poor bastard at the head of the table had no authority to task the members, who, in turn, felt no responsibility or accountability for doing what was tasked. It wasn't pretty. Sound familiar?

Summary

Well, there they are: feedback, follow-up and accountability—as simple as I can make them. Whether they resemble the Three Musketeers ("All for One and One for All!") or the Three Stooges ("Spread out!") in your command or program is up to you.

Razzetti is a retired Navy captain, management consultant, auditor, and military analyst. He is the author of five management books, including Fixes That Last—The Executive's Guide to Fix It or Lose It Management.

The author can be contacted at generazz@aol.com.
Agile Values in the MQ-9 Reaper's Software Development
September 1, 2018

At any conference or forum on defense acquisition in the last few years, you would have heard one word repeated almost relentlessly: "Agile." Search the Web and you'll find a number of articles and briefing charts highlighting the challenges of adopting Agile principles in the context of a weapon system. Typically, the discussion centers on shifting away from the waterfall approach, addressing systems engineering, and getting through various certification and testing authorities. Let me share my experience in successfully applying Agile values and principles to the acquisition program of the Special Operations Command (SOCOM) MQ-9 Reaper unmanned air vehicle (UAV) without dictating a particular software development methodology.

The histories of the MQ-1 Predator and MQ-9 Reaper UAVs have been well documented in various media. In 2006, a validated requirement emerged from the Quadrennial Defense Review for a Special Operations Forces (SOF) "unmanned aerial vehicle squadron to provide organic capabilities." With requirements in hand, a new modification program, called Medium Altitude Long Endurance-Tactical, or MALET, was initiated between SOCOM's acquisition office and the Air Force Predator Systems Squadron. Under an agreement between the two offices, requirements and funding would be generated by SOCOM, while contract management would largely be executed by a team at Wright-Patterson Air Force Base in Dayton, Ohio. In parallel, Air Combat Command (ACC) agreed to transfer personnel, aircraft and ground control stations to Air Force Special Operations Command (AFSOC) to stand up MQ-1 and MQ-9 Combat Air Patrols directly supporting SOF missions.

Initially, the MALET program operated somewhat independently from the Air Force acquisition program. Modifications were typically "bolt-on" upgrades that didn't significantly alter the baseline configuration. After twice fielding a software release—also called an Operational Flight Program (OFP)—derived from an ACC version in order to support urgent requirements, SOCOM sought increased independence over its fielding schedules and initiated a Combat Evaluation called Lead-Off Hitter (LOH). Originally, the effort was an 18-month acquisition experiment to rapidly assess and field new capabilities on a limited number of AFSOC aircraft. While the total scope of the Combat Evaluation included weapons, radio and sensor upgrades, the glue holding the effort together would be a SOF-unique software line for the MQ-9, developed and maintained under contracts managed by the government team and executed by a dedicated cross-functional team at General Atomics-Aeronautical Systems, Inc. (GA-ASI). Because of the encompassing nature of software running on the aircraft and ground control station, the software development program and associated contract were also referred to as LOH.
Given the choice in numbering SOCOM's MQ-9 OFP, engineers with GA-ASI designated this new software line as the "2400 series" to distinguish it from ACC's "900 series." This was a reference to Major League Baseball Hall of Famer Rickey Henderson, the greatest lead-off hitter of all time.

SOCOM's expectations for frequent releases required a new approach. The legacy software development program, following a traditional waterfall development strategy, started with a fixed set of performance and testing requirements as defined in the contract Statement of Work (SOW). These requirements could only change through contractual modifications requiring additional funding, schedule and management approvals. The initial LOH program manager was familiar with service-type contracts supporting software updates for aircraft training systems and sought to implement the same approach in a weapon system OFP development contract. She directed the engineering team to develop a repeatable process and a SOW that would support it.

Crafting this process began with the assumption that software development primarily would be schedule-driven and that only capabilities that could be developed, integrated and tested in a given time would be considered for inclusion in each release. Development cycles were set at 12 months, with half of the schedule dedicated to code development and the other half to testing. Releases would overlap each other, resulting in releases to the field every 6 months (a simplified sketch of this cadence appears below). Multiple demonstrations of working code in a Software Integration Lab with AFSOC staff and aircrew were planned during development to refine performance requirements, with program office engineers in attendance to limit feature creep.

During initial government reviews of the SOW, the procuring contracting officer (PCO) expressed concerns that the contract deliverables were not defined sufficiently and that the contract could be interpreted as a "blank check." To address these concerns, performance metrics were introduced—such as planned hours versus actual hours, and the percentage of test points completed on the first pass. (Later SOWs removed contractual ties to these metrics due to Air Force contracting policy changes and the contractor's demonstrated performance.) The PCO also determined that the contract was for a supply and that individual releases weren't considered deliverables. A high-level depiction loosely based on the planned and actual development schedules during execution of the LOH program is shown in Tables 1a and 1b.

The contractor management team realized early on that a schedule dependency was required to continuously employ a single dedicated team. Specifically, you could not enter the test planning of one OFP until you finished test execution of the prior OFP. This would then determine how soon the team could start the requirements definition phase. Another realization was that a single software release could span two contracts, as depicted by OFP 4 in Table 1a and OFP 5 in Table 1b. This was acceptable to the program office since the same contractor team would bridge both contracts. As long as the second contract was awarded and contractor charge lines were established in time, there would be no breaks in the program. During the execution phase, the LOH contract was flexible in handling deviations to the plan, as depicted in Table 1b.

In our program, OFP 1 development began under a separate contract, and coding and testing times were reduced.
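As a rough illustration of the overlapping cadence and test dependency described above (12-month cycles, roughly half development and half test, a new OFP starting about every 6 months, and one OFP's test work gated by completion of the prior OFP's test execution), here is a toy scheduling sketch. The dates, durations and function names are invented; this is not the program's actual planning tool.

```python
# Toy sketch only: models the nominal overlapping release cadence described in
# the article and the stated dependency that an OFP's test phase cannot begin
# until the prior OFP's test execution ends. All dates are invented.
from datetime import date, timedelta

DEV_DAYS, TEST_DAYS, STAGGER_DAYS = 182, 183, 182  # ~half dev, ~half test, ~6-month stagger

def cadence(first_start: date, n_ofps: int):
    prior_test_end = None
    for i in range(n_ofps):
        dev_start = first_start + timedelta(days=i * STAGGER_DAYS)
        dev_end = dev_start + timedelta(days=DEV_DAYS)
        # Test on this OFP cannot begin before the prior OFP finishes testing.
        test_start = dev_end if prior_test_end is None else max(dev_end, prior_test_end)
        test_end = test_start + timedelta(days=TEST_DAYS)
        prior_test_end = test_end
        yield f"OFP {i + 1}", dev_start, test_start, test_end

for name, dev_start, test_start, test_end in cadence(date(2014, 1, 1), 4):
    print(f"{name}: development starts {dev_start}, test starts {test_start}, releases {test_end}")
```

Under these nominal numbers the sketch fields a release roughly every 6 months; any slip in one OFP's test execution pushes the next OFP's test start, which is the ripple effect the program experienced in practice.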
During OFP 3 flight testing, critical software deficiencies were discovered and an unplanned OFP 3.1 was required. As a result, OFP 4, previously planned as a larger release with a longer timeline, was delayed by 2 months. No contractual action beyond a PCO letter (PCOL) was required to make this programmatic adjustment. Because of the test team's schedule dependency during OFP 4, OFP 5's overall timeline was extended. This offered additional time to work on user-requested features, which in this instance were mainly related to the human-machine interface. OFPs 6 and 7 show what could have happened in returning to the previous release cadence if other factors—such as expanding aircraft configurations—had not extended development and testing timelines.

[Tables 1a and 1b. Planned and actual LOH development schedules]

Each time the potential arose for scope changes, the customer was offered three courses of action:

- Create a new release for a critical capability or improvement.
- Include the capability in the next release with an associated schedule impact.
- Defer the capability to the following release under the existing scope determination process.

Within 3 years, the team fielded six OFPs, delivering more than 50 system-level customer requirements, implementing more than 500 contractor-requested software changes, and resolving more than 100 test and operational deficiency reports. While no release was delivered on the date projected at project inception, slips were measured in days and weeks—not months and years. Capabilities largely met the customer's intent, with the knowledge that operational feedback could be incorporated in the next release or the one following, and that fielding was likely within a 6- to 12-month period. Therefore, AFSOC accepted less than full compliance with the stated performance goals.

Weekly video teleconferences were held with the user command staff and operational squadrons to provide status updates and discuss possible problem resolutions. Trust was the currency used to keep the program on track and customer expectations in line with the delivered product. The ability of the combined government and contractor team to deliver the desired product directly translated into increased trust on the part of operational commanders in employing the MQ-9 under increasingly difficult and strategically important conditions. The Senate Armed Services Committee, in its report for the Fiscal Year 2015 National Defense Authorization Act, commended the team's approach and said: "The committee strongly supports SOCOM's efforts to accelerate fielding of advanced weapons, sensors, and emerging technologies on its fleet of MQ-9 UAVs through the MQ-9 [MALET] program of record utilizing the Lead-Off Hitter rapid acquisition process." Recently, ACC officially directed the program of record to employ the AFSOC OFPs and ended its own separate software development program. The two commands now work closely together on maintaining the speed of the LOH release cycles, delivering the full sustainment tail typical of an ACC program and providing a unified, prioritized capability list to the program office to guide future capability investments.

[Table 2. Key assumptions mapped to Agile Values and resulting benefits]

Responding to SOCOM's need to deliver capability quickly and reliably required the MALET team to approach software development acquisition strategy in a different way.
Key assumptions mapped to Agile Values, and the resulting benefits, are listed in Table 2.

Establishing a repeatable process and schedule framework for each release through a series of sequential contracts made our customer happy by increasing the frequency of deliveries over previous release cycles. The "level of effort" type contract enabled the team to respond to emerging needs by not contractually locking in technical requirements. Establishing and changing the technical content of a particular release was handled via PCOLs rather than by contract modifications.

A dedicated contractor team executing a repeatable process also led to stable funding requirements benefiting SOCOM, predictable workforce planning benefiting the contractor, and simplified contract management benefiting the program office. By keeping cost constant, the MALET program manager was able to quantifiably demonstrate the trade-offs between scope size and schedule. Adding capability requirements during development, while highly discouraged, could be accommodated with ripple effects to future releases but without the time penalty of a contract modification.

In leveraging mature hardware, our customer based its requirements on demonstrated performance, and the program team reduced the risk of delay due to unforeseen suitability or effectiveness issues. Additionally, by only taking on work that could be accomplished within the given schedule, the program office reduced the risk that one capability would hold up other capabilities within the same release cycle. The dedicated software development team, combined with the flexibility in defining performance requirements, allowed the contractor to propose numerous maintenance fixes during each release cycle.

From the government's perspective, as long as the delivery schedule wasn't impacted, these proposals were approved and even encouraged. Agreements with the program office sustainment team provided operation and maintenance funding to code and test these fixes at the unit level. Funding for LOH research, development, test and evaluation then paid for integration and testing at the system level. Frequent interactions, such as technical interchange meetings to clarify requirements, multiple software demonstrations with aircrew, and flight manual working group meetings, built trust between team members crossing many organizational and functional lines.

While Agile principles may sound obvious, our team also benefited from a few critical enablers. First, a high level of experience and expertise was resident in each team member, and a high level of trust had been established within the team during earlier SOCOM MQ-9 projects. Second, since the program costs did not exceed a particular threshold, the level of oversight for this program was held at the Program Executive Officer (PEO) level, and acquisition documentation such as the Systems Engineering Plan and Test and Evaluation Master Plan was reduced. The team was free to tweak the process while executing the program. (The PEO later confessed that he was going to "pull the plug at the first sign of trouble" but was glad that he didn't.) Third, the customer showed great patience and outreach throughout this journey. Participating day to day in the development program gave the customer ownership of problems and solutions.
Furthermore, the customer advocated for and received general officer approvals for increased operational risk resulting from less-than-perfect designs and technical orders in order to speed the fielding of capability.

These assumptions and enablers aren't the only considerations in shifting away from a pure waterfall approach for fielding capabilities. There are natural technical limits that an Agile approach cannot solve (i.e., you can't "Agile" your way past the speed of light). There are organizational, cultural and legal considerations that each acquisition program office must address. Program offices will need properly trained personnel to craft appropriate contractual language. Government and industry teams must work closely to define expectations. Rules of engagement must be established between developers and operators to mitigate scope creep. Major Defense Acquisition Programs, with Earned Value Management System reporting, oversight from the Office of the Secretary of Defense, and congressional interest, must garner advocacy and buy-in to earn the trust of senior decision makers to waive or tailor reporting and milestone decision requirements. Program offices should make the effort toward change management to ensure a smooth transition when developing and implementing Agile business principles. Separating hardware development programs from integration, testing and fielding increases the number of contracts to manage and the potential interdependency of each. This approach may increase program office workload and risk management burden.

With "Agile" and "agility" becoming acquisition buzzwords, it is important to properly apply the Agile Values and Principles to each individual program. The Agile Manifesto was written by software developers primarily to address information technology platforms using commonly available and standardized computer components and applications. Military aircraft are built to unique, detailed specifications and operated by highly trained aircrew and maintenance personnel. Comparisons to Facebook and Apple can fail to appreciate the differences between a commercial product operating on your desk or in your hand and an aircraft carrying humans thousands of feet in the sky, at hundreds of miles an hour, in reconnaissance, transport and strike missions.

Certification authorities and independent test organizations may find it hard to meet their established processes and timelines given the pace of new software releases. Senior leaders responsible for these activities should embrace the Agile principles and challenge their organizations to develop their own practices to meet the need. Concepts such as open architecture and digital thread offer opportunities to review and approve modular designs across multiple platforms, minimizing the costs of system-level assessments.

It is often said that Agile is a team sport. As far as clichés go, this one gets it right. Changes are required from all participants in the acquisition process to reap the benefits of an Agile approach.

Smith is the Unmanned Aircraft Systems (UAS) Airworthiness Technical Expert for the Air Force Life Cycle Management Center, Engineering Directorate, Airworthiness Office, at Wright-Patterson Air Force Base in Ohio. He has more than 10 years of experience in UAS and Intelligence, Surveillance and Reconnaissance, working in the Air Force Research Laboratory and the MQ-9 Reaper program office. The views expressed are his own and do not necessarily reflect those of the U.S. Air Force, the Special Operations Command, or the Department of Defense.

The author can be contacted at andrew.smith.56@us.af.mil.