U.S. Department of Justice
Federal Bureau of Prisons
Program Statement
OPI: PRD
NUMBER: 1210.23
DATE: 8/21/2002
SUBJECT: Management Control and Program Review Manual
PURPOSE AND SCOPE
To prescribe policies, standards, and procedures to establish, maintain, evaluate, and improve Bureau internal systems of control; to prescribe policies, procedures, and responsibilities for management of the accreditation process and participation in American Correctional Association (ACA) sponsored activities; and to ensure the Bureau responds in a timely, accurate, and concise manner to all inquiries, surveys, requests, and audits from external audit authorities, and that findings and recommendations from external audits are effectively reviewed and constructively applied.
These provisions apply to all Bureau organizational components and installations, including divisions, regions, institutions, community corrections offices, and the oversight function for private contract facilities.
In accordance with 31 U.S.C. § 3512(b)(1), Executive Agency Accounting Systems, and OMB Circular A-123, Internal Control Systems, each Federal Government agency is required to establish a continuous process for evaluating and improving its internal control systems.
Each DOJ agency head must annually submit an assurance statement to the Attorney General certifying that:
- the agency is operating effectively, efficiently, and in compliance with applicable regulations; and
- existing systems of internal control adequately protect the agency’s resources against fraud, waste, abuse, and mismanagement.
The assurance statement must also identify any systemwide control weaknesses, and actions taken or planned to correct the weaknesses in an appropriate and timely manner.
For the agency head to make this certification, there must be a systematic approach to assessing operations and programs at all organizational levels. This is achieved through a management control program that includes a system for assessing risks and testing the adequacy of internal controls for all program and administrative areas. This Program Statement (PS) outlines the requirements and responsibilities for implementing an effective management control program.
It also establishes, for all levels of the Bureau, a system of assurance that, taken as a whole, permits the Director to submit the required annual certification to the Attorney General.
The Bureau enhances the effective management of its institutions through the Commission on Accreditation for Corrections (CAC) accreditation based on standards approved jointly by the ACA and the CAC.
Many external audit authorities have an ongoing interest in Bureau programs and operations for regulatory oversight, as well as inquiries reflecting the public’s interests. Such external evaluations can be useful to validate the Bureau’s own internal system of checks and balances, particularly operational and program reviews.
A revised Management Control and Program Review Technical Reference Manual is also being issued to supplement this PS. It contains all relevant samples for report preparation. The union may request any documents related to this policy and such requests will be considered under 5 USC 7114.
SUMMARY OF CHANGES
The following are highlights of this revised Program Statement:
- A Table of Contents has been added.
- Language and criteria for CAC and ACA sponsored activities have been included.
- Language and criteria for Liaison with External Audit Authorities have been included.
- In Chapter 2, the process of conducting regional office program reviews has been revised.
- In Chapter 2, the institution follow-up review time frame has been changed to 120 – 150 calendar days.
- New language has been added in Chapter 2 concerning the program review final report, noting those deficiencies that need a separate, specific response from the Chief Executive Officer (CEO).
- In Chapter 2, the separate Data Sheet is eliminated from the program review final reports, and that information is now included in the Background Information section of the reports.
- In Chapter 2, the department head is included in the pre-assessment phone contacts.
- In Chapter 2, the criteria for program review ratings are further defined.
- In Chapter 3, the Community Corrections Regional Administrator (CCRA) is designated as the review authority for operational reviews.
- In Chapter 3, the working papers and associated correspondence for Community Corrections Management (CCM) operational reviews must be maintained in the CCM office where the operational review takes place.
- In Chapter 3, verbiage for operational review cycles for deficient and “at risk” program reviews is added.
- The entire PS is revised to include electronic submission of correspondence to/from the review sites.
- Chapter 4 is added to include the Management Assessment process.
- Chapter 5 is added to include Correctional Standards and Accreditation policy.
- Chapter 6 is added to include Liaison With External Audit Authorities policy.
- A Definitions of Terms summary is included as Attachment A.
- The retention period for program review reports is reduced from eight years to five years.
PROGRAM OBJECTIVES
The expected results of this program are:
- Programs will comply with applicable laws, regulations, policies, and procedures. This includes compliance with the Master Agreement and 5 USC 71 (Labor Management Statute).
- Recommended solutions to problems will be provided to program managers.
- Weaknesses in financial or administrative controls will be identified and corrected.
- Assessments will be made as to how well programs are achieving desired results.
- Efficient management practices will be promoted.
- Program performance will be reported accurately in management and statistical reports.
- The quality of programs will be improved.
- Fraud, waste, abuse, mismanagement, and illegal acts will be prevented, detected, and reported.
- Noteworthy accomplishments of programs will be identified and their recognition and replication promoted (internal benchmarking).
- Useful performance indicators will be established to monitor vital programs and operations.
- Each facility will be accredited through ACA within 24 months of activation.
- Each previously accredited facility seeking reaccreditation will be reaccredited through the Intensive Reaccreditation Process (IRP).
- Participation of employees throughout the Bureau in ACA sponsored activities will be equitable.
- All proposed CAC Standards will be centrally reviewed for consistency and impact on Bureau operations.
- The Bureau will respond in a timely, accurate, and concise manner to all audits, inquiries, surveys, and requests from audit sources external to the Bureau. All responses will be centrally coordinated and routed through the Program Analysis Section (PAS) prior to submission to the external audit authority.
- All staff interviewed or otherwise contacted by an external audit authority will respond with honesty, credibility, integrity, and within the scope of their knowledge and responsibilities.
- Formal responses to draft or final reports from external audit authorities will be signed by the Director. The PAS is responsible for the coordination and submission of these responses.
- The Bureau will use the results of external audits in a timely manner to learn, develop, and improve its programs and operations.
DIRECTIVES AFFECTED
Directives Rescinded
PS 1210.20 Management Control and Program Review (11/24/99)
PS 1210.19 Liaison with External Audit Authorities (8/28/98)
PS 1290.04 Correctional Standards and Accreditation (4/26/00)
Directives Referenced
PS 1351.04 Release of Information (12/5/96)
PS 4220.05 Design and Construction Procedures (2/15/00)
TRM 1202.02 Management Control and Program Review (11/24/99)
DOJ Order 2860.3A Implementation of the Federal Managers’ Financial Integrity Act (P.L. 97-255), 1986
DOJ Order 2900.5A Responsibilities for the Detection of Waste, Fraud, and Error in Department of Justice Programs, 1986
DOJ Order 2900.6A Audit Follow-Up and Resolution Policy, 1989
OMB Circular A-76 Performance of Commercial Activities, 1983
OMB Circular A-123 Management Accountability and Control, 6/21/95
GAO, Government Auditing Standards, 1994
GAO, Standards for Internal Controls in the Federal Government, 1983
Executive Order 12805, 57 Federal Register 20627 (1992) “Integrity and Efficiency in Federal Programs”
STANDARDS REFERENCED
American Correctional Association 3rd Edition Standards for Adult Correctional Institutions: 3-4003, 3-4012, 3-4018, 3-4019, 3-4036, and 3-4104
American Correctional Association 3rd Edition Standards for Adult Local Detention Facilities: 3-ALDF-1A-03, 3-ALDF-1A-17, 3-ALDF-1A-18, and 3-ALDF-1B-09
American Correctional Association 2nd Edition Standards for Administration of Correctional Agencies: 2-CO-1A-06, 2-CO-1A-07, 2-CO-1A-08, 2-CO-1A-09, 2-CO-1A-20, 2-CO-1A-21, 2-CO-1A-22, 2-CO-1A-23, and 2-CO-1B-07
REQUIREMENTS
Program review is an essential management control tool because it provides timely and essential information on program performance.
Management Controls
The Bureau will maintain a system of management controls that enables managers to:
- Assess program performance regularly.
- Determine the degree of risk.
- Test the adequacy of internal controls.
- Adjust operations to conform with requirements and achieve desired results.
Program Review
The Bureau subjects each of its programs to a thorough examination by organizationally independent, trained Bureau reviewers who are specialists in the program area being reviewed.
Standards for Program Review
The GAO has issued standards for all government audits, which are referred to as “generally accepted government auditing standards.” These standards cover the following areas:
- Auditor qualifications.
- Auditor independence.
- Due professional care or audit quality, including sound professional judgment and standards relating to examination, evaluation, and reporting.
- Quality control, including internal and external reviews.
The Bureau will strive for close adherence to the Standards for Audit of Government Organizations, Programs, Activities, and Functions. To ensure compliance, the Bureau has developed a quality assurance program that provides for continuous evaluation of the program review process. Results are used to prepare the Annual Assurance Report to the Attorney General.
This provides assurance of consistent and effective implementation of the Federal Managers’ Financial Integrity Act (FMFIA) and OMB Circular A-123, Internal Control Systems.
Bureau reviewers are required to assign an overall program performance rating based upon the review’s results. This assists the Executive Staff in making individual and systemwide resource needs determinations.
MANAGEMENT CONTROL SYSTEM
The basic components of management control are: assessing, planning, testing, monitoring, analyzing, and correcting. A brief overview of these components follows, including the “system of assurance” requirements incorporated into each level of the organization and at each stage of the process.
Assessing
For a system of management control to be effective, an in-depth and realistic assessment of all programs is required to determine the degree of “risk” or the need for improvement and to plan a program review system for each specific program or functional area. This is accomplished through a management assessment (described in Chapters 1 and 4), whereby program managers examine each important process or activity cycle of the program from start to finish.
Planning
Periodic management assessments provide a forum in which program managers view their program’s strengths and weaknesses. Areas of weakness are discussed, and action plans are developed to implement good internal controls and ensure improvement. Assistant and regional directors certify through their annual assurance letter to the Director that examination of those processes considered most at risk is included in the program review guidelines (PRGs) and that strategic plans have been developed to bring about needed improvement.
Testing/Program Review
Normally, Bureau reviewers conduct reviews, studies, etc., based on the annual program review schedule and within the scope of PRGs. However, if the review is in response to a specific event or special emphasis issue, it may require developing new program review objectives and instructions. In any event, all program reviews must conform to “generally accepted government auditing standards” and this PS’ provisions.
The reviewer-in-charge (RIC) for the program review certifies that, within the scope of the review and except for deficiencies cited, there is reasonable assurance that programs comply with applicable regulations and policies, and internal control systems are effective (detailed procedures for conducting a program review are covered in Chapter 2).
Monitoring
Program monitoring is an extension of the Testing/Program Review component. Monitoring on a continuous or periodic basis (weekly, quarterly, etc.) allows staff to:
- correct problems before they get out of hand,
- track strategic goal accomplishments,
- communicate to other Bureau staff,
- follow-up on actions called for in past program reviews, and
- prepare for upcoming reviews.
Bureau staff at each level of the organization (institution, regional office, Central Office, etc.) establish ways of monitoring the well-being of their respective programs and, in particular, the programs’ vital functions. Management indicators that are linked to program review objectives help the manager define information sources and criteria used for this monitoring.
Analyzing Program Review Findings
At least annually, program managers analyze the results of all reviews, special studies, trend data, and management indicators. Based on this analysis, the PRGs may be updated and reissued.
Additionally, each regional and assistant director prepares a certification letter to the Director stating that control systems for those programs, functional areas, or installations under his or her jurisdiction are operating effectively, except as noted. Wardens make a similar certification to their respective regional directors. The Director, in turn, provides such assurance to the Attorney General no later than October 31 each year.
Correcting
The essence of management control is the action that adjusts operations to conform with requirements. Prior to a program review’s closure, the CEO must give assurance that internal control systems are in place to prevent recurrence of the problems. Such assurance can be obtained through various reviews and monitoring systems (see Chapter 2 for details).
In addition, the appropriate program managers must track actions to correct systemwide problems to ensure scheduled corrective action is being taken, and action is appropriate to improve the situation. Corrective actions may include:
- Development of new or modified PRGs.
- Plans for special studies or reviews.
- Improvement in training programs.
- Changes in policy.
- Monitoring the accomplishment of strategic action plans, etc.
Strategic Management Cycle
A “holistic” approach has been incorporated into the Bureau’s system of management, wherein information from the following sources is used:
- Management assessments.
- Operational reviews.
- Program reviews.
- Social climate surveys.
- Institution character profiles.
- Other information sources (GAO, OIG, new legislative regulations, etc.).
- Information analysis and synthesis (Program Summary Reports, etc.).
- Policy development.
- Formulation of strategic plans and goals.
All of these areas are interdependent and collectively form what is known as a “strategic management cycle.” It is intended that strategic planning be a continuous process, and that the use of review findings, management indicators, and strategic planning objectives/action steps be closely interrelated.
Issues identified through the program review process are developed into strategic issues to ensure that long-term corrective action is fully implemented. Furthermore, analyzing a program review assists program administrators in developing PRGs that ensure high-quality evaluations.
RESPONSIBILITIES
The following is an outline of the responsibilities involved in the management control and program review system. It is understood that all staff are responsible for compliance with the Master Agreement (or Central Office Agreement) and 5 USC 71. Specific internal control reporting requirements are described in Chapter 1 of this PS.
Director
The Director submits an assurance statement to the Attorney General at the end of each fiscal year certifying that programs are operating effectively and in accordance with applicable law, and that systems of internal control are adequate to protect resources. Material weaknesses and significant concerns in the Bureau’s systems of controls will be identified in the Management Control Plan, including a plan for correction.
The Director approves/signs the responses to final external audit reports.
Assistant Directors
The assistant directors will:
- Determine the need for special reviews or studies in program areas and ensure necessary reviews are conducted accordingly.
- Ensure the results of program reviews, management indicators, management assessments, and other reviews and studies throughout the year are analyzed to determine whether there is a pattern of noncompliance or lack of controls in division programs.
- Ensure appropriate strategic plans are developed to address and correct weaknesses.
- Update and reissue PRGs for division programs, in conjunction with the Program Review Division (PRD) senior deputy assistant director (SDAD), based on the analysis mentioned above, including the program area’s management indicators for program review objectives.
- Prepare a certification letter to the Director annually, attesting to the adequacy of internal controls in division programs and summarizing major systemwide concerns or weaknesses needing corrective action.
- Ensure policies and procedures issued from all divisions’ programs include reference and language relating to applicable ACA standards.
- Provide expert opinion on proposed ACA standards changes.
- Ensure their respective divisions are fully responsive to requests from external audit activities.
Senior Deputy Assistant Director, PRD
The PRD SDAD is the designated internal control officer for the Bureau. OMB directs that a senior official be given responsibility for coordinating the agencywide effort to comply with the Federal Managers’ Financial Integrity Act (P.L. 97-255). This official also ensures the agency’s methods of assessing the adequacy of internal controls are consistent with this Act’s provisions.
The PRD SDAD not only has oversight authority for the Bureau’s program review program, but also:
- Serves as the review authority for all program reviews.
- Issues an annual program review schedule for all programs and ensures timeliness of program review schedules.
- Develops and updates program review policy.
- Provides program and operational review skills training and technical assistance to reviewers.
- Monitors all reviews and review materials related to the conduct of program reviews, conducts on-site evaluations of reviewers, and provides assistance to ensure program reviews are conducted in compliance with policy and auditing standards.
- Reviews program review objectives and guidelines for completeness and general adherence to accepted formats prescribed in policy.
- Provides systematic analysis and feedback to all levels of the agency related to program reviews.
- Assesses the program review program’s overall effectiveness through a variety of indicators that include the ACA Intensive Reaccreditation Process (IRP) and an annual operational review of PRD.
- Makes recommendations to the Director for improvements in Management Control and Program Review.
- Provides periodic training in management control and the program review process to Bureau managers.
- Ensures the Bureau components and staff cooperate with and respond to all external audit agencies.
- Ensures Executive Staff are kept informed of all external audit activities.
- Serves as the review authority for correspondence with external audit authorities.
- Determines the affected Bureau component(s) upon receipt of external audit notifications.
Regional Directors
Regional directors will:
- Ensure CEOs and regional administrators are fully responsive to program review findings and institutions close program reviews in a timely manner.
- Determine the need for special reviews or studies in specific program areas and ensure necessary reviews are conducted.
- Prepare an annual certification letter to the Director attesting to the adequacy of internal controls in regional programs.
- Ensure strategic issues are developed for regional strategic plans and develop corrective actions to address noncompliance and lack of controls.
- Ensure ACA standards are complied with by assigning a regional ACA manager to provide oversight.
- Ensure CEOs are compliant with their responsibilities related to the ACA accreditation program.
Wardens
The Wardens will:
- Provide full support and cooperation to the reviewers, including freedom of access to all property, records, employees, and inmates.
- Ensure operational reviews of each functional area in the institution are conducted within the time frames established in Chapter 3.
- Provide timely initiation and completion of appropriate corrective action to enable the program review’s closure within prescribed time frames.
- Certify that adequate controls have been implemented or improved to avoid recurrence of deficiencies (see the Management Control and Program Review TRM for sample).
- Provide feedback to regional administrators on their respective discipline guidelines to ensure guidelines adequately measure both the program’s performance and its vital functions.
- Identify issues to be incorporated into the institution’s strategic planning process at least annually; and, when appropriate, establish action plans to address operational and program review findings. Report quarterly on major developments and/or major problems and provide the plans for solving the identified problems.
- Annually prepare a certification letter to the regional director attesting to the adequacy of institution internal controls (see Management Control and Program Review TRM for sample).
- Ensure the institutions’ policies, procedures, and practices are in substantial compliance with the applicable ACA standards during the accreditation period.
- Notify the regional director and the PRD SDAD of unannounced arrivals from external audit authorities.
Central/Regional Office Administrators
Central/regional office administrators will:
- Ensure management assessments are completed within time frames specified in Chapter 1 of this PS.
- Monitor trends and develop strategic plans to address emerging problem areas as part of program evaluation.
- Ensure information from program reviews, management indicators, management assessments, and other studies is analyzed to determine whether there is a pattern of noncompliance or lack of controls in the programs.
- Mentor and train institution department heads to conduct high quality operational review programs and provide feedback on the results of those reviews.
- Identify strategic issues for Central/regional strategic plans and develop corrective actions to address noncompliance and lack of controls as discussed in subsection (3).
Planning and Analysis Administrator
The planning and analysis administrator will:
- Ensure branch staff are responsive to requests from external audit activities in a timely manner.
- Notify the PRD SDAD of external audit activities.
- Assist with the determination of affected Bureau component(s) upon receipt of notifications.
Program Review Branch (PRB)
PRB staff and administrators will:
- Conduct program reviews for all disciplines.
- Assess how well programs are achieving desired results.
- Coordinate management assessments of each discipline.
- Assist in identifying vital functions.
- Develop review schedule and participant list.
- Co-author review guidelines.
Program Analysis Section (PAS)
PAS analysts will:
- Coordinate an analysis of reviews to determine trends and patterns that are both discipline-specific and cross-disciplinary in nature.
- Assist program administrators and managers with the development and use of management indicators and other informational tools.
- Provide support for the Bureau’s competition advocate by providing analysis of information required for decisions related to competitive procurement. The competition advocate seeks to enhance deficit reduction, avoid wasteful spending, and accrue savings to the Bureau through various competitive strategies which are designed to reduce contract costs.
- Organize the Year-End Management Control report for the Director, which is forwarded to the Attorney General.
- Serve as a liaison for the Bureau’s contacts with external audit agencies such as GAO and OIG.
- Facilitate interaction between external audit authorities and Bureau staff.
- Assist with the determination of affected Bureau component(s) upon receipt of notifications.
- Schedule and arrange all entrance/interim/exit conferences.
- Coordinate all Bureau responses for draft and final reports.
- Monitor closed audits to ensure appropriate and adequate corrective action(s) continue.
- Maintain a permanent file of external audit reports and related correspondence.
- Respond to inquiries from staff of any organizational component contacting the PAS for clarification or assistance with any questions or concerns regarding external audit activities.
- Notify the PRD SDAD immediately of any issues identified during the course of an audit that may generate unusual public concern or be of interest to the media.
Strategic Management Section (SMS)
SMS evaluators will:
- Coordinate the strategic planning process.
- Coordinate all ACA-related activities.
- Coordinate Bureau responses and input to DOJ related to requirements under the Government Performance and Results Act.
- Coordinate the Bureau’s descriptive input and component performance reporting figures for submission for the DOJ’s Annual Accountability Report.
- Develop and monitor baselines for reengineering initiatives approved by Executive Staff.
Accreditation Managers
Bureau Accreditation Managers
The Bureau accreditation managers are assigned to the PRD’s SMS. This office is responsible for all agencywide accreditation activities, including but not limited to:
- Serving as the contracting officer’s technical representative for all contracts between the Bureau and the ACA.
- Preparing directives regarding the ACA and the CAC.
- Reviewing all PSs and Change Notices to ensure appropriate use of ACA standards language and ACA citations prior to publication.
- Reviewing all PRGs to ensure appropriate ACA citations prior to publication.
- Providing technical assistance and training in the accreditation process.
- Coordinating accreditation activities for the Central Office.
Central Office Accreditation Managers
The Central Office division accreditation managers are:
- Designated by the division’s assistant director.
- Responsible for maintaining operational and program review files to document compliance with ACA standards.
- Responsible for facilitating Central Office reaccreditation.
Regional Accreditation Managers
The regional accreditation managers are the points of contact for information regarding accreditation and provide oversight for all accreditation activities within the region. Regional accreditation managers are encouraged to attend ACA conferences and the related training offered by both the SMS and the ACA. The SMS provides funding for participation in training and related activities.
Institution Level Accreditation Managers
The Warden appoints institution accreditation managers to coordinate accreditation matters for the institution. The institution accreditation manager is encouraged to attend training sessions offered by the SMS and the ACA. Training is offered in conjunction with an ACA conference. For those institutions seeking initial accreditation or reaccreditation, SMS provides funding for participation in training and related activities. The institution accreditation manager:
- Chairs the institution accreditation committee while preparing for initial accreditation.
- Coordinates all accreditation related activities, including maintaining program and operational review files to document ongoing compliance with ACA standards for reaccreditation, at the institution.
/s/
Kathleen Hawk Sawyer, Director
DEVELOPING AN INTERNAL CONTROL PROGRAM
INTRODUCTION
The Federal Managers’ Financial Integrity Act (P.L. 97-255), passed in 1982, mandated that all Federal agencies develop an internal control program to prevent waste, loss, unauthorized use, or misappropriation of funds, property, or other assets. This Act reinforces the requirement that individual managers are responsible for the successful operation of controls in the programs they manage.
OMB Circular A-123 prescribes the policies and standards to be followed in establishing, maintaining, reviewing, and reporting on internal controls. Additionally, GAO has provided standards to be followed in carrying out the internal control process.
In practical terms, this Act requires the Bureau to apply and review its methods of internal control and report the results annually to the Attorney General.
STRATEGIC MANAGEMENT CYCLE
A strategic management cycle has been developed that incorporates the concept of continuous planning through:
- Management assessments.
- Operational reviews.
- Program reviews.
- Social climate surveys.
- Institution character profiles.
- Other information sources (GAO, OIG, new legislative regulations, etc.).
- Information analysis and synthesis (Program Summary Reports, etc.).
- Policy development.
- Formulation of strategic plans and goals.
Managers at all organization levels will use these events to gather, monitor, analyze, and synthesize information that will aid them in assessing their respective programs.
MANAGEMENT ASSESSMENT
A management assessment is a systematic method of assessing the strengths and weaknesses of a particular program/activity and developing monitoring tools to improve those areas. Furthermore, it provides the opportunity for the identification of strategic issues that may ultimately become part of the program’s or Bureau’s strategic plan. (See Chapter 4)
STRATEGIC ISSUES
Strategic issues arise from a variety of sources, internally (Executive Staff, management assessments, etc.) and externally (Congress, Department of Justice, etc.). These issues are then reviewed by the Executive Staff for possible inclusion in the Bureau’s strategic plan. The Executive Staff also determines which Bureau issues, if any, are reported to the Department of Justice as a material weakness or significant concern (refer to Section 5 for an explanation of these).
Strategic planning requires a high level of staff involvement, and the Bureau encourages staff at all levels to have input into the national strategic planning process. Staff who are performing the work best understand what is required to accomplish it. Additionally, when staff are involved in determining what needs to be performed, they are more committed to accomplishing the planned actions.
MATERIAL WEAKNESSES/SIGNIFICANT CONCERN
Strategic issues that have impact outside the Bureau may be referred to the Executive Staff for review. If the Executive Staff agrees, the issue will be reported to the Department of Justice through the management control plan. The management control plan identifies material weaknesses and significant concerns, and details corrective actions and target dates for completing those actions.
The criteria for material weaknesses and significant concerns are:
Material Weakness Criteria
- Significantly impairs the fulfillment of an agency or component’s mission.
- Deprives the public of needed services.
- Violates statutory and regulatory requirements.
- Significantly weakens safeguards against waste, loss, unauthorized use, or misappropriation of funds, property, or other assets.
- Results in a conflict of interest.
- Merits the attention of the agency head/senior management, the Executive Office of the President, or the relevant Congressional oversight committee.
- Is of such significance that its omission from the report could reflect adversely on the management integrity of the agency.
Significant Concern
- Is a control deficiency of sufficient importance and Bureauwide impact to be reported to the Attorney General.
- Could develop into a material weakness if not corrected.
ANNUAL ASSURANCE STATEMENTS
Each year, the Director is required to submit an assurance statement to the Attorney General stating that the Bureau’s systems of internal control are operating as intended by the FMFIA.
FMFIA requires that each Federal agency establish, maintain, evaluate, improve, and report on internal controls in its program and administrative areas. All levels of management are involved in ensuring the adequacy of internal controls.
By September 15 each year, Wardens will submit assurance statements to their respective regional directors. The statement will indicate whether existing and new program activities at the site location are being managed effectively and efficiently to achieve the agency’s goals. The Wardens will provide reasonable assurance that government resources are protected against fraud, mismanagement, or misappropriation. There is no requirement for assurance statements at the institution department level, or for bargaining unit staff to sign such statements, when they relate solely to program reviews.
By October 1 of each year, assistant and regional directors will submit an assurance statement to the Director with a copy to the PRD SDAD.
CONDUCTING A PROGRAM REVIEW
OVERVIEW
All program reviews must conform to the standards for auditing established in the Government Auditing Standards and the provisions of this PS. Planning, conducting, and analyzing program review results should be done within the context of a system of management control.
Requirements (Extent, Frequency)
Each program or operation at each Bureau installation will be reviewed comprehensively in accordance with published PRGs. Institution, community corrections (field), regional transitional drug abuse treatment (TDAT), oversight function of privatized facilities, and Central Office programs that receive a superior or good rating are to be reviewed on a three-year basis. Regional program areas (with the exception of TDAT) that receive superior or good ratings are to be reviewed every five years.
Programs that receive acceptable ratings are to be reviewed on a two-year basis, and programs receiving deficient ratings are to be reviewed at 18-month intervals. “At risk” programs are to be reviewed upon request for closure. New institutions will be reviewed beginning 18 to 24 months after activation.
Regional office program reviews for those disciplines without an operational function will be accomplished from the Central Office via phone interviews and paper review. On-site regional office program reviews will be conducted for those disciplines with an operational function (Financial Management, HRM, Computer Services, Community Corrections, Facilities Management, and ISM).
The PRD SDAD must approve exceptions to this review cycle.
Program Reviews
This PS’ provisions apply to reviews conducted in a variety of situations. These reviews are intended to determine:
- compliance with applicable regulations and policies,
- adequacy of internal controls, and
- the effectiveness, efficiency, and quality of programs and operations.
Selection of Field Staff for Program Review Teams
The use of field participants as program reviewers is a cost-effective practice that supports the program review process and enhances the staff member’s professional development. Nominations for discipline experts are requested by the review authority from institutions and other field locations annually. These requests are made to the CEO. Bargaining unit members selected as discipline experts may request not to participate. Management shall consider such requests. Employees will be notified that their name is being submitted for consideration.
The PRD SDAD selects nominated staff for program review teams. These team selections are based primarily on cost effectiveness for travel to the review site and any special skills that might be required for the review. The team assignments are included in the annual program review schedule that is distributed prior to the beginning of each fiscal year.
If, at any time after distribution of the program review schedule, a team member’s duty station needs to remove an assigned participant from a program review, that team member’s CEO must submit a request for the removal via BOPNet GroupWise to the PRD SDAD.
Reviewer-In-Charge (RIC)
Each program review must have one RIC, who is appointed or approved by the PRD SDAD. The RIC will report findings and must ensure:
- Reviews are conducted in accordance with this PS’ provisions.
- Program review objectives are met within the scope of the review plan.
- Findings and recommendations are presented in a written report.
- Working papers adequately support review findings.
- Team members (reviewers) receive appropriate guidance and supervision.
- An overall rating is provided as part of each program review.
- Appropriate management officials are kept fully advised of the review’s results.
The RIC also serves as on-site liaison and monitor of the ACA auditor during IRP audits.
Due Professional Care
Due professional care must be used in conducting the review and preparing reports. This includes:
- Using good judgment in conducting the review, assessing the findings, and preparing the report.
- Following up on findings from previous reviews to determine whether appropriate corrective actions have been taken.
- Adhering to time frames prescribed by policy.
- Ensuring sensitive information is safeguarded.
Scope of the Review
The extent and focus of the review, as well as reporting any impairments to its effectiveness and integrity, are governed by the following provisions:
No Constraints
Reviewers must attempt to remain within the scope of the specific review objectives for efficient use of resources and to help focus their attention. However, they are not constrained from examining other areas based on the evidence being examined or observations made at the review site.
Reviewer Access
Personnel at the review site must:
- grant reviewers access to all documents that need to be examined,
- permit reviewers to interview employees and inmates who are reasonably available, and
- allow reviewers to inspect all areas and items of government property.
Scope Impairments
If factors restrict the scope of the review, limit the reviewer’s access, or interfere with the reviewer’s ability to form objective opinions and conclusions, the RIC will attempt to resolve the problem informally. Failing that, the RIC will report the problem to the PRD SDAD. The RIC will document impediments in the working papers.
Phases of the Program Review
There are five interrelated phases to any review:
- preparation,
- examination,
- evaluation,
- reporting, and
- follow-up.
There are standards, principles, and procedures for each phase and all reviewers must have a complete understanding of these. The five phases are not mutually exclusive, nor does one phase follow directly after another.
- Preparation. Collecting and assessing data prior to arrival at the review site to help focus on the program review objectives.
- Examination. Collecting evidence, usually at the review site, which includes determining whether the evidence is sufficient, reliable, and relevant.
- Evaluation. Assessing the evidence for deficiencies or need for improvement, and organizing the evidence into the elements of a finding.
- Reporting. Developing findings for presentation at closeout and in writing via the final report.
- Follow-up. Evaluating the facility’s response, monitoring corrective action, seeking resolution of any disagreements, and obtaining closure of the review.
PREPARATION FOR THE REVIEW
This section describes the requirements of the review’s preparation phase. It encompasses all the work and data gathering prior to arrival at the review site. Adequate preparation is important to ensure the program review results satisfy the review objectives (Chapter 1). The following represents the steps that are involved in preparing for the on-site examination.
Data Collection and Pre-Assessment
The reviewer will assess the situation at the specific review site prior to arrival by obtaining and reviewing all pertinent data, including management indicators. This information and the reviewer’s written assessment of it represent the first working papers collected or prepared for the program review. These papers (or a synopsis) will be placed in the review file for reference. Results of this pre-assessment may necessitate adjustments to the program review objectives. The pre-assessment will include:
Phone/E-Mail Contacts
The RIC will contact the department head(s), associate warden, warden, regional administrator, and Central Office administrator(s) to gather any pertinent information.
Events
Recent events, such as a major incident, new department head, or change in mission, will be taken into consideration.
Trends
Workload and performance data will be reviewed to determine any recent trends. The data might include:
- number and nature of inmate incidents,
- staff vacancies and turnover,
- minority hiring,
- recognition awards,
- accidents,
- staff and inmate grievances,
- investigations,
- inmate disciplinary actions,
- class waiting lists,
- course completions,
- inmates employed,
- medical duty status,
- custody levels,
- security level versus crowding, and
- staffing.
Other Significant Data
Other information sources, such as KI/SSS, external agency reports (GAO, OIG, ACA, etc.) will be reviewed.
Past Program/Operational Reviews
Review any recent program/operational reviews of the site and the status of pending corrective actions.
Developing a Site Plan for the Review Site
The RIC will develop a brief written Program Review Site Plan for the specific review site. The plan will include:
- A summary of the pre-assessment and where deficiencies might be expected based on what has been found in the background information and other indicators.
- The general scope of the program review including the specific guidelines to be used and prior review ratings.
- Review dates, suggested team members, reviewer days, cost containment information, and other logistical information.
- Comments from the CEO, department head(s), associate warden, regional administrator, and Central Office administrator(s).
The site plan will be in the form of a memorandum from the RIC to the review authority for approval. If unusual conditions exist, the RIC will meet with the review authority to discuss the planned review.
Notifying the CEO
The review authority will send official written notification via BOPNet GroupWise to the review site CEO at least 30 calendar days prior to the review. The CEO will provide a copy of this to the local union president.
Contents
The notification will contain:
- dates of the review;
- names, titles, and duty stations of the RIC and reviewers;
- scope of the review and program area(s);
- type of review;
- special focus areas, if any;
- program review objectives if different from those published for the program;
- requests for advance materials; and
- a request that the CEO respond with anything he or she would like the review team to take into consideration. Upon receiving this notice, the local union president may submit any items of concern to the RIC.
Unannounced Program Reviews
The review authority reserves the right to conduct reviews without prior notification if deemed necessary to achieve reasonable assurance that a site/program is operating in accordance with applicable law and policy, and property and resources are efficiently used and adequately safeguarded.
Intensive Reaccreditation Audits
When program reviews also serve to accomplish the IRP process, the review authority will notify the CEO that the review team will be accompanied by an ACA auditor.
EXAMINATION
The examination phase involves the data collection, interviews, and observations conducted as part of the review process. The following section outlines the steps, procedures, principles, and tools required in this phase of the review.
Organization and Supervision
Organizing the Program Review Work
Prior to beginning the work, the RIC will meet with program review team members and brief them on the plan, including the division of labor, time frames, objectives, and review and sampling techniques. The review is to be organized to ensure no unnecessary demands are placed on institution staff. In the case of an IRP, the RIC is to include the ACA auditor in this briefing and explain the auditor’s role in the program review process.
Giving Due Consideration
The department head must be afforded the opportunity to be fully involved in the review activities. The RIC is to inform the department head and staff that all comments which might alter findings and recommendations or provide information concerning the cause of a deficiency will be fully investigated and given due consideration. The reviewers must work with the department head and staff to find causes and solutions.
Lines of Communication
The RIC is to arrange with the department head precisely how reviewer requests for information and feedback on concerns will be handled. The RIC is to meet daily with the appropriate management staff such as the department head and associate warden to discuss progress and preliminary findings. The CEO is encouraged to participate in the daily closeouts to be fully apprised of the findings.
Supervising the Program Review Team
Proper supervision of team members must be exercised from the beginning of the review through final closeout.
Evidence
During the examination phase, information is discovered and gathered. This is considered evidence that will support the conclusions contained in the final report.
Types of Evidence
Evidence may be categorized as one of the following:
Physical (direct observation of people, property or processes)
This is considered the most dependable type of evidence, and is essential in determining the adequacy of internal controls. Reviewers will allow sufficient time during the review to observe all important procedures actually in operation and determine their efficiency and effectiveness.
Testimonial (interviews)
While extremely valuable, this is considered the least dependable type of evidence, and information thus obtained requires corroboration before it can be used in support of a finding.
Documentary (files, records, invoices, etc.)
This is an excellent method of verifying the reliability of evidence gained through other methods; however, reviewers should not spend an inordinate amount of time reviewing files and records to the exclusion of observation, interviews, and analysis.
Analytical (developed by making judgments about other forms of evidence through computations, reasoning, comparison, etc.)
This is used to conduct staff complement analyses, calculate vacancy rates, etc. Reviewers will allow sufficient time to conduct such analyses. A well-developed finding and a well-written program review report should contain the results of numerous analyses to give the reader a better perspective.
Standards of Evidence
Evidence must meet three standards to be considered in the program review findings. It must be sufficient, competent, and relevant.
Sufficient
There must be enough factual and convincing evidence to lead a knowledgeable, reasonable person who is not an expert in the program area to the same conclusion as the reviewer. Determining the adequacy of evidence requires judgment, especially when there is conflicting evidence. Sufficient evidence is needed to back up the conclusion. Sampling sizes for examinations, observations, and interviews will be sufficient to give the reviewer reasonable assurance that adequate controls are in place.
Competent/Reliable
The evidence must be reliable and the best that can be obtained using reasonable program review methods. If there is any reason to question its validity or completeness, additional measures must be taken to authenticate it.
Relevant
The evidence must be linked to the program review objectives and have a logical, sensible relationship to the issue being proved or disproved.
Serious or Unusual Problems
There may be situations when problems are so pervasive or serious that reviewers will find it necessary to halt the review or drastically redirect its work.
Approval
The RIC will discuss the matter with the CEO and the review authority. The review authority makes the final decision on whether the program review should be halted or redirected.
Sufficient Evidence for Report
Before a review can be halted, the RIC must ensure sufficient evidence has been gathered to prepare a report of major findings if required. Ending a review or redirecting it prior to completing the entire scope of the review does not necessarily relieve the RIC from preparing a program review report and documenting the reasons in accordance with this PS’ provisions.
Fraud, Abuse, and Illegal Acts
Reviewers must be alert to situations or transactions that could be indicative of fraud, abuse, and illegal acts. Any such evidence or information will be reported to the CEO and review authority immediately for possible referral to the Office of Internal Affairs and follow-up investigation. Similar accusations concerning the CEO must be reported directly to the review authority.
The review authority is to determine whether the review team should continue with the program review or suspend the review until the investigation is completed.
Working Papers
Standard
A written record of the reviewers’ work is to be retained in the form of working papers. It should be possible for a knowledgeable person, not involved with the program review, to review the working papers and arrive at the same general conclusions as the reviewers.
Purpose
Working papers provide a systematic record of the work done by a reviewer or team and contain the information and evidence necessary to support the findings and recommendations presented in the program review report.
Types
Working papers are of various types. Technically, all the information reviewed in preparing for the program review is considered working papers, as are notes taken during interviews, observations, photographs, and reviews of documents. (This includes computer printouts, logs, files, etc.) In addition, any analyses or computations done to support findings are part of the working papers. The reviewers may also develop checklists or worksheets to facilitate the review work and ensure it is conducted efficiently. Checklists are developed from discipline guidelines and may focus on areas of special emphasis. The checklists are also shared with regional and Central Office administrators to ensure they are aware of the checklists’ use and focus.
Program Review File
A file is to be established for each program review, with the original working papers placed in the file. The department head or associate warden must initial each deficiency and advised item marked in the working papers, acknowledging their review of the evidence. The working papers are to be kept in a manner that facilitates their use and prevents loss or mutilation. The file’s contents are to be identified clearly (review site, program area, dates).
Retention
The review authority is to retain program review working papers for at least five years from the ending date of the review. PRD will retain working papers electronically. Working paper files maintained prior to the implementation of electronic filing will be retained for one complete review cycle in the PRD files, and the remaining records are to be archived in accordance with government regulations. Working papers must be destroyed at the end of this period unless specific reasons are presented for their retention.
Team Members’ Papers
Only one program review file and set of supporting documents are to be maintained. The RIC is to collect all working papers from team members for inclusion.
Format
Each reviewer has a personal style of recording and collecting information. This PS is not intended to impose a rigid, standard format for working papers, nor should the development of working papers impose extra work for the reviewer disproportionate to the value of the evidence. However, at a minimum, working papers are to be:
- Complete and accurate to provide proper support for the program review conclusions.
- Clear, concise, and understandable.
- Legible and neat, even though usually handwritten.
- Restricted to matters that are materially important and relevant to the program review objectives.
Forms
In addition to the preprinted checklists and interview sheets that reviewers normally use, it is suggested that each reviewer have a supply of working paper forms to record information collected during the review.
Program Review Interviews
This is a crucial part of the examination phase of a program review. There are three types of interviews: entrance interview with CEO, discovery/confirmation interviews with staff and inmates, and exit interview/closeout with CEO.
Entrance Interview
Upon arrival at the review site, the reviewers will meet with the CEO and any other personnel the CEO may wish to have present.
Purpose
At this meeting, the RIC will define the scope of the review and briefly describe how it will be organized to cause as little disruption to the facility as possible. The RIC will also clarify how the CEO prefers the team respond to an institution emergency.
Cluster
If the review is being conducted in conjunction with other discipline reviews, each RIC will attend the entrance interview.
Closeout Schedule
A time for the daily closeouts must be established during this meeting. The final closeout time and date will be established later in the review week.
Discovery/Confirmation Interview
Normally, reviewers must interview a sufficient sample of staff and, depending upon the discipline, inmates, based on the program review objectives as well as on evidence discovered during the course of the review.
Furthermore, it is the RIC’s responsibility to conduct interviews of staff and inmates that measure the climate of the department being reviewed. This includes an interview with the local union president or his or her designee. The interviews seek information regarding safety/security, communications, staff and inmate morale, and staff responsiveness. This information is summarized and reported to the CEO prior to the final closeout.
It is inappropriate to use recording equipment in a program review interview. The reviewer will record significant information gathered based on notes taken and impressions. The interview outline and notes are considered part of the official working papers. The actual notes are considered confidential and will not be disclosed.
Daily and Final Close-Outs
Daily, each reviewer will discuss any apparent discrepancies with the person being reviewed at the time they are found. During the review week, the RIC will meet daily with the department head, associate warden, and Warden to review progress and discuss any deficiencies or findings. These closeouts provide the institution staff and the RIC an opportunity to discuss the review and clarify any issues raised during the course of the review.
At the conclusion of the review, the reviewers will meet with the CEO and any staff the CEO wishes to have present to apprise them of the results, including any significant findings, deficiencies, or significant lack of administrative controls.
A draft of the findings and a preliminary overall program rating will be given to the CEO prior to the conclusion of the closeout. If other major deficiencies are later discovered through review of working papers or additional discussions with other team members, the RIC will discuss them with the review authority and CEO prior to releasing the program review report.
If the final overall rating differs from the preliminary rating provided to the CEO during the closeout, the RIC will also discuss this with the CEO prior to releasing the program review report.
EVALUATION
The evaluation phase of a program review is ongoing from the time pre-assessment information is collected prior to arrival at the review site, through the examination and closeout, to the preparation of the program review report. The reviewers make judgments about every document examined, every interview conducted, and every observation made to determine if a piece of evidence may link or relate to other evidence gathered.
To emphasize its importance, the evaluation phase is presented as a separate phase and is focused on the work of the reviewers as they begin organizing evidence into findings, when appropriate. The evidence should have been assessed for its sufficiency, reliability, and relevance.
Purpose
During the evaluation phase, reviewers analyze evidence for indications of patterns, trends, interrelationships, common causes and effects of the problems on the program, and innovative methods to improve operations.
Organizing Evidence into Findings
To ensure evidence is presented in a manner that will be most useful to management, the evidence, if indicative of a serious problem, must be organized into a “finding” or series of findings.
Materiality
Determining the materiality of deficiencies and whether they need to be placed in the official report (rather than handled verbally or placed on the advised list) is a matter of the RIC’s judgment, based on available evidence, the extent of the problem, the risk to the program’s efficient and effective management, the program review objectives, etc. The following points provide some guidance when determining whether deficiencies represent a significant finding:
- Importance to the accomplishment of the mission and vital functions of the program, the institution, or the Bureau.
- Pervasiveness of the condition (isolated or widespread). A single example of a deficiency is normally not sufficient to support a broad conclusion or recommendation.
- Indication of fraud, waste, abuse, or illegal acts (or anything that might constitute a conduct issue).
- Extent of the deficiency (based on allowable deviation from what is expected).
- Importance to the maintenance of adequate controls, such as a pattern of small, related discrepancies, which by themselves would not warrant mention, but taken together could be detrimental to the program.
Commendations
As a result of the analysis of the evidence, reviewers may report that exceptional progress has been made in a program area or a solution has been implemented to resolve a significant problem.
Deficiencies
Reviewers may investigate and report on any significant problems, failings, weaknesses, and needs for improvement. The term “deficiency” is used to describe any such concern and includes, but is not limited to:
- Deviations from policy or regulation.
- Weaknesses in internal controls.
- Lack of quality controls.
- Failure to observe accepted standards of practice for a particular profession.
- Lack of operating efficiency.
- Failure to meet program objectives.
- Noncompliance with a mandatory ACA standard.
Elements of a Significant Finding
A well-developed significant finding contains the following elements:
Condition
What was found and the extent of the problem relative to the number of cases examined, interviews conducted, etc. There can be only one condition in a significant finding; however, a significant finding may be based on one or more deficiencies or needs for improvement.
These deficiencies can be combined into a single significant finding, if they are all related to the same activity and program review objective or if the cause and effect for each is approximately the same. The intent is that deficiencies are not listed as isolated, unprioritized events.
Example: Evidence (documentary, testimonial, physical, or analytical; it may include many noted problems): “Observed two unauthorized staff members enter the mailroom, door left open on one occasion, mail delivery not within 24 hours based on staff interviews, unusually large number of lost mail claims, high staff turnover in the mailroom.”
Condition (only one): “Lack of adequate controls in the operation of the mailroom.”
Criteria
What should be, based on policy, regulation, law, generally accepted practice, desirable administrative or internal controls, quality controls, program objectives, efficient operations, etc. The reviewer will be aware of policy compliance exemptions granted to the review site.
Effect
What effect the condition is already having or what will probably happen if the condition is not corrected; that is, how significant the finding is in terms of attainment of the program’s objectives and the review site’s mission. This is also known as the “materiality” of the condition.
Example (based on previous example): Result of condition: “unauthorized access, late delivery of mail, lost mail.”
Potential result if not corrected: “fraud involving inmate monies, loss of confidentiality of sensitive materials.”
Cause
Why the condition happened, if known. The condition is only the symptom; the RIC, after receiving input from the reviewer(s), must determine the underlying cause(s) of the condition, or at least some probable causes, to be of most benefit to management.
Example (based on previous example): Why did the condition happen? “probably because of high staff turnover, lack of adequate training, lack of adequate, detailed local procedures.”
Recommendations
This section details possible solutions to the significant finding. The recommendations should be attainable by the staff and take into consideration available staff and resources.
Example: “staff should review local procedures to ensure compliance with current policy; additional training should be provided for staff.”
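For illustration only, and not as part of this Program Statement, the relationship among these elements can be sketched as a simple data structure. The field names and the mailroom values below are drawn from the examples above where available (the criteria value is an assumption); the Python representation itself is introduced here purely as an aid.

# Illustrative sketch only -- not a prescribed Bureau format.
# Field names mirror the elements of a significant finding described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SignificantFinding:
    condition: str                    # exactly one condition per finding
    criteria: str                     # what should be (policy, regulation, standard)
    effect: str                       # actual or probable impact if not corrected
    cause: str                        # underlying reason(s), if known
    recommendations: List[str] = field(default_factory=list)
    supporting_deficiencies: List[str] = field(default_factory=list)

# Worked example built from the mailroom illustration above
mailroom_finding = SignificantFinding(
    condition="Lack of adequate controls in the operation of the mailroom.",
    criteria="Mailroom access and 24-hour mail delivery requirements (assumed criteria).",
    effect="Unauthorized access, late delivery of mail, lost mail.",
    cause="High staff turnover; lack of adequate training and local procedures.",
    recommendations=[
        "Review local procedures to ensure compliance with current policy.",
        "Provide additional training for staff.",
    ],
    supporting_deficiencies=[
        "Two unauthorized staff members observed entering the mailroom.",
        "Mail delivery not within 24 hours, based on staff interviews.",
    ],
)
print(mailroom_finding.condition)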
OVERALL RATING
Because of the amount of information derived from program review findings, the Executive Staff determined there was a need for a concise system of summarizing the results of program review reports. The assignment of an overall rating meets this need. The preliminary rating reflects the RIC’s overall judgment as to how well the program area’s mission and objectives are accomplished.
The rating is determined by a careful evaluation of how well the functions identified in the discipline guidelines are being performed. Further, the rating is a measure of the program’s performance and is not directly related to the program manager’s performance. The assignment of the rating is also intended to measure the program’s performance over time. The review authority assigns/approves the final rating. The following terms and definitions are used:
Superior
The program demonstrates exceptional effort and initiative, setting a standard for the discipline. The program is performing all vital functions in a manner that exceeds discipline national targets and goals. A history of strong internal controls exists resulting in zero or very minimal deficiencies, full compliance with all ACA mandatory standards, and no repeat deficiencies. In addition, the program demonstrates excellent teamwork, communication, and sense of ownership.
Good
The program vital function areas are sound. Internal controls are strong and there are zero or limited procedural deficiencies. Overall program performance reflects positive professional and technical expertise. The program is in full compliance with all ACA mandatory standards. Good teamwork, communication, and sense of ownership have allowed for positive initiatives. The program meets discipline targets and goals, and demonstrates growth and/or strengths.
Acceptable
This is the “baseline” for the rating system, and each program is assumed to be performing at this level at the beginning of the review. Although deficiencies may exist, they do not detract from the adequate accomplishment of the vital functions or compromise compliance with mandatory ACA standards. Internal controls are such that there are no performance breakdowns that would keep the program from continuing to accomplish the mission. The program will receive no higher than an acceptable rating when a significant finding(s) exists.
Deficient
One or more vital functions of the program are not being performed at an acceptable level. Internal controls are weak, thus allowing for serious deficiencies in one or more program areas. A program will receive no higher than a deficient rating when a repeat repeat deficiency(ies) exists, indicating a problem has occurred in the program area at least three times.
If a program demonstrates noncompliance with a mandatory ACA standard the program will receive no higher than a deficient rating. A program will receive no higher than a deficient rating when a significant finding(s) in a vital function area exists.
At Risk
The program is impaired to the point that it is not presently accomplishing its overall mission. Internal controls do not demonstrate substantial continued compliance and are not sufficient to reasonably assure acceptable performance can be expected in the future.
In arriving at these ratings, the discipline’s complexity or degree of difficulty is taken into consideration.
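For illustration only (this sketch is not part of the Program Statement), the rating caps described above can be summarized as follows. The function and parameter names are assumptions introduced for the example; the review authority’s judgment, including assignment of an “at risk” rating, is not captured by these caps.

# Illustrative sketch only -- the review authority assigns/approves the actual rating.
def maximum_allowable_rating(significant_finding: bool,
                             significant_finding_in_vital_function: bool,
                             repeat_repeat_deficiency: bool,
                             noncompliant_mandatory_aca_standard: bool) -> str:
    """Return the highest rating the caps described above would permit."""
    cap = "superior"
    if significant_finding:
        cap = "acceptable"        # a significant finding caps the rating at acceptable
    if (significant_finding_in_vital_function
            or repeat_repeat_deficiency
            or noncompliant_mandatory_aca_standard):
        cap = "deficient"         # these conditions cap the rating at deficient
    return cap

# Example: one significant finding outside a vital function, no other caps
print(maximum_allowable_rating(True, False, False, False))   # "acceptable"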
THE PROGRAM REVIEW REPORT
Written program review reports are required. The only official report to which the CEO must respond and take action is the one written and presented to the review authority for review and transmittal to the CEO. Because the system allows for challenges to deficiencies and significant findings, the program review report may only be considered final upon review closure. The timetables for this process are established within this PS.
Fairness and Accuracy
The reviewer will place deficiencies or noteworthy accomplishments into perspective and avoid exaggeration. Only information adequately supported by sufficient evidence in the working papers can be included in the report. This information must be reliable, sufficient, and logically presented to illustrate the impact or potential impact of the deficiency.
Critical comments will be presented in a balanced perspective, taking into consideration any unusual difficulties or circumstances the review site faces.
Clarity
Reports must be clear, concise, and substantive. Conclusions will be specific, not left to inference. Aside from department heads and program administrators, readers will have varying perspectives (institutional, regional, and systemwide) and may not have a background in the program area being reviewed. Therefore, technical terminology is to be avoided whenever possible.
Credit
The reviewer must give credit when institution management has already noted a problem and is taking steps to correct it or is actively searching for solutions. It should be noted that problems identified by technical assistance visits and recently conducted operational reviews may be listed as findings or deficiencies within the program review report if corrective action has not been taken, and/or controls have not been in place for a specified period (ordinarily six months) to ensure they are effective. Repeat significant findings and repeat deficiencies cited in the program review report will be based on findings from the prior program reviews.
Quality Assurance
The RIC is to establish and maintain a quality assurance program to provide reasonable assurance that program review work conforms with GAO auditing standards and with this PS.
Quality Control Review
The reviewer is to conduct a quality control review prior to submitting the final report to the review authority and must document for the file and within the report that the review was conducted.
Components
The RIC will ensure:
- Review findings are fully supported by sufficient, reliable, and relevant evidence rather than by evidence of minor deficiencies or examination of irrelevant or insignificant matters.
- Program review objectives have been met.
- Review team members were supervised properly and their work reviewed.
- Review findings can be traced to the working papers to ensure they are supported fully and documented, and that figures used in the report are accurate.
- Interim meetings have been held regularly with the department head and associate warden to keep them apprised.
Timeliness
Program review reports must be issued promptly in accord with this PS.
To the Review Authority
The RIC is to prepare the written report and submit it to the review authority within 30 calendar days after the end of the review.
Review by Review Authority
The review authority is to review the report to ensure compliance with the provisions of this PS and standards of auditing. Within 10 calendar days after the review authority receives it, the report is to be transmitted to the review site’s CEO electronically. A signed copy of the report is to be maintained in the working papers.
Distribution
Copies of the program review report and cover memorandum are to be routed electronically to the respective assistant director, regional director, CEO, regional program administrator, and Central Office program administrator.
Retention
The review authority is to retain the program review report for five years, in accord with the provisions of the National Archives and Records Administration, General Records Schedules (Number 22).
Release Provisions
The appropriate method for an outside party to request a program review report or related working papers, management assessment/risk analysis documentation, PRGs, or any other agency record of the Bureau is to make a request in writing to:
Director, Bureau of Prisons
Attention: Office of General Counsel
Freedom of Information Act/Privacy (FOIA/PA)
320 First Street NW
Washington DC 20534
The FOIA/PA Section will coordinate responses to requests for program review reports and related papers with PRD. A program review report or any related supporting evidence is not considered releasable until the review authority closes the review officially.
Separate Reports
If a separate report containing confidential information is being issued, this should be stated in the report and cover memorandum.
Reviewing by Exception
Reporting the results of a program review is governed by the principle of “reviewing by exception.” This principle is used throughout the auditing community; it means that if an area, component, or issue is within the scope of the program review and is not mentioned in the report, the reader can assume that no serious or significant deficiencies or need for improvement were found in that area. It is not necessary for the reviewer to recap every area examined during the review.
Program Review Report Format
The following format will be used for the program review report:
Cover Memorandum
Each report must be accompanied by a memorandum from the review authority to the review site CEO. The memorandum, usually no more than one or two pages, should indicate briefly:
- the scope of the review,
- the overall assessment,
- the number of significant findings, if any,
- the number of repeat significant findings, if any,
- the number of repeat deficiencies, if any, and
- the number of repeat repeat deficiencies, if any.
The memorandum will indicate specific response instructions concerning time requirements for responding to general comments, deficiencies, repeat deficiencies, and significant findings.
Program Reviewer Assurance Statement and Signature
This is a statement the RIC signs and dates, affirming that he or she has reasonable assurance that:
- The review was conducted in accordance with generally accepted government auditing standards.
- The findings of noncompliance with policy or inadequate controls contained in the program review report are supported by evidence that is sufficient and reliable.
- Findings of noteworthy accomplishments are supported by sufficient and reliable evidence.
- Within the scope of the review, the program is operating in accordance with applicable law and policy; and property and resources are used efficiently and safeguarded adequately, except for the deficiencies noted in the report and in the list of advised items that are supported and documented in the working papers.
The name, title, and duty station of the other members of the review team will be placed directly under the assurance statement.
Lack of Assurance
If conditions found during the review indicate widespread lack of policy compliance or inadequate administrative controls, thus preventing the RIC from making the assurance statement, the RIC must state and explain this clearly in this section of the report. It must also be emphasized in the review authority’s cover memorandum, and special follow-up measures will be outlined.
The RIC may also be prevented from making the assurance statement because the scope of the review was impaired, unlimited access was not granted, or some event caused the review team to leave the review incomplete through no fault of the reviewers or individuals under review. This must be explained in this section and in the cover memorandum.
Background
This is a brief statement of facts describing the review site, gender of population, operational review dates, staffing pattern, program description, personnel in charge, recent events, etc. This information will reflect the current information available during the review week.
General Comments
This section is open-ended and can be used for different purposes. It is not intended to be used for long lists of recommendations or suggestions to correct less important deficiencies that are not related to a significant finding. Such recommendations should be handled by giving the department head a separate list of items needing attention. Some purposes of this section include:
- Discussion of the rating for the review.
- Discussion of any issues that may require a specific response.
- Discussion of any issues and questions needing further study and consideration on a broader-based scale, such as possible changes to Bureau policy or training courses.
- Observation of areas not directly related to the program or discipline being reviewed.
- Summary of specific issues the review authority wants covered in every program review or in certain reviews.
- Response to the CEO’s request that a specific issue be examined.
- Discussion of any innovative practices that were observed during the review week.
Significant Findings
This section describes any significant findings based on the evidence gathered. The reader must be able to determine how the various deficiencies relate to one another and what impact the deficiencies are having or will have on the program.
Findings Format
Significant findings must be numbered and normally relate to a specific program review objective. They must follow this format:
Heading: Describes the program area or topic involved. It must be meaningful to the reader.
Condition and Effect: A brief one or two sentence opening labeled “Condition and Effect” that informs the reader what the basic condition is and what basic effect it is having on the operation (or probable effect it will have if not corrected).
Evidence Section: This is the heart of the finding and is labeled “Evidence.” It is a brief but persuasive presentation of the pertinent, important evidence. It will note the extent and significance of problems, measured against the criteria (what should be). It must be concise but informative, giving the reader the facts supporting the finding in an organized manner. Any deviations from policy, regulation, or ACA standards that have a direct relationship to the problem may be listed in this section or in “Other Deficiencies.”
Cause: This is the underlying reason that the condition exists. Common causes include lack of training, lack of resources, inattention or negligence, inadequate or unclear guidance/policy, poor physical plant, etc.
Recommendations: These are actions the RIC presents to the CEO to correct, or lessen the impact of, the conditions noted in the significant finding. All significant findings will include realistic recommendations. Reviewers will take the time needed to present recommendations that are clear, cost-effective, and address the conditions and causes.
Further Study
Every significant finding will have a corresponding recommendation; however, there may be situations when neither the cause nor the solution or recommendation is apparent. Then, the “recommendation” may be to study the problem further, perhaps at the regional or national level.
Workable Solutions
Various solutions will be discussed with the department head, regional administrator, associate warden, and, when appropriate, the person reviewed to ensure the solution (or series of options) eventually presented to the CEO at the closeout and in the written program review report will be realistic.
Interim Solutions
The reviewer will be alert to innovative procedures or ways to improve operations that can correct or at least partially correct the situation – even if the basic cause is lack of resources, staff, or space.
Deviations from Policy/Regulation
Although recommendations that require compliance with policy, regulations, or ACA standards are generally non-negotiable, a simple statement of compliance with policy is not adequate. The reviewer will specify the measures required to fully correct or improve the condition stated in the finding.
Repeat Significant Finding
A repeat significant finding is a finding listed on the current review that was also listed during a previous formal review. While a repeat significant finding occurs infrequently, it should be noted that it does not have to be a mirror image of the previous finding.
Different evidence may be used to indicate a component weakness that was found during the previous review. Repeat significant findings will be developed from the prior program reviews, not operational reviews.
Repeat Repeat Deficiencies
A list of current deficiencies also listed as deficiencies during the last program review and prior program review(s). The CEO will be instructed, in the review authority’s cover memorandum, to explain why corrective action was not taken or was not effective prior to the review and what specific controls will be implemented to ensure deficiencies do not recur.
Repeat Deficiencies
A list of current deficiencies also listed as deficiencies during the last program review. The CEO will be instructed, in the review authority’s cover memorandum, to explain why corrective action was not taken or was not effective prior to the review and what specific controls will be implemented to ensure deficiencies do not recur.
When several operations become a shared service, the deficiencies from each operation’s prior review will be considered as potential repeat deficiencies. The shared service review will not be considered a first time review.
Commendations
Programs, procedures, or management practices identified as innovative, which involve cost-effective use of existing resources and have potential applicability in other Bureau settings.
Other Deficiencies
This section lists problems or weaknesses the reviewer noted. The reviewer will include a one or two sentence summary of the problem and, if applicable, a reference to the applicable policy(ies), regulation(s), or ACA standard(s). Those deficiencies that need a separate, specific response from the review site will be noted as “response required.” During discussions with the department head, the reviewer must ensure the department head understands what action is required to remedy the situation.
Deficiencies or need for improvement not considered significant enough to be included in the program review report will be conveyed to the department head and documented in the working papers. The RIC will ensure the department head initials the working papers to verify advisement.
The RIC may also prepare a separate document known as the “Advised List,” listing issues not considered significant enough to warrant inclusion in any part of the program review report. This document will be distributed to the CEO, regional administrator, and department head; a copy will be placed in the official program review file with the working papers. Because the “Advised List” is not included in the program review report, no response is necessary.
PROGRAM REVIEW FOLLOW-UP
The follow-up phase begins immediately after the program review report is distributed and continues until the review authority closes the review officially.
Responsibilities
The responsibilities for program review follow-up are divided between the reviewer and the institution as follows:
Responsibilities of Reviewer
It is the RIC’s responsibility to keep the review authority informed as to the adequacy of the response and corrective actions taken by the institution. It is also the RIC’s responsibility to ensure the request for closure is made within established time frames, that review closure is warranted, and that a monitoring system is in place to follow up on “post-closure” long-term actions through the strategic planning process when applicable.
Responsibilities of Review Site
It is the responsibility of the review site’s CEO to respond to the review report in a timely manner, take appropriate actions to correct deficiencies and improve operations, and ensure adequate administrative controls and monitoring systems are in place to prevent the deficiencies from recurring. When applicable, long-term corrective action will be monitored through the strategic planning process. As a reminder, any corrective actions taken that affect working conditions of bargaining unit employees will be handled in accordance with the Master Agreement.
Responsibilities of Regional Program Administrator
Each discipline’s regional program administrator will monitor the implementation of corrective actions and placement of internal controls the CEO outlined in response to review findings. Furthermore, the regional administrator will work closely with the institution to develop strategic initiatives to address issues noted during the program review and the operational review.
Through the effective use of management indicators for vital functions and the strategic planning documents, the regional administrator should be able to assess the level of program performance from a distance and advise the department head on potential corrective action.
Response to Program Review Report
The CEO must respond to the review authority via BOPNet GroupWise (with electronic copies to the appropriate assistant/regional director) no later than 30 calendar days after receiving the report. The review authority must approve any exceptions (see the Management Control and Program Review TRM for a response sample). The CEO’s response must address:
Repeat Significant Findings
The CEO will provide a separate response to the Director through the regional director. The CEO must describe the measures and internal controls to be implemented to ensure the problem will not recur, as well as explain why the problem was not corrected from the prior review.
Repeat and Repeat Repeat Deficiencies
The CEO must describe the measures and internal controls that will be implemented to ensure the problem will not recur, as well as explain why the problem was not corrected from the prior review.
Other Deficiencies
The CEO must certify that all deficiencies listed in the program review report (including those involving significant findings) have been corrected. This can be a blanket statement with exceptions noted. If a specific response for a deficiency is requested in the program review report, the CEO must provide a separate response for the deficiency.
Normally, deficiencies involving deviations from policy or regulation are not negotiable. They must be corrected in a timely manner, unless budget constraints or other justifiable constraints preclude compliance.
Any constraints must be explained and a realistic time frame for correction must be specified using the strategic planning process. If corrective action requires longer than 30 calendar days, a strategic action plan will be developed for each area as part of the closure process. These action plans will be evaluated as part of the request for closure from the CEO.
If the program review included multiple disciplines, such as Human Resource (Employee Development, Personnel, and Affirmative Action), the response should include all disciplines and not be separated into different responses that are submitted at different times.
If there are constraints in resolving deficiencies involving a significant finding, the response to that finding will be referenced and the constraints discussed therein.
Significant Findings and Recommendations
The CEO is required to respond to recommendations relating to significant findings cited by the RIC, declaring agreement or disagreement.
Agreement
If the CEO is in agreement, the steps taken or planned to comply will be listed, with a time frame for resolution specified.
Disagreement
Through discussions during the program review between the RIC, the department head, associate warden, and, when appropriate, the person reviewed, potential for disagreement with findings or recommendations should be reduced. However, the CEO may wish to present in the review response justification why the recommended action cannot or should not be taken and alternative methods of correcting the problem or improving the program. The review authority will make the final decision to accept or reject the CEO’s response.
Non-Policy Based Criteria
A Bureau reviewer is an official representative of, and reports directly to, the review authority (PRD SDAD). If the reviewer has determined that, in his or her professional judgment, an action should be taken to correct a problem (e.g., implement internal controls) or improve a situation (even if the criteria against which the condition was measured are not contained in policy or regulation), and if the review authority agrees with this judgment, it is incumbent upon the CEO to take such action or present adequate justification as stated above under “Disagreement.”
General Comments
The CEO will also review other sections of the program review report (Cover Memorandum, Background, General Comments, etc.) to determine if issues have been raised that require a response. The CEO must respond to issues identified in the General Comments section of the report if a required response is indicated. The CEO has the option to disagree with the General Comments item, but a response is still required.
Review of Response
The RIC will review the CEO’s response to ensure it is complete and all deficiencies have been corrected or the action plan contains an acceptable time frame for corrections. If there is a disagreement between the reviewer and the CEO regarding any finding or recommendation, the matter will be presented to the review authority for a decision.
Notification
The review authority will notify the CEO in writing of the acceptance or rejection of the response within 20 calendar days of receipt.
Follow-Up Reporting
Included in the review authority’s response may be the requirement for any follow-up reporting measures (progress reports, plans of action) to be taken on the CEO’s part. The requirement for these reports is on a case-by-case basis and may be used when the time frame for corrective action is over a long period or the implementation of adequate internal controls is a concern.
Closure of the Program Review
Before the review authority can close a program review, several actions are required by the RIC and institution to provide the review authority with the necessary assurance.
Follow-up Review by Institution
Prior to seeking closure of the program review, the CEO will ensure a follow-up review is conducted to determine whether adequate internal controls are in place to prevent the problem(s) from recurring.
Responsibility
The appropriate associate warden or management official is responsible for ensuring the follow-up review is conducted.
Review Team
The associate warden may conduct the review personally or may head a review team. A local option might include appointing other institution department heads as members of the review team to provide cross-discipline training. Another local option is to include the department head or staff of the department in question on the review team. Consideration should be given to the workload of the staff assigned to the team.
Time Frame
The follow-up review should be conducted 120 – 150 calendar days after the last day of the program review. This allows sufficient time for internal controls that have been put in place as a result of the review to begin working.
Method
Each deficiency mentioned in the review report is to be examined to determine not only whether the deficiency has been corrected, but also whether adequate, cost-effective controls have been instituted to lessen the likelihood of recurrence. Such controls might include: an additional level of review, more frequent inspections, cross-checking systems, new written procedures, improved training, etc.
Any deficiency(ies) noted in the program review report that requires a separate, specific response from the review site must be examined.
With regard to a significant finding, the review team is to ensure the “condition” as well as the “cause” have been addressed and that staff have implemented the reviewer’s “recommendations.”
Report
The associate warden will prepare a report of the review team’s findings within 14 calendar days of the follow-up review date and send it via BOPNet GroupWise to the review authority (with electronic copies to the assistant director for the discipline reviewed and the regional director) under cover memorandum from the CEO.
The report will address all deficiencies noted in the program review report that require a separate, specific response, all repeat deficiencies, all repeat repeat deficiencies, and all significant findings, to include whether the controls put in place to correct weaknesses or deficiencies have been effective (see the Management Control and Program Review TRM for a Follow-up Review Report sample). This memorandum can also be used to request closure of the program review (see “Request for Closure”).
Certification
The associate warden’s certification of correction of the deficiencies and adequacy of controls will be included in or attached to the report.
Request for Closure
When the CEO is confident that all necessary actions have been taken, he or she must submit electronically a request for closure of the program review (see the Management Control and Program Review TRM for Request for Closure sample).
Time Frame
Normally, closure of program reviews will be within 180 calendar days after the last day of the program review. If the CEO is unable to request the review’s closure within this time frame due to extraordinary circumstances, he or she may submit via BOPNet GroupWise a request for an extension from the review authority.
Requirements
In the cover memorandum to the review authority, the CEO will certify that he or she has reasonable assurance that all deficiencies noted in the program review report have been corrected and needed improvements have been made (except where noted elsewhere in the response), and that adequate controls are in place to prevent recurrence. An electronic copy of the follow-up review report will accompany the request for closure.
Assurance/Closure
When the review authority has obtained reasonable assurance the deficiencies have been corrected, the review authority will notify the CEO electronically that the review is considered closed. Electronic copies of this notification will be sent to the appropriate assistant and regional directors and regional/Central Office administrator(s).
Exceptions
There are instances when limited resources or other restrictions preclude achieving full compliance within 180 calendar days. The review authority will consider such situations on a case-by-case basis. If the program is rated “at risk,” the CEO will determine when he or she is prepared to request closure. At that point, the CEO is to request closure through the regional director.
If the regional director concurs, the request is forwarded to the Director with a copy to the PRD SDAD. A full program review is then scheduled. If the situation is resolved fully or if the stated strategic plan to correct the problem over the long term is realistic and fully responsive to the review finding, the review can be closed. The review authority and regional administrator, however, must continue to monitor the progress against the established action plan through the strategic planning reporting system.
Assurance Methods
These include, but are not limited to: written assurance by the CEO that the follow-up review confirmed correction of all deficiencies; an on-site visit by the reviewer, a member of the review team, or a knowledgeable third party from the regional office or another facility; or a follow-up review directed by the review authority.
CONDUCTING AN OPERATIONAL REVIEW
OVERVIEW
The operational review is a local evaluation process that enables staff to closely evaluate the strengths and weaknesses of a program and take corrective action.
The operational review is conducted under the authority of the CEO of each installation or organizational component. At the institution level, the review authority is the Warden. At the region or division level, the regional director or the assistant director is designated as the review authority. The community corrections regional administrator (CCRA) is the review authority for operational reviews of Community Corrections offices. For operational reviews of Transitional Services and CCRAs, the regional director is the review authority.
As part of the Bureau’s management control program, each program at all organizational levels should conduct an operational review between 10 – 14 months from the week of the previous program review (including those programs receiving a deficient rating).
An additional operational review should be conducted 22 – 26 months from the week of the previous program review for those programs that receive good or superior ratings.
Regional program areas that receive superior or good ratings should also conduct two additional operational reviews at 34 to 38 and 46 to 50 months. An operational review is not required for those programs that receive an “at risk” rating. Newly activated institutions will conduct operational reviews within the first 12 months after formal activation (i.e., issuance of the Operations Memorandum (OM) indicating the site’s activation).
Apart from these requirements, an operational review may be conducted at any time to determine program effectiveness.
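As an illustrative aid only (not part of this Program Statement), the operational review schedule described above can be expressed as month windows measured from the week of the previous program review. The function and parameter names are assumptions introduced for the example, and the newly activated institution case is omitted.

# Illustrative sketch only -- scheduling remains the CEO's responsibility.
def operational_review_windows(prior_rating: str, regional_program: bool = False):
    """Return (earliest, latest) month windows for operational reviews,
    measured from the week of the previous program review."""
    if prior_rating == "at risk":
        return []                            # no operational review required
    windows = [(10, 14)]                     # all other programs, including deficient
    if prior_rating in ("good", "superior"):
        windows.append((22, 26))             # additional review for good/superior
        if regional_program:
            windows += [(34, 38), (46, 50)]  # regional program areas add two more
    return windows

print(operational_review_windows("superior", regional_program=True))
# [(10, 14), (22, 26), (34, 38), (46, 50)]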
By using this process effectively, weaknesses can be identified and corrected quickly through strategic planning. Action plans can be developed to ensure correction over time and the strengthening of the program. Further, the operational review process enables program managers to establish strong internal controls to ensure corrective action continues to be effective.
CONDUCTING AN OPERATIONAL REVIEW
An operational review includes the five phases of the program review process (preparation, examination, evaluation, reporting, and follow-up) discussed earlier in Chapter 2.
Responsibility
Responsibility for ensuring the operational review is conducted in accordance with policy rests with the appropriate associate warden, deputy regional director, or deputy assistant director. The CEO is the review authority for all operational reviews.
Members of Review Team
Selection of the review team RIC and the team’s membership is at the CEO’s discretion. The RIC should demonstrate good organizational and communication skills, and a sound working knowledge of the operational review process. There is no requirement that the RIC be the department head of the program being reviewed; the review team can be made up of staff from any department. Consideration should be given to the workload of the staff assigned to the team.
It is essential that some team members be subject matter experts to ensure a comprehensive review is conducted and informed decisions are made regarding the review findings. It is the RIC’s responsibility to ensure the operational review is conducted thoroughly and impartially and the review authority is advised of all findings.
Preparation
The review team will review the national PRGs and adjust them as necessary based on concerns and high-risk areas of the program as perceived by institution staff.
Staff from related departments will be included in a meeting(s) to enable the review team to take a “big picture” approach to the review – that is, looking at areas outside their own department that may affect, and be affected by, the program being reviewed. Through this process, a comprehensive review of institution operations can be made, improving the effectiveness of institution programs. Coordination for this interdepartmental meeting will be the responsibility of the associate warden, deputy regional director, or deputy assistant director.
A brief memo announcing the upcoming operational review will be prepared and forwarded to the CEO (see the Management Control and Program Review TRM for samples). For Community Corrections operational reviews, the memo announcing the upcoming operational review will be prepared and forwarded to the CCRA.
Examination and Evaluation of Evidence
In accordance with the standards of evidence described in Chapter 2, the operational review team is to conduct the review thoroughly and impartially. The RIC must examine the materiality of the evidence; the existence of deficiencies, significant findings, and repeat deficiencies or findings will be determined using the following criteria:
Deficiencies
Generally reflect a deviation from policy, a weakness in internal controls, or noncompliance with an ACA standard.
Significant Findings
Findings are generally composed of a series of related deficiencies that, taken together, constitute a failure of the program component. A significant finding can also be caused by a single event that results in program failure.
Repeat Findings/Deficiencies
A repeat is the result of the failure of internal controls that were developed to correct a noted deficiency. In determining if a repeat exists, the evidence does not have to be a mirror image of the prior evidence. It is only necessary that the same condition exists. Repeat deficiencies/findings can be written based on prior program or operational reviews.
Report
The associate warden, deputy regional director, or deputy assistant director will submit the complete results of this review to the CEO, who acts as review authority, with a copy to the regional director (institution review) and the PRD SDAD, within 30 calendar days after the review is completed (see the Management Control and Program Review TRM for an Operational Review Report sample). For Community Corrections reviews, the RIC is to submit complete results to the CCRA, who acts as the review authority, with a copy to the regional director.
Certification
The associate warden, deputy regional director, or deputy assistant director will certify that the operational review was comprehensive and conducted in accordance with policy. The certification is also to state that findings and conclusions are supported by evidence contained in the working papers, which are to be retained for review by the program review team during the next program review.
Working Papers
The department head or administrator of the program reviewed must retain the working papers for subsequent operational reviews, as well as the report, in an appropriately labeled file until the next scheduled program review has been conducted and a final report issued. During the next program review, the reviewers are to examine working papers from the operational review to determine that the review was comprehensive and that the adequacy of controls was assessed.
The effectiveness of corrective action will also be evaluated to serve as an indicator of the operational review program’s overall effectiveness. Working papers and associated correspondence for Community Corrections operational reviews will be maintained in CCM offices where the review takes place.
Closure of the Operational Review
The review authority will direct that a follow-up review be conducted to measure the effectiveness of corrective action. The follow-up review will be conducted 120 – 150 calendar days after the last day of the operational review. It will be under the associate warden’s supervision (institution reviews) and focus on areas of concern and deficiencies.
After the follow-up review is completed and it is determined that all controls are effective, the review authority can close the operational review. If there were no deficiencies or major concerns expressed or identified in the operational review report, no follow-up review is required, and the operational review may be considered officially closed.
Exemptions
The PRD SDAD may grant an exemption to the operational review process when justified by the CEO and the respective regional director or Central Office assistant director.
MANAGEMENT ASSESSMENT PROCESS
OVERVIEW
A management assessment is a systematic method of assessing the strengths and weaknesses of a particular program/activity. It provides the opportunity to assist program managers in identifying the systems of control needed to ensure performance and compliance with applicable policies, regulations, and ACA standards.
Program Review Guidelines (PRGs) are developed to measure performance in meeting the identified program objectives. An in-depth management assessment will be conducted every three years. These PRGs may be reviewed and changed prior to the full management assessment using the midstream procedures.
PURPOSE
The management assessment’s purpose is to examine each component of a program in order to determine:
- Degree of vulnerability of the program to fraud, waste, abuse, and mismanagement.
- Potential for serious problems if policy and regulations are not followed, or systems of internal controls are not adequate.
- Degree to which resources are being used efficiently to satisfy performance requirements.
- Areas or processes where the reviewers should concentrate their limited time and resources.
METHOD/COMPONENTS
Management assessments are conducted in a conference setting at the Central Office, and time is set aside exclusively for the assessments. The major components of a management assessment are:
- A review of past and current performance, using available management indicator data/analyses.
- An assessment of the program’s level of risk and need for improved systems of control by means of a structured review methodology (risk analysis).
- A review and incorporation of all current mandatory and nonmandatory standards assigned to the discipline. Guideline steps supporting ACA standards cannot be modified and/or removed unless the standard itself has been revised or deleted from the ACA standards manual or the nonmandatory step is rated as low risk in the risk analysis.
PARTICIPANTS
Management assessment teams consist of a total of 10 participants including the:
- Central Office administrator(s),
- regional administrator(s),
- warden(s),
- associate warden(s),
- institution department head(s), and
- a PRD senior reviewer.
A PRD evaluation specialist will facilitate the management assessment, and events will be recorded by a staff member the discipline selects. Any deviations or changes in regard to location or team size must be submitted for approval by the PRD SDAD and assistant director over the discipline.
PREPARATION
Prior to the management assessment, meetings will be conducted with the Central Office discipline administrator(s) and PRD staff to discuss current guideline steps and changes in policy or procedures which may impact the assessment process. PRD will also solicit input from all CEOs on any issues or concerns with the current guidelines.
Information will be gathered and assembled for distribution to all participants. The information will include:
- mission statement of the program,
- current PRGs and vital functions,
- definitions and terminology,
- CEO responses,
- deficiency trends and analyses (e.g., Quarterly Summary Reports and review surveys),
- GAO/OIG information, and
- applicable ACA standards.
CONDUCTING THE ASSESSMENT
The assessment is performed by identifying and reviewing each major area of responsibility/activity of the program to determine:
- Program objectives.
- Inherent risks (worst-case scenarios without controls in place).
- Procedures or systems of control and their adequacy (e.g., policy, regulations, and oversight).
- Actual risk to the program’s mission based on the controls in place to address the identified inherent risks.
- Review procedures needed to measure program performance and compliance with policy, regulation, and ACA standards.
ASSESSMENT RESULTS
Results of the management assessment include the development of PRGs and may also result in the identification of strategic issues, systems of control, and necessary changes in policy. Guideline steps are required for all high-risk processes (as identified in the risk analysis) and are recommended for all medium-risk processes.
Guidelines should be written clearly, granting the reviewer the opportunity to observe a program activity, review pertinent documentation, and/or interview appropriate staff. Guidelines should not be written as survey questions; they should be direct and substantive, stating exactly what the reviewer is to do.
It is equally important to indicate the sample size of items to be reviewed. The sample size specified should be sufficient to determine compliance but should not be excessive and lengthen the review process.
To facilitate the use of guidelines for operational and program reviews, a policy citation or regulation with the appropriate chapter or section will be ascribed following each review step.
IRP requires that all applicable ACA standards for each discipline be addressed in the program review process. Therefore, applicable ACA standards will be included in formulating guidelines during the management assessment process and should be ascribed following the policy citation or regulation.
FORMAT OF PRGs
The format for each PRG is prescribed in this PS. Each document will include the following standard statements regarding vital functions and ACA standards:
- During the management assessment, vital functions for (name the discipline) were identified as follows: (list the vital functions and number them). The guideline steps that measure or evaluate each vital function are identified in the left margin with the notation: (V-1), (V-2), (V-3), etc.
- The following ACA standards are referenced in the attached PRGs: (list the ACA standard numbers). Review guidelines that measure or evaluate compliance with ACA standards are identified with the appropriate ACA number following any policy citations. Mandatory ACA standards are identified by bold print.
PRG ROUTING PROCEDURES
The PRD facilitator is responsible for preparing and routing the draft PRGs developed during the management assessment. To ensure PRGs are submitted to the Office of National Policy Review for publication within 90 business days from the management assessment’s completion, the following routing procedures/time frames have been established.
Within seven business days after the management assessment, the initial draft will be routed to the discipline for review and assurance of appropriate policy citations and applicable ACA standards for each guideline step.
The discipline will review, finalize (policy citations and applicable ACA standards), and return the draft to the PRD facilitator no later than 30 business days from receipt of the initial draft.
Within a period not to exceed 50 business days:
- the final draft will be prepared and routed within PRD for review;
- the final draft will be submitted for approval/signature of the PRD SDAD;
- the PRD facilitator will meet with the discipline’s program administrator(s) for review and approval of any modifications resulting from PRD’s internal review; and
- the discipline’s program administrator will then submit the draft for the respective assistant director’s approval/signature.
Upon receiving the approved draft (signed by the discipline’s assistant director), the PRD facilitator will prepare and submit the approved draft to NPR for publication within three business days from receipt of the final document. Institutions will be notified prior to implementation of new guidelines.
COMPONENTS OF GUIDELINES
Program Objectives
Objectives should be clearly written and state the purpose of the program area/activity and the results or level of performance expected. For example, “to ensure all sentence computations are completed accurately to prevent untimely releases” addresses the level of performance expected (all/100 percent accuracy) and the expected results (prevent untimely releases). Vague objectives, such as “to enhance” or “to improve,” should not be used.
Background Statement
Under each objective will be a brief background statement indicating why this is a program review objective. For example, it may be a “high risk” area based on the management assessment, a life safety or statutory requirement, or an area that has consistently been a problem, such as overcrowding.
Program Review Steps
Directly under each program review objective and its background statement are the program review steps. The steps describe the work that is required to meet the program review objective. The steps should outline:
- the work to be done during the review,
- the specific documents to be examined,
- sampling techniques and sizes to be used,
- span of time to be reviewed,
- processes to be observed,
- persons to be interviewed, and
- purpose for the program review step.
The program review steps must be clear enough that a person who is not an expert in the program area or who is not an experienced reviewer can, with supervision, understand the program/operational review work that is required. Each review step must also cite the appropriate supporting reference.
An appropriate example would be: (PS 5500.03, CH 7, Sec 701) and ACA standards: 3-4023, 3-ALDF-4D-17. This specific citation will reduce the amount of time spent looking through policy when citing deficiencies, and it will enable line staff to become more familiar with specific policy requirements when preparing for or conducting an operational review or a program review. Assessing the adequacy of the evidence collected and organizing the evidence into findings remains the RIC’s responsibility.
The following is a sample format to be used in developing program review steps: Look at … (a specific activity, program, or program component) to determine … (whether specific objectives are being met or policy requirements are complied with …). Two examples of guidelines follow that involve a reviewer observing a program first-hand, reviewing documentation, and interviewing staff:
- Observe an actual team meeting to determine whether staff are developing a financial responsibility plan at initial classification and program reviews.
(PS 5500.03, CH 7, Sec 701)
ACA: 3-4023, 3-ALDF-4D-17
- Examine five percent (not to exceed 25) of the central files of cases identified as participating in the Inmate Financial Responsibility Program (IFRP) and review Attachments A and B to determine whether they are completed and in the central file.
(PS 5500.03, CH 8, Sec 7)
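As a worked illustration of the sampling rule in the second example above (the function name and case counts are hypothetical and not drawn from this Program Statement):

# Illustrative sketch only: five percent of identified IFRP cases, not to exceed 25.
def ifrp_sample_size(cases_identified: int) -> int:
    return min(round(0.05 * cases_identified), 25)

print(ifrp_sample_size(600))   # 5 percent of 600 = 30, capped at 25
print(ifrp_sample_size(300))   # 5 percent of 300 = 15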
MIDSTREAM REVISIONS
Midstream revisions to guidelines may be made at any time due to changes in policy, Executive Staff decisions, memorandums issued by assistant directors, etc., that occur prior to the three-year cycle for full management assessments. Once national policy has been published, the relevant program review guidelines will be modified, if applicable, to reflect policy changes that affect the guidelines. These changes will occur as soon as practicable, ordinarily within six months. A memorandum outlining the requested change(s), the purpose for the change(s), the suggested revision(s), and a contact person should be routed to the PRD SDAD and the assistant director for that discipline.
DOCUMENTATION
It is the PRD facilitator’s responsibility to ensure that necessary documentation of the assessment is maintained. The PRD facilitator must retain documentation in an appropriately labeled file until the next management assessment is completed (every three years).
CORRECTIONAL STANDARDS AND ACCREDITATION
INITIAL ACCREDITATION
Institutions will begin the initial accreditation process within 12 months of activation and request a Standards Compliance Audit with ACA within 24 months of activation. The Director can grant exceptions to this timetable when requested by the regional director through the PRD SDAD.
An institution representative is required to attend the formal panel hearing before the CAC for initial accreditation. The Warden’s presence at the initial accreditation panel hearing is strongly encouraged. Institutions are encouraged to send a representative to subsequent panel hearings for reaccreditation. The SMS will provide funding for the institution representative to attend the panel hearing for initial accreditation or reaccreditation.
At each institution the local union president will be afforded the opportunity to hold a seat on the ACA accreditation committee, in accordance with Art. 10 of the Master Agreement.
Fees for accreditation and reaccreditation are to be paid through the existing contract between the Bureau and ACA, which the Bureau accreditation manager manages.
Applicable Standards
Currently, three sets of ACA standards apply to Bureau operations:
- Standards for Adult Correctional Institutions 3rd Edition. These standards apply to Administrative Maximum Institutions, Penitentiaries, Federal Correctional Institutions, Federal Correctional Complexes, Federal Medical Centers, and Federal Prison Camps.
- Standards for Adult Local Detention Facilities 3rd Edition. These standards apply to Metropolitan Correctional Centers, Metropolitan Detention Centers, Federal Detention Centers, Jails, and the Federal Transportation Center.
- Standards for the Administration of Correctional Agencies. These standards apply to the Central Office.
The Warden will provide a copy to the local union president, upon request, of the current ACA standards applicable to the particular institution. This includes any subsequent supplements published. The national executive board of the Council of Prison Locals will be provided a copy of all current standards for all facilities, upon request.
Accreditation Timetable
The accreditation timetable begins with the OM activating the institution. Once activated, the institution has 12 months to enter Correspondent Status with ACA and begin the accreditation process. Within 12 months of entering Correspondent Status, the institution must be prepared to invite the visiting committee to the institution for the on-site compliance audit.
Steps in the initial accreditation process include:
- Approximately 12 months after the institution’s activation, the Bureau accreditation manager makes an on-site visit to explain the accreditation process to staff and meet with the accreditation committee. The purpose of this visit is to provide specific assistance regarding:
- the role of the committee,
- preparation of files, and
- what the institution can expect during the auditor’s visit.
- The Warden will request, through the regional director to the PRD SDAD, that the Bureau accreditation manager forward the Task Order initiating the accreditation to ACA.
- Upon the Task Order’s issuance, the institution will interact with both the Bureau accreditation manager and the ACA regional manager on issues related to the accreditation process. Copies of all correspondence will be forwarded to both Central Office and regional office accreditation managers. Both the Bureau and regional accreditation managers provide assistance as required.
- Normally, the correspondence phase of the accreditation process requires up to six months. The institution should complete the self-evaluation six months after entering Correspondent Status. The institution enters Candidate Status after completing the self-evaluation.
- Once the institution has entered Candidate Status, it requests an on-site visit by the Bureau accreditation manager to conduct the final in-house audit and tentatively schedule the visiting committee audit with ACA.
- Assuming the in-house audit’s successful completion, the institution accreditation manager, in conjunction with the Bureau accreditation manager, will confirm the visiting committee audit with ACA.
- After the visiting committee audit, the Bureau accreditation manager provides assistance to develop appeals or plans of action for those standards found in noncompliance.
- An institution representative and the Bureau accreditation manager attend the accreditation hearing before the CAC to represent the institution. The regional accreditation manager is encouraged to attend this hearing.
- The institution representative will attend the awards ceremony to receive the institution’s certificate.
- Retention or maintenance of ACA files beyond initial accreditation is not required. Reaccreditation is accomplished through the program review process.
Institutions not ready to pursue accreditation consistent with the above timeline must request a waiver from the Director through the regional director and the PRD SDAD. This waiver is to be submitted in the form of a memorandum and should state the reasons for the request to delay the initiation of the process. Generally, a request for a waiver must be initiated within 14 months of activation.
REACCREDITATION (IRP)
Ongoing Monitoring of Compliance
The continuing accreditation of Bureau institutions is accomplished through the Bureau’s own program review process. Central Office program managers must ensure that PSs and PRGs reflect all standards applicable to the Bureau. PRGs will include all mandatory standards and nonmandatory standards provided to the discipline prior to the management assessment.
Consistent with this PS, program and/or operational reviews will be conducted in each program area annually. Accreditation managers must document these reviews and make them available to ACA auditors upon request. ACA auditors will place a special emphasis on program review findings which are linked to mandatory standards.
Accreditation managers should ensure that corrective actions and related documentation demonstrate ongoing compliance with associated mandatory standards. Central Office division accreditation managers will document any program and/or operational reviews conducted within their divisions.
Institution accreditation managers are responsible for documenting program/operational reviews conducted locally.
ACA On-Site Monitoring
Since the IRP relies on the integrity of the program/operational review process, ACA auditors will accompany program reviewers during routine program reviews to confirm that integrity and to verify that all applicable standards are being addressed during operational and program reviews.
An ACA IRP on-site monitoring visit occurs once during an institution’s three-year period of accreditation. The Bureau accreditation manager will provide ACA with a current schedule of program reviews, and ACA will determine which program reviews will include an ACA monitor.
Approximately 60 days prior to the review, the PRD SDAD will notify the CEO and regional director of the upcoming ACA audit.
At the conclusion of the review, the RIC forwards a copy of the final report to the Bureau accreditation manager, who then forwards a copy to ACA.
Annual Certification
Each accredited institution and the Central Office must provide an annual certification report to the ACA documenting the following:
- Progress on action plans to address standards found in noncompliance during the initial audit.
- Identification of those program reviews conducted since the last annual report or hearing and the ratings received.
- New litigation regarding conditions of confinement initiated since the last annual report or hearing and its current status.
- An update on any significant occurrences at the institution since the last report or hearing (e.g., escapes, serious assaults, executive staff moves, mission change, etc.).
This report is due on the initial accreditation or reaccreditation anniversary date. The anniversary date is determined by the month (January or August) an institution appeared before the CAC. It should be routed through the Bureau accreditation manager in ample time (30 calendar days) to ensure it will be received in the ACA office prior to that date.
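For illustration only, the routing time frame can be computed directly from the anniversary date. The sketch below is a minimal example under stated assumptions: the specific anniversary date shown is hypothetical, since the anniversary is fixed only by the month (January or August) of the CAC hearing, and the function name is illustrative.

```python
from datetime import date, timedelta

def routing_deadline(anniversary: date) -> date:
    """Illustrative only: the date by which the annual certification report
    should be routed through the Bureau accreditation manager -- 30 calendar
    days before the accreditation anniversary date."""
    return anniversary - timedelta(days=30)

# Hypothetical example: an August anniversary date of August 15, 2003
# yields a routing date of July 16, 2003.
print(routing_deadline(date(2003, 8, 15)))  # 2003-07-16
```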
MONITORING VISITS
When an institution is required to receive an ACA monitoring visit, ACA will fund the visit’s cost. PRD can fund any related travel on the Bureau accreditation manager’s part only at the PRD SDAD’s discretion.
PARTICIPATION IN ACA-SPONSORED ACTIVITIES
Bureau staff participation in ACA activities and conferences is encouraged and valued. To ensure that participation is equitable, potential participants in national activities, who will be participating at government expense, must complete the Bureau’s Personnel Participation in ACA Activity form and forward it to the Bureau accreditation manager at least 30 calendar days prior to the scheduled event. Completing Attachment A is not required for participation in local events, such as meetings of ACA affiliates or ACA sponsored training.
PROPOSED ACA STANDARDS
Individuals wishing to submit new or revised standards for consideration by the Standards Committee must submit the proposed change(s) or addition(s) on the appropriate ACA form to the Bureau accreditation manager at least 60 days prior to the date the revision is due to ACA. The Bureau accreditation manager will ensure that the Bureau addresses all issues consistently and considers agency wide implications.
The PRD SDAD must approve all proposed change(s) or addition(s) and submit them to ACA.
BUREAU OF PRISONS PERSONNEL PARTICIPATION IN AMERICAN CORRECTIONAL ASSOCIATION ACTIVITY
Name:
Title:
Present Duty Station:
Telephone Number:
Event, Conference, Etc.:
Location:
List Elected or Appointed Offices Held in ACA or Affiliated Organizations:
Are You Presenting at a Workshop, Training Event, or Other Function in Conjunction with Your Attendance at this Event?
Yes/No:
If Yes, Please Identify:
List All ACA Activities You Have Participated in During the Last 12 Months:
LIAISON WITH EXTERNAL AUDIT AUTHORITIES
EXTERNAL AUDIT AUTHORITY
Any designated official from a government agency outside the Bureau organization authorized to conduct audits of a program, operation, practice, or procedure of a Bureau component. Examples are:
- the General Accounting Office,
- the Office of the Inspector General, Department of Justice,
- the General Services Administration, and
- the Office of Personnel Management.
This does not include interactions with the Office of Enforcement Operations. Such activities are coordinated through the Correctional Programs Division.
NOTIFICATION OF AN IMPENDING AUDIT
General Procedures
Official notification of an impending audit is directed to the Director with a copy to the PRD SDAD. Upon receipt in PRD, the affected component(s) will be determined by the PRD SDAD, the PRD planning and analysis administrator, and the PAS liaison.
Once the determination is made, PAS notifies the affected component(s). The PAS liaison schedules and arranges an entrance conference at a time and place agreeable to all parties.
Ordinarily, the entrance conference is held at the Central Office (PRD conference room), and its purpose is to identify the audit’s scope and parameters. When external audit authorities visit institutions, the institution will notify the local union president, when at liberty to do so, particularly when the visit may involve questioning bargaining unit staff.
After the entrance conference, the PAS liaison is to brief the PRD SDAD and, if appropriate, complete a written summary report of the meeting for the PRD SDAD.
On occasion, the PRD SDAD may direct the PAS liaison to schedule a meeting of Bureau staff prior to the entrance conference to ensure staff coordination, address concerns, and/or identify Bureau resource staff.
Direct Contact with a Component
If an external audit authority contacts a component directly, via telephone or mail, the component must notify the PAS no later than the close of business that day. Details outlining the review’s scope and specifics, along with any written notification, are to be forwarded to the PAS liaison immediately.
Unannounced Arrivals
Ordinarily, the Bureau receives prior notification of an external audit authority’s intent to review or inspect a particular site, but on rare occasions auditors may arrive unannounced. Should this occur, the CEO must request an entrance conference and contact the regional director and PRD SDAD for further guidance.
AUDIT CONTACT
Staff should exercise care in responding to auditor inquiries and should be directed to respond only to questions they are qualified to answer. Staff should not answer if they are tentative or uncertain of the answer.
If Bureau staff refer the auditor to another staff person better qualified to respond to the question, the PAS liaison must be advised of the referral. It is important that the PAS liaison keep track of the source(s) of auditors’ information in case differences arise. Also, the component must forward all written responses (via E-mail if short time frames are involved) to PAS to ensure appropriate quality assurance review and timely submission to the audit authority.
In addition, the PAS liaison must keep both the national and local impact of the audit in sharp focus. External auditors may uncover issues which require immediate corrective action or timely policy modifications. Likewise, issues which may generate unusual public concern or be of particular interest to the media can surface during an audit.
In such situations, the component’s CEO must inform the regional director/assistant director and the PRD SDAD immediately. Also, the Office of Public Affairs must be contacted when media interests are likely.
EXIT CONFERENCE
Upon completing the actual auditing process, the external audit authority notifies the PAS. The PAS liaison is to schedule an exit conference with the Bureau component(s) and the external audit authority to provide opportunities for:
- Bureau staff to learn about and clarify tentative findings;
- Bureau staff and auditors to share ideas relative to tentative findings; and
- Bureau staff to take immediate corrective measures if warranted.
At the completion of the exit conference, the PAS liaison will brief the PRD SDAD and, if appropriate, complete a written summary report of the meeting for the PRD SDAD.
RESPONSE REPORTS TO “DRAFT” AND “FINAL” AUDIT REPORTS
If either a draft or final audit report is forwarded directly to the organizational component rather than the PAS, the component must forward the original copy to the PRD SDAD immediately for coordination and action.
Ordinarily, an external audit authority will only accept comments for up to 30 calendar days prior to publishing its final report and findings. Only the PRD SDAD may make a request for an extension on the Bureau’s behalf.
RESPONSE REQUIREMENTS
Full cooperation with external audit authorities is required and expected. Any questions concerning the disclosure of specific documents or information should be referred to the PAS.
- The PRD SDAD assigns initial responses to audit reports to the component(s) being reviewed.
- The PAS is responsible for coordinating and submitting all Bureau responses for the proper signature.
- Each Bureau response will express appreciation for the external audit authority’s report and state the Bureau’s position on the audit findings, including any planned actions. If the Bureau concurs with the findings and the proposed corrective action(s) are appropriate, the response will concur and address all infractions, deficiencies, and/or violations the audit authority cited.
- The time frames for taking corrective action and the implementation of controls to prevent a problem’s recurrence are to be described. Any delay in corrective action or the implementation of controls must be explained fully in the response. If the Bureau suggests an alternate solution to the proposed corrective action(s), all relevant details, as stated above, are to be included in the response.
- If the Bureau disagrees with an audit finding and/or recommendation, the response will include the rationale for the Bureau’s position.
DEFINITIONS OF TERMS USED IN PROGRAM STATEMENT
ACA Regional Manager
The ACA staff member assigned to have oversight for BOP accreditation activities. He/she also functions as the primary contact person for BOP accreditation managers.
Actual Risk
The risk of a step reflects the negative impact the discipline will experience if the step is not in place. The actual risk is assessed during the risk-out portion of the management assessment process and is determined and rated (H, M, L) based on the adequacy of the controls in place to address the worst-case scenarios (inherent risk).
Advised Item
A weakness in a program/operation which indicates a problem may be developing but does not totally meet the standards of evidence for it to be a deficiency. While not included in the program review report, an advised item should be brought into full compliance during the follow-up review phase.
Assurance Statement
A certification that the program/operation/agency is operating effectively, efficiently, and in compliance with applicable regulations; and that existing systems of internal control adequately protect the agency’s resources against fraud, waste, abuse, and mismanagement. The assurance statement must also identify any systemwide control weaknesses and actions taken or planned to correct the weaknesses in an appropriate and timely manner.
Conclusions
Interpretations of the evidence stated in relationship to the objectives of the review.
Deficiency
Problems or weaknesses noted by the reviewer which are in need of correction. In its broadest sense, a deficiency includes any condition needing improvement. A deficiency can include noncompliance with policy/regulation; lack of adequate internal controls; poor or unprofessional practice; inefficient practice; ineffective results; poor quality; etc. A finding is usually based on several related deficiencies.
General Accounting Office (GAO)
The auditing arm of the Legislative Branch of the Federal Government given responsibility for monitoring the Executive Branch’s implementation of Congressional requirements. The GAO also sets minimum standards to be met in implementing Congressional mandates (e.g., internal control standards). The GAO is headed by the Comptroller General of the United States; however, its monitoring/auditing function encompasses programs as well as financial areas.
Impairments
Impediments to conducting a program review in accordance with standards, specifically GAO Standards relating to independence. These impediments can restrict the program review or interfere with a reviewer’s ability to form independent and objective opinions and conclusions. The impairment can be external, organizational, or personal.
External Impairments
Includes interference which limits or modifies the scope of a program review, restricts funds or other resources dedicated to the review organization, interferes with the assignment of personnel, overrules or influences the reviewer’s judgment as to the appropriate content of a report or selection of what is to be examined, and jeopardizes the reviewer’s continued employment with the agency or career advancement within the agency for reasons other than level of competence.
Organizational Impairments
Review organizations should report the results of reviews and be accountable to the head of the agency; reviewers should be removed from political pressures.
Personal Impairments
Include official, professional, personal, or financial relationships that might cause the reviewers to limit the extent of the inquiry, to limit disclosure, or to weaken findings in any way; preconceived ideas toward individuals or program objectives that could bias the review; previous involvement in a decision-making or management capacity that would affect current operations of the entity or program; biases that result from employment in, or loyalty to, a particular group or organization; and subsequent performance of a review by the same individual who, for example, had previously approved actions now under review or who maintained the official records now under review.
Inherent Risks
Worst-case scenarios that could prevent the accomplishment of the identified mission/objective.
Intensive Reaccreditation Process (IRP)
IRP combines the accreditation of Bureau institutions with the program review process to establish internal and external review of Bureau operations and programs.
Materiality
The significance of an item of information, given the circumstances, that allows a decision to be made.
Office of Management and Budget (OMB)
A function within the Executive Office of the President with responsibility for coordination of all management and budget activities of the Executive Branch of the Federal Government. OMB issues circulars which give guidance to other departments and agencies as to how Congressional acts are to be implemented and GAO Standards complied with (e.g., A-123 for internal controls, A-127 for accounting systems, A-130 for ADP systems, A-76 for contracting out activities, etc.).
Oversight Authority
The Bureau review function which is reserved for the Director, Bureau of Prisons, and is delegated to the PRD SDAD. Oversight includes the determination of whether reviews are conducted in accordance with the provisions of this program statement and government auditing standards.
Performance Indicators
A process or increment of measure used to define progress toward an objective, ideally expressed numerically. Indicators can be measured as a percentage from an established baseline or as a raw number. It is important to define clearly what should be measured within established time lines (performance targets); indicators should show progress as well as accomplishment of program objectives. These are tools managers use to determine whether program objectives (components) are being accomplished.
Program
A major activity or functional area of the Bureau, such as staffing, dental care, prisoner transportation, staff training. Several similar programs may be grouped to form a branch (in the Central Office) or a department (in the institution).
Program Review
Work done in reviewing compliance with laws, regulations and policy, adequacy of controls, efficiency of operations, and effectiveness in achieving program results – also referred to as a review, test, inspection and includes exploring and developing all pertinent and significant information necessary to properly consider, support, and present findings, conclusions, and recommendations. Work can go beyond determining compliance with regulation and policy (expanded scope review).
Program Review Closure
The act of formally closing the file on a program review, requiring reasonable assurance on the review authority’s part that any improvements and corrective actions recommended by the reviewers have been taken.
Program Review Guidelines (PRGs)
The PRGs are the “road maps” developed by each program area to provide guidance to those staff who will be conducting program/operational reviews. Guidelines are developed via management assessments and provide the reviewer with the necessary information needed during the review to accurately assess the performance/results of the program/activity.
Program Review Objectives
The major part of the guidelines document which outlines the focus (level of performance and results expected) of a particular program or activity during the review cycle.
Program Review Report
The medium through which an RIC communicates the results of the review.
Program Review Schedule
An annual schedule of individual reviews to be conducted during a fiscal year.
Program Review Steps
These are the instructions placed directly under each specific objective which outline, in detail, the specific documents to be examined, sampling techniques to be used, span of time to be reviewed, analytical work to be done, observations to be made, persons to be interviewed, interview questions to be asked, etc. These steps must be detailed enough that they will be understandable by assistant or trainee reviewers who are included on the team primarily for on-the-job training purposes.
Recommendations
The courses of action specified in the report to correct problem areas and to improve operations. The suggested course of action can be based on deviations from policy as well as other deficiencies or need for improvement.
Repeat Deficiency
A deficiency that was also listed as a deficiency during the last program review. A repeat deficiency is the result of the failure of internal controls that were developed to correct a noted deficiency. In determining if a repeat exists, the evidence does not have to be a mirror image of the prior evidence.
Reviewer
A qualified, trained employee who conducts program reviews on behalf of the PRD SDAD.
Reviewer Access
The assurance that the reviewers will have complete access to all records, property, operations, personnel, and inmates during a program review.
Review Authority
The Bureau official under whom the program review is carried out and to whom the RIC reports. This official must be a member of the Bureau’s Executive Staff. In its broadest sense, the term review authority encompasses the official program review function of the Bureau delegated by the Director to assistant directors and regional directors.
Reviewer-In-Charge (RIC)
The reviewer that heads the program review team and reports directly to the review authority.
Risk Analysis
An intensive review of each component’s vulnerability in carrying out its mission or stated goals. This is accomplished by balancing the probability of failure against controls in place, thus rating the actual risk or potential damage which could occur.
Significant Finding
A pattern of events or single event normally linked to a program review objective that indicates a deficiency in an organization or organizational element. A finding is usually based on several related deficiencies. This determination is based on the sound professional judgment of the RIC.
Special Review
The examination of a particular subject area in more depth than accorded in a routine review. It may involve several different disciplines or programs (suicide prevention controls; crisis intervention effectiveness; SENTRY training, coordination and accuracy; A&O program effectiveness; etc.). This is still considered to be a program review and provisions of this program statement apply. This type of review usually requires a special set of objectives.
Strategic Management Cycle
The dynamic process of improving programs through gathering, analyzing, and using information, which leads to timely, effective, and continuous planning. The strategy is to merge the present with the future and knowledge with the commitment to improve.
Strategic Planning
The process the Bureau uses to identify local, regional, and national objectives that are critical to the accomplishment of the Bureau’s mission. This process also calls for the development of action plans and steps which identify required resources, set completion time limits, and specify the individuals responsible for completing the task.
Technical Assistance
In its broadest sense, technical assistance is a component of any review and the purpose is to improve operations. However, in the Bureau, program experts often visit institutions or offices solely to provide expert guidance in a specific, complex program area or a team of experts may be called in to assist institution staff after program reviewers have discovered serious deficiencies.
For this Program Statement’s purposes, technical assistance refers to a visit conducted for purposes other than a program review. Any summary reports of such a visit are prepared at the discretion of the regional or assistant director responsible for the visit.
Vital Functions
Those functions identified during the management assessment which must be performed to achieve at least a minimum level of successful performance. If controls are not in place to ensure current and future successful performance, the entire program is at risk and could result in failure to accomplish its mission. These areas are given special attention during reviews.
Working Papers
Documents that provide support for opinions, conclusions, and judgments and that aid in the conduct and review of the reviewer’s work. Working papers include the collection of schedules, papers, analyses, correspondence, and other material prepared or obtained prior to and during the program review. They are to be retained for a period of five years from the date of the review.