04/15/08     FY08 THIRD quarter Editorial updates (VOL.10) 8900.1 CHG 0

Volume 10 Air Transportation Oversight System

CHAPTER 1  General

Section 1  Air Transportation Oversight System Doctrine

10-1     Purpose. This section explains the underlying policy, concepts, and principles for the Air Transportation Oversight System (ATOS).

10-2     Statutory Authority. Title 49 of the United States Code (Title 49 U.S.C.) and Title 14 of the Code of Federal Regulations (14 CFR) provide the statutory and regulatory authority for ATOS, respectively. Title 49 U.S.C. is broad in scope and contains the codified provisions of the Federal Aviation Act of 1958, which prescribes the powers and authorities of the Federal Aviation Administration (FAA). Title 14 CFR is prescriptive in nature and contains the specific requirements to obtain an air carrier operating certificate and standards for conducting related operations. ATOS is not a separate safety standard and does not impose additional requirements on air carriers. ATOS imposes only requirements that are either explicit or implicit in the statute or the regulations. ATOS provides FAA inspectors with standardized protocols to evaluate air carrier programs required by regulations to be approved or accepted by the Administrator. The following requirements in 49 U.S.C. subtitle VII, Chapter 447, Safety Regulation, are particularly pertinent to ATOS.

A.     Title 49, Section 44702. Issuance of Certificates. “When issuing a certificate under this part, the Administrator shall consider the duty of an air carrier to provide service with the highest possible degree of safety in the public interest….”

B.     Title 49, Section 44705. Air Carrier Operating Certificates. “The Administrator of the Federal Aviation Administration shall issue an air carrier operating certificate to a person desiring to operate as an air carrier when the Administrator finds, after investigation, that the person properly and adequately is equipped and able to operate safely under this part and regulations and standards prescribed under this part.”

10-3     Policy Statement of the FAA as it Pertains to Promoting Aviation Safety for Air Carriers. ATOS is based on the explicit policy of the FAA, which states: “The FAA will pursue a regulatory policy, which recognizes the obligation of the air carrier to maintain the highest possible degree of safety.” ATOS implements FAA policy by providing safety controls (i.e., regulations and their application) over the business organizations and individuals that fall under FAA regulations. Under ATOS, FAA’s primary responsibilities are: (1) to verify that an air carrier is capable of operating safely and complies with the regulations and standards prescribed by the Administrator before issuing an air carrier operating certificate and before approving or accepting air carrier programs; (2) to re‑verify that an air carrier continues to meet regulatory requirements when environmental changes occur by conducting periodic reviews; and (3) to continually validate the performance of an air carrier’s approved and accepted programs for the purpose of continued operational safety.

10-4     ATOS Concepts and Principles. ATOS relies on the following concepts and principles:

A.     Definition of Safety and Risk. Safety is the state in which the risk of harm to persons or property damage is reduced to, and maintained at or below, an acceptable level through a continuing process of hazard identification and risk management. In this context, an air carrier’s duty to provide service with the highest degree of safety in the public interest means that the air carrier must identify hazards in its operating environment and manage associated risks. Similarly, an air carrier’s ability to manage risk is an important part of the FAA’s determination that the air carrier is equipped to operate safely under 49 U.S.C. and the regulations and standards prescribed under it.

C.     System Safety. Properly designed systems control hazards by eliminating or mitigating associated risks before they result in accidents or incidents. In an operational context, air carriers fulfill their duty to provide service with the highest degree of safety in the public interest by designing their operating systems to manage hazard‑related risks in their operating environments. These concepts are fundamental to ATOS.

D.    Safety Attributes. The key to safety lies in managing the quality of safety‑critical processes. This is a primary responsibility of an air carrier in meeting its regulatory obligations. ATOS employs six safety attributes to evaluate the design of air carrier operating systems:

1)      Procedures—Documented methods to accomplish a process.
2)      Controls—Checks and restraints designed into a process to ensure a desired result.
3)      Process Measures—Used to validate a process and identify problems or potential problems in order to correct them.
4)      Interfaces—Interactions between processes that must be managed in order to ensure desired outcomes.
5)      Responsibility—A clearly identifiable, qualified, and knowledgeable person who is accountable for the quality of a process.
6)      Authority—A clearly identifiable, qualified, and knowledgeable person who has the authority to set up and change a process.

E.     The attributes are not standards in and of themselves, but provide a structure for the tools used to collect data for principal inspectors so that they can make informed judgments about the design of an air carrier’s operating systems (1) before approving or accepting them when required to do so by the regulations, and (2) during recurring assessments for continued operational safety.
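The six safety attributes act as a recurring checklist applied to each element of an air carrier’s operating systems. As an illustration only (the data model, field names, and sample findings below are hypothetical, not part of ATOS automation), the attribute structure could be sketched as:

```python
from dataclasses import dataclass

# The six ATOS safety attributes, as listed in paragraph 10-4D.
ATTRIBUTES = (
    "Procedures", "Controls", "Process Measures",
    "Interfaces", "Responsibility", "Authority",
)

@dataclass
class AttributeFinding:
    attribute: str   # one of ATTRIBUTES
    element: str     # air carrier element assessed, e.g. "1.3.1 Maintenance Program"
    satisfied: bool  # inspector's judgment for this attribute
    note: str = ""   # supporting observation

def design_gaps(findings):
    """Return the attributes not yet satisfied for an element's design assessment."""
    return [f.attribute for f in findings if not f.satisfied]

findings = [
    AttributeFinding("Procedures", "1.3.1 Maintenance Program", True),
    AttributeFinding("Controls", "1.3.1 Maintenance Program", False, "no restraint on step 4"),
]
print(design_gaps(findings))  # ['Controls']
```

A structure of this kind lets a principal inspector see, per element, which attributes still lack objective evidence before approval or acceptance.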

F.      Focus on an Air Carrier’s Organization and Processes. In addition to issuing certificates, monitoring compliance, investigating noncompliance, and administering sanctions for noncompliance, FAA oversight must also focus on an air carrier’s organization and process management. Outputs and outcomes are still monitored, but the emphasis is on maintaining a safe process or correcting deficiencies. Performance assessments must supply objective evidence of both the adequacy and inadequacy of processes.

G.    Open System Perspective. A successful open system adapts to the needs of the environment and its resources. Safe operation in the modern aviation environment requires constant adaptation. Air carriers are obligated to provide systems that defend against the hazards of their operating environments, including adapting to changes in the environment. Data Collection Tools (DCT) should provide information on current environmental risks and on the air carrier’s efforts to control them.

H.    Interdependence and Collaboration. The FAA is responsible for reaching an independent assessment of an air carrier’s qualification to hold an operating certificate and its continuing ability to comply with regulations and standards. The FAA may accomplish its independent assessments using data provided by an air carrier or a third party. Data sharing, collaboration, and open communication optimize the function of the oversight system and leverage resources to advance safety.

I.       Freedom of Information Act. Requests for records made under the Freedom of Information Act (FOIA) are processed in accordance with FAA, Department of Transportation (DOT), and government‑wide directives and guidance. FAA Order 1270.1, FOIA Program, provides guidance that governs processing requests for FAA records under FOIA.

RESERVED. Paragraphs 10-5 through 10-19.


Volume 10 Air Transportation Oversight System

CHAPTER 1 General

Section 2  Introduction to ATOS Business Process and Tools

10-20           Purpose. The Air Transportation Oversight System (ATOS) improves the certification and surveillance processes for air carriers. It assesses the safety of air carrier operating systems using system safety principles, safety attributes, risk management, and structured system engineering practices.

10-21           Oversight System Model. ATOS exists in relation to the aviation system that an air carrier uses to produce its goods and services. The air carrier is the process owner of its aviation system, which is a production system in that it serves customers via products and services. The FAA is the process owner of the oversight system, which is a protection system. Protection systems are designed to protect customers from receiving inferior goods and services, and from potential harm of production activities. This includes potential harm from airplane accidents, occupational hazards, loss of equipment and other property, and damage to the environment. Safety and quality management systems are also protection systems. FAA and air carriers are the process owners of such complementary systems. The relationship between production and protection systems is a matter of exchanging information and exerting influence. Protection systems influence production systems by imposing controls. Figure 10‑1 shows this relationship at a high level.

Figure 10‑1, Oversight System Model—Level I


A.     Major Functions of ATOS. Three major functions further define the oversight system: design assessment, performance assessment, and risk management. The following Level II diagram shows these functions in a high-level model.

Figure 10‑2, Oversight System Model—Level II

B.     Design Assessment. Design assessment is the ATOS function that ensures an air carrier’s operating systems comply with regulations and safety standards, including the requirement to provide service at the highest level of safety in the public interest. Design assessment is the most important function of ATOS because safety is the outcome of a properly designed system. Poor system design compromises safety risk management. ATOS certification processes ensure that an air carrier’s operating systems comply with the intent of the regulations. ATOS uses standardized, systematic certification processes to determine an air carrier’s qualification for an operating certificate. FAA uses similar processes to approve or accept a new or changed air carrier program. The tools used in the certification processes also re-verify that an air carrier is meeting regulatory requirements during periodic program reviews or when environmental changes occur.

C.     Performance Assessment. FAA inspectors conduct performance assessments to confirm that an air carrier’s operating systems produce intended results, including mitigation or control of hazards and associated risks. ATOS uses time‑based performance assessments to detect latent, systemic failures that may occur due to subtle environmental changes. Performance assessment schedules are also adjustable based on known risks or safety priorities. Surveillance provides information for performance assessments and risk management. In this context, surveillance is synonymous with auditing. ATOS audits use the same tools as certification processes.

D.    Risk Management. Risk management identifies and controls hazards and manages FAA resources according to risk-based priorities. This function is accomplished through systematic risk assessments of an air carrier’s performance and environment. Hazards are defined in terms of their potential consequences. The likelihood and severity of a consequence determine risk. ATOS assesses the combined effects of likelihood and severity to determine priority when multiple risks are identified. Subsequent risk management action plans contain strategies to transfer, eliminate, accept, or mitigate the risk. This process validates the intended results of an action plan to ensure that a hazard is effectively eliminated or controlled. ATOS uses a modified version of Nicholas J. Bahr’s System Safety Process Model for hazard identification and risk management (see Figure 10-4). ATOS is concerned with the hazards and associated risks that are subject to regulatory controls such as enforcement actions, certificate amendments, and rulemaking. Hazards that are identified as the responsibility of an air carrier are tracked in the ATOS Risk Management Process until the air carrier satisfactorily resolves them.
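The way likelihood and severity combine to set priority can be illustrated with a simple risk-matrix sketch. The scales, numeric weights, and hazard records below are hypothetical examples, not FAA-defined values:

```python
# Illustrative likelihood and severity scales (not FAA-defined).
LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}
SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def risk_score(likelihood, severity):
    """Combined effect of likelihood and severity for one consequence."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def prioritize(hazards):
    """Order identified hazards so the highest combined risk is addressed first."""
    return sorted(
        hazards,
        key=lambda h: risk_score(h["likelihood"], h["severity"]),
        reverse=True,
    )

hazards = [
    {"id": "H-1", "likelihood": "remote", "severity": "catastrophic"},
    {"id": "H-2", "likelihood": "probable", "severity": "marginal"},
    {"id": "H-3", "likelihood": "frequent", "severity": "critical"},
]
print([h["id"] for h in prioritize(hazards)])  # ['H-3', 'H-1', 'H-2']
```

Each prioritized hazard would then feed an action plan whose strategy is to transfer, eliminate, accept, or mitigate the risk, followed by validation that the action worked.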

10-22           Business Process Modules. A Level III model of the oversight system further defines ATOS processes at a procedural level.

A.     There are eight business process modules in the design and performance functions. The application of each process may differ somewhat depending on whether a design assessment or a performance assessment is being conducted, but the overall purpose of each module is as follows:

1)      Module 1—System Configuration. The System Configuration process assesses an air carrier’s or applicant’s request for a new or changed scope of operation to develop an oversight profile that contains all applicable elements.
2)      Module 2—Planning. Planning develops a risk‑based data collection plan for design and performance assessments.
3)      Module 3—Resource Management. Resource Management provides the resources, training, and funding to execute the data collection plan.
4)      Module 4—Data Collection. The Data Collection process collects the data requested in the data collection plan.
5)      Module 5—Data Reporting. The Data Reporting process transfers the collected data into the ATOS database.
6)      Module 6—Data Review. The Data Review process evaluates data in the ATOS database for compliance with the requirements in the data quality guidelines.
7)      Module 7—Analysis and Assessment. Analysis and Assessment makes a data‑based decision about whether to approve or accept, or to reject, the design or performance of an air carrier’s or applicant’s programs.
8)      Module 8—Action Determination and Implementation. Action Determination and Implementation decides on and executes the appropriate course of action based on the decisions made during analysis and assessment.
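The eight modules above execute as an ordered flow, each feeding the next. A minimal sketch of that ordering (the helper function is illustrative, not an ATOS tool):

```python
# The eight ATOS business process modules, in execution order.
MODULES = [
    "System Configuration",
    "Planning",
    "Resource Management",
    "Data Collection",
    "Data Reporting",
    "Data Review",
    "Analysis and Assessment",
    "Action Determination and Implementation",
]

def next_module(current):
    """Return the module that follows `current`, or None after Module 8."""
    i = MODULES.index(current)
    return MODULES[i + 1] if i + 1 < len(MODULES) else None

print(next_module("Data Review"))  # Analysis and Assessment
```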

B.     Flow Chart. The Level III model consists of cross-functional flowcharts for each of the business process modules. Cross‑functional flowcharts show responsibility for process steps using horizontal bands. Standard flowchart symbols used to depict ATOS processes include:

Figure 10‑3, Standard ATOS Business Process Flowchart Symbols


10-23           Application of System Safety Concepts. The overall purpose of a system safety‑based approach like ATOS is to identify, eliminate, or control hazards, and mitigate the associated risk.

A.     Below is the AFS System Safety Process Model, a modified version of the Nicholas J. Bahr system safety process model.

Figure 10‑4, System Safety Process Model

B.     Comparing the ATOS model to the System Safety Process Model demonstrates how ATOS is a system safety‑based approach to air carrier oversight.


Figure 10‑5, Comparison of the ATOS Business Process and the System Safety Process

System Safety Process Step: Define Objectives and Describe System. Define the acceptable levels of safety. How does the system work and how do its components interact?
ATOS Business Process: Module 1, System Configuration. Regulations and policy define the acceptable level of safety. System description begins with the Air Carrier Oversight Profile.

System Safety Process Step: Hazard Identification. Where are the hazards in the system? What can go wrong?
ATOS Business Process: Modules 2–6. Identify areas where conditions in the system or operating environment may be creating hazards. Plan, report, and review data collection in those areas.

System Safety Process Step: Risk Analysis and Assessment. Determine the potential consequences that could result if hazards are not addressed or corrected.
ATOS Business Process: Module 7, Analysis and Assessment. Analyze collected data to identify systemic hazards that should be addressed or corrected.

System Safety Process Step: Decisionmaking. What can be done to control the effects of the hazard and/or mitigate the associated levels of risk?
ATOS Business Process: Module 8, Action Determination and Implementation. Take action to control the effects of the hazard and/or mitigate unacceptable levels of risk.

System Safety Process Step: Validation of Controls. Did the action work?
ATOS Business Process: Module 8, Action Determination and Implementation. Determine if the action eliminated the hazard or lowered the level of risk to acceptable levels. If not, take additional action.

10-24           Air Carrier Systems, Subsystems, and Elements. ATOS uses a structured process to analyze how systems, subsystems, and elements interact. Seven air carrier systems form the basis for the ATOS system‑based approach. Each of these systems has a defined set of subsystems and elements. Elements are interrelated activities or actions completed to support air carrier subsystems and systems.

A.     The following are the seven air carrier systems:

1)      Aircraft Configuration Control. An air carrier maintains the physical condition of the aircraft and its components using this system.
2)      Manuals System. This system controls the information and instructions that define and govern air carrier activities.
3)      Flight Operations. This system pertains to aircraft movement.
4)      Personnel Training and Qualifications. Air carriers use processes to make sure that their personnel are trained and qualified.
5)      Route Structures. An air carrier uses this system to maintain facilities on approved routes.
6)      Airman and Crewmember Flight, Rest, and Duty Time. This system prescribes time limitations for air carrier employees.
7)      Technical Administration. Air carriers use this system to address other aspects of certification and operation, such as key management personnel.

B.     Figures 10‑6 and 10‑7 at the end of this section identify each of the systems, subsystems, and elements (along with associated inspector specialties) used in planning and executing ATOS data collection.

10-25           ATOS Tools. ATOS standardizes the certification and surveillance processes. Its structured automation tools include:

A.     Air Carrier Configuration Checklist. The Air Carrier Configuration Checklist helps Certification Project Teams (CPT) and Certificate Management Teams (CMT) document the air carrier’s or applicant’s scope of operation, including factors such as type of operations, aircraft, equipment, and operations specifications. This information is used for automated filtering of the oversight profile.

B.     Air Carrier Oversight Profile. This profile is a tailored list of elements, DCT questions, and job task items that are based on the specific regulatory requirements (SRR) that apply to the air carrier or applicant.
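The filtering concept behind the oversight profile can be illustrated as follows; the item list, SRR identifiers, and function below are hypothetical examples, not actual ATOS data:

```python
# Hypothetical master list of oversight items, each tied to a specific
# regulatory requirement (SRR). Identifiers are illustrative only.
MASTER_ITEMS = [
    {"element": "1.3.18 Deicing Program", "srr": "121.629"},
    {"element": "5.1.8 Extended Operations", "srr": "121.374"},
    {"element": "3.1.12 Hazardous Materials", "srr": "121.1001"},
]

def build_profile(applicable_srrs):
    """Keep only the items whose SRR applies to this carrier's configuration."""
    return [item for item in MASTER_ITEMS if item["srr"] in applicable_srrs]

# A carrier whose configuration checklist did not indicate extended operations:
profile = build_profile({"121.629", "121.1001"})
print([i["element"] for i in profile])
```

The real tailoring is driven by the configuration checklist described above: the carrier’s scope of operation determines which SRRs apply, which in turn determines the elements, DCT questions, and job task items that appear in the profile.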

C.     The Air Carrier Assessment Tool (ACAT). This tool uses risk indicators to look for conditions that may be creating hazards in the air carrier’s systems. The results of the ACAT help to prioritize oversight activities.

D.    The Comprehensive Assessment Plan (CAP). This tool helps plan design assessments and performance assessments.

E.     Outsource Oversight Prioritization Tool (OPT). The OPT is used for planning surveillance of air carrier outsource providers. Principal avionics inspectors and principal maintenance inspectors must use this tool during oversight planning. It allows for prioritization of outsource maintenance providers to help determine specific data collection requirements. The OPT assists the principal inspector, other assigned inspectors, supervisors, and managers in identifying areas of concern or criticality regarding outsource providers and in targeting resources toward the highest‑risk outsource maintenance providers.

F.      Data Collection Tools. These ATOS tools support the assessments:

1)      Safety Attribute Inspection (SAI). Inspectors use the Safety Attribute Inspection questions to collect data for design assessment. Air carrier applicants use SAI DCTs during initial certification to document the results of their self-audit.
2)      Element Performance Inspection (EPI). Inspectors use the EPI questions to collect data for performance assessment.
3)      Constructed Dynamic Observation Report (ConDOR). This tool is used for focused, special inspections.
4)      Dynamic Observation Report (DOR). This tool allows inspectors to record on-the-spot safety observations outside the planned oversight process.
5)      Assessment Determination and Implementation Tool. The principal inspector or certification program manager uses this tool to document the bottom‑line design or performance assessment and the appropriate course of action for implementation.
6)      Off-Hour Surveillance Decision Aid. The CMO/CMT Comprehensive Assessment Plan (CAP) must be capable of detecting deficiencies in air carrier performance no matter when the activity is conducted. Sufficient off-hour surveillance must occur to:
a)      Know what types and levels of activity are conducted during off hours.
b)      Understand how the air carrier is managing and supervising off-hour activities, especially the interface with outsource maintenance and other contracted activities.
c)      Determine if the air carrier’s processes and controls are sufficient to detect and correct any risks inherent with off-hour activities.
d)      Determine if the off-hour activities present a greater risk than activities done during normal FAA duty hours.
7)      Based on information gathered during surveillance, the CMO/CMT will be prepared to evaluate the air carrier’s ability to manage activities conducted during off-hour periods and address any identified risks.

10-26           ATOS Process Feedback and Continuous Improvement. ATOS uses a feedback loop to aid in its effectiveness. Inspectors should submit their concerns or recommendations using the Problem Reporting and Feedback feature in ATOS automation.


Figure 10‑6, ATOS System/Subsystem/Element Chart—Airworthiness Elements

1.0 Aircraft Configuration Control
1.1 Aircraft
1.1.1 Aircraft Airworthiness
1.1.2 Appropriate Operational Equipment
1.1.3 Special Flight Permits
1.2 Records and Reporting Systems
1.2.1 Airworthiness Release/Log Book Entry
1.2.2 Major Repairs and Alterations Records
1.2.3 Maintenance Log/Recording Requirements
1.2.4 Mechanical Interruption Summary Reports
1.2.5 Service Difficulty Reports
1.2.6 Aircraft Listing
1.3 Maintenance Organization
1.3.1 Maintenance Program
1.3.2 Inspection Program
1.3.3 Maintenance Facility/Main Maintenance Base
1.3.4 Required Inspection Items
1.3.5 Minimum Equipment List/Configuration Deviation List/Deferred Maintenance
1.3.6 Airworthiness Directive Management
1.3.7 Outsource Organization
1.3.8 Control of Calibrated Tools and Test Equipment
1.3.9 Engineering/Major Repairs and Alterations
1.3.10 Parts/Material Control/Suspected Unapproved Parts
1.3.11 Continuous Analysis and Surveillance
1.3.12 Special Federal Aviation Regulation (SFAR) 36
1.3.13 Designated Alteration Station
1.3.14 General Maintenance Manual or Equivalent
1.3.15 Reliability Program
1.3.16 Fueling
1.3.17 Weight and Balance Program
1.3.18 Deicing Program
1.3.19 Lower Landing Minimums
1.3.20 Engine Condition Monitoring
1.3.21 Parts Pooling
1.3.22 Parts Borrowing
1.3.23 Short‑Term Escalations
1.3.24 Coordinating Agencies for Suppliers Evaluation
1.3.25 Cargo Handling Equipment, Systems and Appliances

2.0 Manuals
2.1 Manual Management
2.1.1 Manual Currency
2.1.2 Content Consistency Across Manuals
2.1.3 Distribution (Manuals)
2.1.4 Availability (Manuals)
2.1.5 Supplemental Operations Manual Requirements

4.0 Personnel Training and Qualifications
4.1 Maintenance Personnel Qualifications
4.1.1 Required Inspection Item Personnel
4.1.2 Maintenance Certificate Requirements
4.2 Training Program
4.2.1 Maintenance Training Program
4.2.2 Required Inspection Item Training Requirements
4.2.12 Hazardous Materials Training
4.4 Mechanics and Repairmen Certification
4.4.1 Recency of Experience
4.4.2 Display of Certificate
4.4.3 Privileges Airframe and Powerplant
4.4.4 Privileges and Limitations for Repairmen

5.0 Route Structures
5.1 Approved Routes and Areas
5.1.1 Line Stations (Service & Maintenance)
5.1.2 Weather Reporting/Supplemental Aviation Weather Reporting System
5.1.3 Non‑Federal Navigational Aids
5.1.4 Altimeter Setting Sources
5.1.8 Extended Operations
5.1.9 Reduced Vertical Separation Minimum

6.0 Airman and Crew Flight, Rest, and Duty Time
6.2 Maintenance Personnel
6.2.1 Maintenance Duty Time Limitations

7.0 Technical Administration
7.1 Key Personnel
7.1.1 Director of Maintenance
7.1.2 Chief Inspector
7.1.3 Director of Safety
7.1.6 Maintenance Control


Figure 10‑7, ATOS System/Subsystem/Element Chart—Operations and Cabin Safety Elements

1.0 Aircraft Configuration Control
1.1 Aircraft
1.1.2 Appropriate Operational Equipment

2.0 Manuals
2.1 Manual Management
2.1.1 Manual Currency
2.1.2 Content Consistency Across Manuals
2.1.3 Distribution (Manuals)
2.1.4 Availability (Manuals)
2.1.5 Supplemental Operations Manual Requirements

3.0 Flight Operations
3.1 Air Carrier Programs and Procedures
3.1.1 Passenger Handling
3.1.2 Flight Attendant Duties/Cabin Procedures
3.1.3 Airman Duties/Flight Deck Procedures
3.1.4 Operational Control
3.1.5 Carry‑on Baggage Program
3.1.6 Exit Seating Program
3.1.7 Deicing Program
3.1.8 Carriage of Cargo
3.1.9 Aircraft Performance Operating Limits
3.1.10 Lower Landing Minimums
3.1.11 Computer‑based Recordkeeping
3.1.12 Hazardous Materials
3.1.13 Other Personnel with Operational Control
3.2 Operational Release
3.2.1 Dispatch or Flight Release
3.2.2 Flight/Load Manifest/Weight and Balance Control
3.2.3 Minimum Equipment List/Configuration Deviation List Procedures

4.0 Personnel Training and Qualifications
4.2 Training Program
4.2.3 Training of Flight Crewmembers
4.2.4 Training of Flight Attendants
4.2.5 Training of Dispatchers
4.2.6 Training of Station Personnel
4.2.7 Training of Check Airman and Instructors
4.2.8 Simulators/Training Devices
4.2.9 Outsource Crewmember Training
4.2.10 Aircrew Designated Examiner Program
4.2.11 Training of Flight Followers
4.2.12 Hazardous Materials Training
4.3 Crewmember and Dispatch Qualifications
4.3.1 Pilot Operating Limitations/Recent Experience
4.3.2 Appropriate Airman/Crewmember Checks and Qualifications
4.3.3 Advanced Qualification Program

5.0 Route Structures
5.1 Approved Routes and Areas
5.1.5 Station Facilities
5.1.6 Use of Approved Routes, Areas and Airports
5.1.7 Special Navigation Areas of Operation
5.1.8 Extended Operations
5.1.9 Reduced Vertical Separation Minimum Authorization

6.0 Airman and Crewmember Flight, Rest and Duty Time
6.1 Airman and Crewmember Limitations
6.1.1 Scheduling/Reporting System
6.1.2 Flight Crewmember Flight/Duty/Rest Time
6.1.3 Flight Attendant Duty/Rest Time
6.1.4 Dispatcher Duty/Rest Time

7.0 Technical Administration
7.1 Key Personnel
7.1.3 Director of Safety
7.1.4 Director of Operations
7.1.5 Chief Pilot
7.2 Other Programs
7.2.1 Safety Program (Ground and Flight)

RESERVED. Paragraphs 10‑27 through 10‑41.


Volume 10 Air Transportation Oversight System

CHAPTER 1 General

Section 3  The Certification Process for 14 CFR Part 121 Air Carriers

10-42   Introduction. This section provides a high-level overview of the certification process for Title 14 of the Code of Federal Regulations (14 CFR) part 121 air carriers. Detailed certification information can be found in Chapter 2, Procedures for Design and Performance Assessment, and in Chapter 6, Sections 1 and 2 of this volume.

10-43   Certification Process. The certification process consists of four phases and three gates (see Figure 10‑8). A phase separates the certification process into related activities supporting a specific function. A gate is a set of prerequisites that must be met before proceeding to the next step. The process uses a structured, system safety-based approach to air carrier certification that assesses the design and performance of the applicant’s systems. This approach reviews the air carrier’s systems as an integrated whole rather than as separate parts. It incorporates the system safety concepts embodied within ATOS. The process uses two types of Data Collection Tools (DCT): Safety Attribute Inspections (SAI) and Element Performance Inspections (EPI). SAIs are used to collect data that help determine if the air carrier’s systems are designed to meet all regulatory requirements before Federal Aviation Administration (FAA) approval or acceptance. EPIs are used to collect data that help determine if the air carrier’s systems are performing as designed and are producing the intended results. During the certification process, the certificate-holding district office (CHDO) and AFS-900 will form a Certification Project Team (CPT).
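The phase-and-gate logic described above can be sketched as follows; the phase names and prerequisite examples are illustrative assumptions, not the official phase titles from Figure 10‑8:

```python
# Hypothetical sketch: four phases separated by three gates, where each gate
# is a set of prerequisites that must all be met before the project advances.
PHASES = [
    "Preapplication",
    "Formal Application",
    "Design Assessment",
    "Performance Assessment",
]

def gate_open(prerequisites):
    """A gate opens only when every prerequisite is met."""
    return all(prerequisites.values())

def advance(phase, prerequisites):
    """Move to the next phase only if the intervening gate's prerequisites are met."""
    i = PHASES.index(phase)
    if i + 1 < len(PHASES) and gate_open(prerequisites):
        return PHASES[i + 1]
    return phase

# A project with an unmet prerequisite stays in its current phase:
print(advance("Formal Application", {"manuals submitted": False}))
```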

10-44   Certification Process Document (CPD). The CPD, an appendix to this volume, provides step-by-step guidance to the CPT as it progresses through the certification process. The CPD also provides links to detailed certification instructions, sample letters, forms, and briefings.

10-45   Project Management Tool (PMT). The PMT is an automated, Web‑based tool accessible only to the CPT. It is used to schedule certification tasks, manage workflow, document completion of tasks, monitor the status of the project, and store information used during the certification project.


Figure 10‑8, Phases and Gates for Initial Certification

RESERVED. Paragraphs 10‑46 through 10‑60.


Volume 10 Air Transportation Oversight System

CHAPTER 1 General

Section 4  Roles and Responsibilities

10-61   Responsibilities. All users of the Air Transportation Oversight System (ATOS) must use and maintain the system in accordance with the policies and procedures defined in this section.

10-62   Specific Responsibilities for ATOS.

A.     The Director of Flight Standards Service, AFS‑1.

1)      Provides the national policy and guidance for ATOS.
2)      Provides and maintains national policy and guidance for Certification Project Team (CPT) or Certificate Management Team (CMT) baseline training and staffing standards.
3)      Provides adequate regional resources to support ATOS processes.

B.     Flight Standards Certification and Surveillance Division, AFS‑900.

1)      Provides ATOS policy and procedures in accordance with ATOS, Flight Standards (AFS), and the Associate Administrator for Aviation Safety doctrine.
2)      Completes changes and updates for the system configuration process.
3)      Provides analysis and program support for the ATOS process.
4)      Develops air carrier certification and data collection policy and procedures.
5)      Collects feedback and completes changes and updates for all ATOS processes, and assesses ATOS process effectiveness.
6)      Continually improves ATOS using established processes for system engineering.

C.     Air Transportation Division, AFS‑200; Aircraft Maintenance Division, AFS‑300; and Flight Technologies and Procedures Division, AFS‑400.

1)      Participates as a stakeholder in the development and continuous improvement of ATOS.
2)      Ensures that inspector guidance material affecting air transportation oversight programs aligns with ATOS.

D.    Flight Standards Training Division, AFS‑500. Budgets for and provides the training that meets the needs of ATOS users.

E.     Flight Standards Quality Assurance Staff, AFS‑40.

1)      Reports directly to AFS‑1.
2)      Audits compliance with ATOS policy and procedures and evaluates the effectiveness of ATOS processes.

F.      Regional Flight Standards Division Offices.

1)      Implements ATOS and ensures that allocated resources (e.g., funding and trained personnel) are used in accordance with AFS priorities.
2)      Resolves issues that have been identified by the certificate‑holding district office (CHDO) or certificate management office (CMO).

G.    Office Manager Responsible for the CPT or the CMT.

1)      Ensures that staff are properly trained and oriented to their role in ATOS.
2)      Ensures that staff effectively execute their assigned ATOS responsibilities in accordance with established ATOS processes and procedures.
3)      Provides necessary leadership, support, and resources to ensure ATOS transition goals are met.
4)      Manages the certification projects and certificates for the office’s assigned air carrier(s).
5)      Determines and requests staffing, and requests baseline training to support ATOS processes. The CPT or CMT manager also notifies the Certification Project Manager (CPM) or principal inspector (PI) and, if applicable, the data evaluation program manager (DEPM) of any changes in CPT or CMT staffing.
6)      Receives input from the PI and identifies other training needs for CPT or CMT inspectors. Provides the air carrier‑specific familiarization portion of baseline training to all CMT members.
7)      Participates in the initial or annual planning meeting, monitors and tracks the progress of the CPT or CMT in completing the development of the Comprehensive Assessment Plan (CAP), and concurs with the completed plan.
8)      Ensures personnel under their supervision participate in this initial or annual planning process.
9)      Ensures that the CPT or CMT develops and manages a CAP with inspections targeted to the highest risk areas. Inspectors are assigned to those highest risk areas first.
10)  If the CPM or supervisors do not assign inspector resources to the highest risk areas first, or if inspection frequencies are elevated without adequate justification, works with the CPM or supervisors until these conditions are met before concurring with the CAP.
11)  Ensures that the CPM or PI uses the tools and guidance found in this order whenever they become aware, through formal notification or informal channels, of conditions or indicators of changes in the applicant’s or air carrier’s ability to balance resources and operational requirements.

Note:   The CPM or PI assesses risks, and if necessary, adjusts their assessment plans when applicants or air carriers make significant changes to their operations and maintenance programs (e.g., closing maintenance facilities, reducing personnel, outsourcing maintenance, and reducing gate turnaround times).

12)  Ensures that the primary work function and highest work priority of CPT or CMT members is accomplishing the activities that support design and performance assessments identified in the CAP.
13)  Obtains and provides resources to support the CAP, risk management process (RMP) development and accomplishment, and System Analysis Team (SAT) participation. This includes funding travel.

H.    Frontline Manager.

1)      Ensures that staff are properly trained and oriented to their role in ATOS.
2)      Ensures that staff execute their assigned ATOS responsibilities effectively in accordance with established ATOS processes and procedures.
3)      Provides necessary leadership, support, and resources to ensure ATOS transition goals are met.
4)      Assigns work plans.
5)      Ensures the inspectors conduct their assigned work plans according to the CPM’s or PI’s specific inspection instructions.
6)      Resolves differences of opinion between reporting inspectors and data reviewers.
7)      Ensures that the CMT or CPT maintains an accurate roster.
8)      Ensures that the CMT or CPT maintains accurate make, model, and series (M/M/S) tables.

I.       Certification Project Manager or Principal Inspector. The CPM is responsible for the initial certification process, and the PI is responsible for the certificate management process.

1)      The following are functions of the CPM and PI.
a)      Reviews an applicant’s request for a new or changed scope of operation and, if accepted, tailors the ATOS elements and DCT questions to the requested scope of operation using the air carrier oversight profile.
b)      Collects and organizes information to complete an applicant or air carrier assessment, solicits input from team members, and makes decisions about oversight requirements.
c)      Prioritizes ATOS design and performance assessments by following ATOS planning procedures.
d)      Monitors the effects of industry changes and uses the change management tools in this order to determine when retargeting oversight activities is required. Retargeting may be based on analysis of data or on significant changes in the operating environment, such as financial distress, changes in the scope and scale of air carrier operations (growth or downsizing), or labor unrest, that may affect the applicant’s or air carrier’s ability to balance resources, size, and organizational structure with operational requirements. Other triggers, such as accidents, incidents, or occurrences, could also affect an applicant or air carrier.
e)      Participates in periodic meetings with the applicant or air carrier to stay informed on financial health and growth plans, or other conditions that might cause an imbalance between resources and operations.
f)        Provides specific instructions for completing inspections using the ATOS planning procedures.
g)       Identifies and brings aviation safety concerns to the analyst’s attention. PIs communicate their analysis needs to the Operations Research Analyst (ORA).
h)      Determines if the applicant’s or air carrier’s program and processes meet the standards for acceptance or approval.
i)        Determines, by element, the appropriate action for the results of a design assessment or a performance assessment.
2)      The following are functions that either the CPM or PI performs.
a)      The CPM ensures that all certification job functions are completed. The CPM notifies the CHDO manager of any information that may significantly affect or delay the certification project. The CPM ensures that individuals involved with the certification project and the CHDO manager are fully informed of the current status of the certification.
b)      The PI has the overall responsibility for the RMP.
c)      The PI analyzes risk and ensures the certificate holder addresses hazards by using the RMP to document rationale, develop action items, and monitor progress.
d)      The PI ensures certificate holder regulatory compliance and system adequacy through recurring design assessments.

J.      Operations Research Analyst.

1)      Supports the CMT in effective performance of safety analyses by providing guidance, expert advice, and data collection and analysis support.
2)      Collects, analyzes, and organizes associated applicant or air carrier data to clarify safety issues; plans and retargets data collection tasks; identifies potential hazards and consequences; and supports the RMP.
3)      Works closely with the CPM or PI to ensure that all pertinent data are reviewed thoroughly.
4)      Provides information to guide the CPT or CMT in conducting safety analyses. The ORA helps clarify safety issues by researching data and looking for trends, patterns, and generalizations.

K.    Data Evaluation Program Manager or Data Reviewer.

1)      Continually monitors the status of reports and records in the ATOS database and reviews them in accordance with the ATOS data quality guidelines.
2)      Promptly initiates actions necessary to resolve data quality issues or discrepancies.

L.     Aviation Safety Inspectors Assigned to a CMT.

1)      Participates in the planning activities to develop the CAPs.
2)      Schedules, coordinates, and accomplishes the work assignments using ATOS tools. Inspectors may work individually or as part of a team on SAIs.
3)      Accurately and promptly enters data collection results into the ATOS database in accordance with ATOS data quality guidelines.
4)      Submits reports using the Dynamic Observation Report tool when reporting observations that are relevant to safety goals, but are unplanned or outside the CAP.
5)      Reevaluates returned inspection records and decides on the appropriate action (e.g., editing the record, conducting additional observations, or taking no action).
6)      Promptly identifies unsafe conditions or possible regulatory violations observed during data collection, notifies appropriate personnel, and makes appropriate entries into Federal Aviation Administration data systems (e.g., Program Tracking and Reporting Subsystem, ATOS database).
7)      Follows established procedures to assist PIs in determining that the applicant or air carrier complies with its written procedures and meets its established performance measures.
8)      Performs qualitative reviews of available data that fall within their subject matter expertise.
9)      Supports PIs and performs tasks associated with the RMP, as assigned.

M.  Aviation Safety Technicians and Aviation Safety Assistants. Aviation safety technicians and aviation safety assistants who enter Safety Attribute Inspection or Element Performance Inspection activities for CPT or CMT inspectors transcribe these observations completely and accurately into the ATOS database.

RESERVED. Paragraphs 10‑63 through 10‑77.


Volume 10 Air Transportation Oversight System

CHAPTER 1 General

Section 5  Acronyms, Abbreviations, Terms, and Definitions

10-78   Acronyms and Abbreviations. The following acronyms/abbreviations and definitions apply to ATOS:

Figure 10‑9, Acronyms and Abbreviations

AC - Advisory Circular
ACAT - Air Carrier Assessment Tool
ACEP - Air Carrier Evaluation Process
ADI - Assessment Determination and Implementation
AFS - Flight Standards Service
AFS‑1 - Director of Flight Standards Service
AFS‑40 - Flight Standards Quality Assurance
AFS‑500 - Flight Standards Training Division
AFS‑900 - Flight Standards Certification and Surveillance Division
AQP - Advanced Qualification Program
ASA - Aviation Safety Assistant
ASAP - Aviation Safety Action Program
ASI - Aviation Safety Inspector
ASIAS - Aviation Safety Information Analysis and Sharing
AST - Aviation Safety Technician
ATOS - Air Transportation Oversight System
AW - Airworthiness
CAD - Continuous ATOS Development
CAP - Comprehensive Assessment Plan
CAR - Civil Air Regulations
CAS - Continuing Analysis and Surveillance
CD - Air Carrier Dynamics
CFR - Code of Federal Regulations
CHDO - Certificate‑Holding District Office
CMO - Certificate Management Office
CMT - Certificate Management Team
ConDOR - Constructed Dynamic Observation Report
CPD - Certification Process Document
CPM - Certification Project Manager
CPT - Certification Project Team
CSI - Cabin Safety Inspector
CSOP - Certification Services Oversight Process
CTL - Certification Team Leader
DCT - Data Collection Tool
DEPM - Data Evaluation Program Manager
DOD - Department of Defense
DOR - Dynamic Observation Report
DQG - Data Quality Guidelines
EC - Environmental Criticality
EFIS - Electronic Flight Information System
EPI - Element Performance Inspection
ETOPS - Extended Operations
FAA - Federal Aviation Administration
FOIA - Freedom of Information Act
FSAS - Flight Standards Automation System
FSDO - Flight Standards District Office
GPS - Global Positioning System
HAZMAT - Hazardous Materials
ICAO - International Civil Aviation Organization
IEP - Internal Evaluation Program
MEL/CDL - Minimum Equipment List/Configuration Deviation List
MIS - Mechanical Interruption Summary
MNPS - Minimum Navigation Performance Specification
MSL - Mean Sea Level
NTSB - National Transportation Safety Board
OAG - Official Airline Guide
OpSpecs - Operations Specifications
OPT - Outsource Oversight Prioritization Tool
ORA - Operations Research Analyst
OS - Operational Stability
OST - Office of the Secretary of Transportation
PAC - Preapplication Checklist
PASI - Preapplication Statement of Intent
PH - Performance History
PI - Principal Inspector
PMT - Project Management Tool
PTRS - Program Tracking and Reporting Subsystem
RMP - Risk Management Process
RVSM - Reduced Vertical Separation Minimums
SAI - Safety Attribute Inspection
SASO - System Approach to Safety Oversight
SAT - System Analysis Team
SDR - Service Difficulty Reporting Subsystem
SPAS - Safety Performance Analysis System
SRR - Specific Regulatory Requirement
TC - Team Coordinator (SAI)
U.S.C. - United States Code
VDRP - Voluntary Disclosure Reporting Program
VIS - Vital Information Subsystem


10-79      Terms and Definitions. The following terms and definitions apply to ATOS.

Figure 10‑10, Terms and Definitions

Term

Definition

Acceptable Risk

The level of risk that is allowed to persist after controls are applied. Risk is acceptable when further efforts to reduce it would degrade the probability of success of the operation, or when a point of diminishing returns has been reached.

Aging Aircraft

An aircraft of any make or model that is 15 years old or older.

All‑Cargo Operations

Any operation for compensation or hire that is other than a passenger-carrying operation. If passengers are carried, they are only those specified in 14 CFR § 121.583(a).

Air Carrier Assessment Tool (ACAT)

The ACAT documents a systematic review of the air carrier system elements using the risk indicators to identify conditions that may be creating hazards.

Air Carrier Configuration Checklist

A checklist that provides the PI or CPM with a series of questions to answer pertaining to the air carrier’s or applicant’s type of operation, types of aircraft and equipment, facilities, personnel, and other operations specifications.

Air Carrier Dynamics

Aspects of the organization and environment that the air carrier directly controls and could enhance system stability and safety.

Air Carrier Oversight Profile

A tailored list of element and DCT questions that apply to a specific air carrier or applicant.

Air Carrier Programs and Procedures

The subsystem (3.1) by which an air carrier ensures compliance with its programs and procedures for functioning within its operating environment.

Air Carrier System

A group of interrelated processes that are a composite of people, procedures, materials, tools, equipment, facilities, and software operating in a specific environment to perform a specific task or achieve a specific purpose, support, or mission requirement for an air carrier. For the purposes of new certification and continued oversight, seven air carrier systems have been defined:

1.0 Aircraft Configuration Control

2.0 Manuals

3.0 Flight Operations

4.0 Personnel Training and Qualifications

5.0 Route Structures

6.0 Airman and Crewmember Flight, Rest, and Duty Time

7.0 Technical Administration

Aircraft

The subsystem (1.1) by which an air carrier ensures that its aircraft meet airworthiness and operational safety requirements.

Aircraft Configuration Control

The system (1.0) by which an air carrier maintains the physical condition of the aircraft and associated components.

Airman and Crewmember Flight, Rest, and Duty Time

The system (6.0) that prescribes time limitations for air carrier employees.

Airman and Crewmember Limitations

The subsystem (6.1) by which an air carrier ensures that airmen or crewmembers meet the regulatory time limitations.

Air Transportation Oversight System

The system the FAA uses to provide regulatory oversight of 14 CFR part 121 air carriers.

Applicant

An individual, group, or organization seeking new operating authority or, in the case of an existing air carrier, a modification to its operating authority.

Approved Routes/Areas

The subsystem (5.1) by which an air carrier ensures that it maintains the facilities to support its approved routes and areas of operation.

Aviation Safety Information Analysis and Sharing

A facility for the integration, analysis, and sharing of aviation safety data and information.

Authority Attribute

A clearly identifiable, qualified, and knowledgeable person with the authority to establish and modify a process.

Certificate Management Team (CMT)

This team is responsible for the oversight functions of a specific air carrier. The CMT develops and executes a Comprehensive Assessment Plan tailored to a specific air carrier.

Certification Process Document (CPD)

An electronic document with work instructions to be accomplished during the certification process.

Certification Project Manager (CPM)

The CPM is the primary Federal Aviation Administration (FAA) spokesperson throughout the Air Transportation Oversight System (ATOS) initial certification process. The CPM is responsible for ensuring that all certification job functions are complete.

Certification Project Team (CPT)

This team is responsible for the oversight functions of a specific applicant during initial certification. The CPT develops and executes a Comprehensive Assessment Plan (CAP) tailored to an applicant’s proposed operation.

Certification, Standardization, and Evaluation Team

A team of national technical experts responsible for providing assistance to certificate‑holding district offices in the full range of certifications and evaluations conducted on air carrier applicants or air carriers operating under 14 CFR part 121.

Comprehensive Assessment Plan (CAP)

The applicant or air carrier‑specific assessment plan developed by the CPT or CMT during initial certification or at the annual planning meeting. The CAP documents the planned assessments for the applicant or air carrier at the system element level.

Constructed Dynamic Observation Report (ConDOR)

The Constructed Dynamic Observation Report allows data collection activities to be requested or assigned with instructions to inspect and report on specific areas of immediate concern outside of the normal assessment schedule.

Control Attribute

Checks and restraints that are designed into a process to ensure a desired result.

Crewmember and Dispatch Qualifications

The subsystem (4.3) by which an air carrier ensures crewmembers and dispatchers are qualified.

Criticality

The likelihood that a failure of an air carrier system, subsystem, or element could lead to an unsafe condition.

Data Collection Tool (DCT)

DCTs such as Element Performance Inspections (EPI) and Safety Attribute Inspections (SAI) are automated tools used to record observations to provide data for design and performance assessments.

Data Evaluation Program Manager or Data Reviewer

The CPT or CMT member responsible for reviewing inspection reports and records to ensure they meet data quality guidelines.

Design Assessment

The ATOS function that measures an applicant’s or air carrier’s operating systems at the element level for compliance with the full intent of regulations and safety standards, including the requirement to provide service at the highest level of safety in the public interest.

Dynamic Observation Report (DOR)

The Dynamic Observation Report allows inspectors to record certain observations outside the comprehensive assessment planning process.

Data Quality Guideline

These guidelines help determine the acceptable levels of data quality during the evaluation of inspection records.

Element

One or more interrelated actions completed to support an air carrier subsystem. Elements are the level at which design and performance are assessed for all 14 CFR part 121 air carriers participating in ATOS.

Element Performance Inspection (EPI) Data Collection Tool

The ATOS tool designed to collect data to help the CPM or principal inspector determine if an air carrier adheres to its written procedures and controls for each system element, and that the established performance measures for each system element are met.

Environmental Criticality

Aspects of the air carrier’s surroundings that could lead to or trigger a failure in one of its systems, subsystems, or elements, and potentially create an unsafe condition.

Flight Operations

The system (3.0) that pertains to aircraft movement.

Frontline Manager

Frontline managers provide first‑level supervision to subordinate employees and manage the activities of one operating unit, project, or program area. Frontline managers report to middle or senior managers.

Gate

A set of prerequisites that must be satisfied to proceed to the next step.

Hazard

A condition, event, or circumstance that could lead or contribute to an unplanned or undesired event.

High Criticality

A high likelihood that a failure in this element could lead to an unsafe condition.

Human Factors

The relationship between people and their operating environment (e.g., people, procedures, materials, tools, equipment, facilities, and software).

Identified Risk

A level of risk that is identified through various analysis techniques.

Interfaces Attribute

The air carrier identifies and manages the interactions between processes.

Job Task Item

Job task items (JTI) are bulleted items below many DCT questions that detail the tasks that may be performed to properly answer the question. The inspector is required to answer the higher‑level EPI or SAI question and should use the attached JTIs as guidance only.

Key Personnel

The subsystem (7.1) by which an air carrier ensures that qualified management and technical personnel with operational control are in place and conducting operations at the highest level of safety.

Low Criticality

A low likelihood that a failure in this element could lead to an unsafe condition.

Maintenance Organization

The subsystem (1.3) by which an air carrier ensures the continuous airworthiness and servicing of aircraft in accordance with its approved procedures.

Maintenance Personnel

The subsystem (6.2) by which an air carrier ensures that maintenance personnel meet duty time limitations.

Maintenance Personnel Qualifications

The subsystem (4.1) by which an air carrier ensures maintenance personnel are properly certificated and authorized to perform assigned duties.

Manual Management

The subsystem (2.1) by which an air carrier prepares and maintains the manuals for the use of and guidance to its personnel.

Manuals

The system (2.0) for controlling the information and instruction that defines and governs air carrier activities.

Mechanics and Repairman Certification

The subsystem (4.4) that an air carrier uses to ensure that airmen who approve aircraft for return to service are properly certificated.

Medium Criticality

A moderate likelihood that a failure in this element could lead to an unsafe condition.

Operations Research Analyst (ORA)

The operations research analyst is responsible for assisting the CMT in collecting and analyzing air carrier data.

Operational Control

The exercise of authority over initiating, conducting, or terminating a flight.

Operational Release

The subsystem (3.2) by which an air carrier ensures that all activities required for the safe dispatch and continuation of a flight to its destination are accomplished.

Operational Risk

An identified risk that has the potential to affect the operations of the air carrier.

Operational Stability

Aspects of the air carrier’s organization and environment that it does not directly control and that, when managed effectively, could enhance system stability and safety.

Operations Specifications

A legal and binding contract between an air carrier and the FAA that documents specifically how the air carrier’s operation is conducted.

Outsourcing

The practice of contracting out internal air carrier programs and processes, such as maintenance, training, and ground handling, to external, independent vendors and suppliers. Oversight for the quality of the outsourced items remains with the air carrier.

Passenger‑Carrying Operation

Any aircraft operation carrying any person, unless the only persons on the aircraft are those identified in 14 CFR § 121.583(a). An aircraft used in a passenger-carrying operation may also carry cargo or mail in addition to passengers.

Preapplication Statement of Intent (PASI)

The completed PASI is a document used in initial certification that denotes an intent by the applicant to initiate the certification process and allows the FAA to plan activities and prepare to commit resources.

Performance Assessment

The ATOS function that measures an applicant’s or air carrier’s operating systems at the element level to confirm that the air carrier is following its procedures and that they are producing the intended result.

Performance History

The results of the air carrier’s operations over time.

Performance Measure

A description of the desired outcome of an air carrier element process. It is used to determine whether the desired results of that process were achieved.

Personnel Training and Qualifications

The system (4.0) by which air carrier personnel are trained and qualified.

Principal Inspector (PI)

The PI is the primary FAA spokesperson and decisionmaker for their specialty in all applications of ATOS.

Procedures Attribute

Documented methods for accomplishing a process.

Process

Policies and procedures designed to produce a desired result or end product for an air carrier.

Process Measurement Attribute

The air carrier measures and assesses its processes to identify and correct problems or potential problems.

Records and Reporting Systems

The subsystem (1.2) by which an air carrier manages the records used to show that its aircraft are airworthy, to reflect the air carrier’s use of its procedures, and to ensure the issuance of required reports.

Responsibility Attribute

A clearly identifiable, qualified, and knowledgeable person who is accountable for the quality of a process.

Risk

An expression of the severity of potential consequences and the likelihood of their occurrence that could result if a hazard is not addressed or corrected.

Risk Indicator

A grouping of safety‑ and/or performance‑related data that reflects an area of potential risk that is expected to have sufficient data or justification to calculate a representative value for a particular air carrier system, subsystem, or element.

Risk Management

An interactive management activity dedicated to assuring that risk is identified, documented, eliminated, or controlled within defined program risk parameters.

Risk Management Process (RMP)

This process is the ATOS function that identifies hazards and ensures that the air carrier either eliminates the hazards or controls the associated risk at acceptable levels. This process also allows the FAA to manage resources in accordance with risk‑based priorities.

Route Structures

The system (5.0) by which an air carrier maintains facilities on approved routes.

Safety

The state in which the risk of harm to persons or property damage is reduced to, and maintained at or below, an acceptable level through a continuing process of hazard identification and risk management. The quality of a system that allows the system to function under predetermined conditions with an acceptable level of risk.

Safety Attributes

The qualities of a system, (e.g., authority, responsibility, procedures, controls, process measurements, and interfaces) that should be present in well‑designed air carrier systems and processes.

Safety Attribute Inspection (SAI) Data Collection Tool

The ATOS tools used to collect data about regulatory compliance in order to assess the adequacy of the design of the processes associated with each system element for an air carrier. The tool is organized in accordance with six safety attributes.

SAI Team

The team of inspector(s) or a single inspector assigned to accomplish an SAI for a specific CPT or CMT.

Scope of Operation

A description of an applicant’s or air carrier’s activities in air commerce.

Show Cause Order

In determining the “fitness” of an applicant for air carrier authority, the OST examines four areas of the applicant’s business: managerial competence, operating and financial plans, the compliance record, and ownership structure.

System

A group of interrelated processes which are a composite of people, procedures, materials, tools, equipment, facilities, and software operating in a specific environment to perform a specific task or achieve a specific purpose, support, or mission requirement for an air carrier.

System Analysis Team

A team that includes participants from the CMT, the air carrier, other FAA organizations, and other non‑FAA entities (e.g., the manufacturer) to accomplish further analysis and determine root causes of system deficiencies.

System Approach

The structured, safety‑driven means by which the FAA certifies and conducts oversight activities on elements that are designed to interact predictably within the air carrier’s systems and subsystems.

System Safety

The application of special technical and managerial skills to identify, analyze, assess, and control hazards and risks associated with a complete system. System safety is applied throughout a system’s entire lifecycle to achieve an acceptable level of risk within the constraints of operational effectiveness, time, and cost.

System Stability

The state of balanced constancy and safety that results when an air carrier is able to effectively manage both aspects of its organization and environment: those it controls directly and those over which it has no direct control.

Technical Administration

The system (7.0) for addressing all other aspects of air carrier certification and operations.

Training Program

The subsystem (4.2) by which an air carrier ensures that personnel are trained to perform assigned duties in accordance with the air carrier’s approved programs.

Unacceptable Risk

Risk that cannot be tolerated by the managing activity. It is a subset of identified risk that must be eliminated or controlled.

RESERVED. Paragraphs 10‑80 through 10‑94.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 1  Design and Performance Assessment System Configuration

Figure 10‑11, ATOS Module 1 System Configuration Flowchart

 

10-95   Introduction. The air carrier or applicant defines its scope of operations and develops core processes, procedures, and programs for Federal Aviation Administration (FAA) approval or acceptance. Understanding the scope of operations enables the FAA to develop an oversight profile for a particular applicant or air carrier. The oversight profile allows the principal inspector (PI) or certification project manager (CPM) to plan and conduct oversight activities that are specific to the air carrier’s or applicant’s system configuration.

10-96   Triggers for System Configuration. (See flowchart process step 1.1.) Three events that can trigger the system configuration process are initial certification, FAA‑initiated change, and air carrier‑initiated change.

A.     Initial Certification. An air carrier submits a formal application package requesting authorization to operate under Title 14 of the Code of Federal Regulations (14 CFR) part 121.

B.     FAA‑Initiated Change.

1)      The FAA issues or revises a regulation or policy that affects the air carrier system or its operating authority.
2)      As part of a design or performance assessment, the FAA may require the air carrier to revise one or more of its programs or systems, or modify one or more of its authorizations.

C.     Air Carrier‑Initiated Change.

1)      The air carrier proposes a change to its operations specifications (OpSpecs), such as:
a)      Changing from cargo to passenger operations, or from supplemental to domestic operations.
b)      Adding a new aircraft type.
2)      The air carrier proposes a change to its scope of operation that does not require a change to its OpSpecs, such as:
a)      Changes to training programs.
b)      Manual revisions.

10-97   Request a New or Changed Scope of Operation. (See flowchart process step 1.2.)

A.     Initial Certification. An applicant for a part 121 air carrier certificate submits a formal application package to the FAA.

B.     FAA‑Initiated Change. The Administrator may amend OpSpecs requirements or require program revisions when safety and public interest require such action. Amendments or revisions can result from significant changes to the certificate holder’s operating environment or safety concerns. Changes to policy or rules may also require an adjustment of OpSpecs. Regardless of who initiates the need for change, the air carrier must submit documentation sufficient for the FAA to evaluate the impact of the change on the air carrier’s configuration.

C.     Air Carrier‑Initiated Change. The air carrier may make application to amend its OpSpecs or programs by submitting an electronic proposal within the OpSpecs subsystem or a letter to the appropriate FAA office.

10-98   Review the Application for the Request. (See flowchart process step 1.3.) The PI or CPM determines whether the application package contains the required documentation in a form and manner acceptable to the Administrator before assessing and making a decision on the air carrier’s or applicant’s request.

A.     Initial Certification. Upon receipt of the formal application, the CPM reviews the package for content and quality and makes an initial determination of its acceptability. The CPM determines (1) whether the submitted material represents the air carrier’s scope of operations, and (2) whether the application package is of sufficient quality to allow for a productive formal application meeting and to begin the certification process.

B.     Established Air Carrier. Upon receipt of a request for new or changed scope of operation, the PI reviews any documents submitted as part of the request for content and quality. The PI should account for the complexity of the amendment to evaluate the change. The PI must review the application to ensure that it includes an explanation and any supporting document or information to accurately assess the nature and scope of the proposal. This information allows the PI to determine the effect the request will have on the air carrier’s system and oversight profile.

10-99   Accept the Application for the Request. (See flowchart process step 1.4.) The CPM or PI follows the appropriate FAA guidance to determine whether to accept or reject an air carrier’s application for a new or changed scope of operations.

A.     Initial Certification.

1)      Notification of Acceptance. If the application process is successful, the PI or CPM notifies the applicant in writing of acceptance of the formal application (see Figure 10‑12, Acceptance of Formal Application Letter).

10-100 Return the Application to the Air Carrier or Applicant for Modification. (See flowchart process step 1.6.) If the application requesting a new or changed scope of operation is incomplete or of insufficient quality, the PI or CPM must reject the entire application package and return it to the air carrier or applicant with a letter stating the reasons for its rejection. (See Figure 10‑13, Rejection of Formal Application Letter.)

10-101 Update the Application. (See flowchart process step 1.7.) The air carrier or applicant reviews the PI’s or CPM’s comments, makes the appropriate modifications, updates the application package, and resubmits the updated application.

10-102 Tailor the Elements and Questions to the Requested Scope of Operation. (See flowchart process step 1.5.) If the PI or CPM accepts the application requesting a new or changed scope of operation, the PI or CPM tailors the ATOS elements and Data Collection Tool (DCT) questions to the requested scope of operation using the Air Carrier Configuration Checklist.

A.     Air Carrier Configuration Checklist. The PI or CPM completes the Air Carrier Configuration Checklist by answering a series of questions about the air carrier or applicant that pertain to the actual or requested type or kind of operation; aircraft make, model, and series; certification and manufacture dates; powerplant and equipment; seating capabilities and accommodations; OpSpecs; personnel; and maintenance.

B.     Air Carrier Oversight Profile. Completing the configuration checklist creates an air carrier oversight profile, which is a tailored list of elements and questions applicable to an air carrier’s or applicant’s scope of operation. The PI or CPM can manually modify the profile in the event the air carrier has a unique situation that results in differences from the standard profile, such as a deviation or exemption. The PI or CPM must provide an explanation for all manual adjustments to the air carrier oversight profile.
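The checklist-to-profile mapping described above amounts to a simple lookup from Yes/No answers to applicable elements. The sketch below is illustrative only, not the FAA automation; it uses a few rows taken from Figure 10‑14, with the question text abbreviated.

```python
# Illustrative sketch of the Air Carrier Configuration Checklist mapping
# (Figure 10-14); not the FAA's automated tool. Question labels are abbreviated;
# element IDs come from the figure.
CHECKLIST = {
    "SFAR 36 authorization": ["1.3.12"],
    "approved reliability program": ["1.3.15"],
    "supplemental operations": ["2.1.5", "3.1.13", "4.2.11"],
    "flight attendants": ["4.2.4", "6.1.3"],
}

def oversight_profile(answers):
    """answers: checklist question -> True (Yes) or False (No).
    Returns the tailored list of applicable elements. Manual additions or
    removals (e.g., for a deviation or exemption) would each require an
    explanation, per the paragraph above."""
    profile = set()
    for question, applies in answers.items():
        if applies:
            profile.update(CHECKLIST[question])
    return sorted(profile)

profile = oversight_profile({
    "SFAR 36 authorization": False,
    "approved reliability program": True,
    "supplemental operations": True,
    "flight attendants": True,
})
# profile == ['1.3.15', '2.1.5', '3.1.13', '4.2.11', '4.2.4', '6.1.3']
```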

Figure 10‑12, Acceptance of Formal Application Letter

[FAA Letterhead]

[Date]

Mr. Rockwell J. Jones

President and CEO, MidSouth Airlines

601 Sky Harbor Blvd.

Little Rock, Arkansas 72202

Dear Mr. Jones:

Your formal application has been reviewed and is acceptable. Our acceptance of the application does not convey specific approval of the attachments. Specific approvals or acceptance of the attachments will be appropriately conveyed after a detailed evaluation by the Federal Aviation Administration certification team.

We look forward to working with your personnel in the continuation of the certification process.

Sincerely,

John T. Smith

Certification Project Manager


Figure 10‑13, Rejection of Formal Application Letter

[FAA Letterhead]

[Date]

Mr. Rockwell J. Jones

President and CEO, MidSouth Airlines

601 Sky Harbor Blvd.

Little Rock, Arkansas 72202

Dear Mr. Jones:

This office has reviewed your formal application for an air carrier certificate, dated ________. We are returning your application because of deficiencies in the following areas:

Resumes of Harvey Anderson, Director of Operations, and S.F. Whipley, Director of Maintenance, were not included in your application.

The compliance statement is incomplete. For example, 14 CFR section 121.XXX (Subject) was not addressed. Methods of compliance with this regulatory section are described in your company’s general manual attachment and should be appropriately referenced in the compliance statement. As previously discussed, all applicable regulatory sections must be addressed in the compliance statement.

The minimum equipment list (MEL) does not contain maintenance and operations procedures as required by the master MEL.

We are returning your letter of application with all attachments. You must submit a new formal application when you have corrected all discrepancies noted above and any other omissions that exist. Please contact us if we can be of any further assistance in clarifying the minimum requirements for your formal application.

Sincerely,

John T. Smith

Certification Project Manager


Figure 10‑14, Air Carrier Configuration Checklist

If the response to any of the following questions is Yes, then the elements listed with that question apply.

1. Does the certificate holder have Special Federal Aviation Regulation (SFAR) 36 authorization? (Element 1.3.12)

2. Does the certificate holder have a designated alteration station (DAS) authorization? (Element 1.3.13)

3. Does the certificate holder use an approved reliability program? (Element 1.3.15)

4. Does the certificate holder have an engine condition monitoring program? (Element 1.3.20)

5. Does the certificate holder participate in a parts pool agreement? (Element 1.3.21)

6. Does the certificate holder use short-term escalation authorization for borrowed parts that are subject to overhaul requirements? (Element 1.3.22)

7. Does the certificate holder use short-term escalation? (Element 1.3.23)

8. Does the certificate holder use coordinating agencies for suppliers’ evaluation (CASE)? (Element 1.3.24)

9. Does the certificate holder conduct supplemental operations? (Elements 2.1.5, 3.1.13, and 4.2.11)

10. Is the certificate holder authorized to conduct Category II instrument approach and landing operations? (and/or)

11. Is the certificate holder authorized to conduct Category III operations? (and/or)

12. Is the certificate holder authorized to conduct automatic approach and landing operations (other than Categories II and III) at suitably equipped airports? (Element 3.1.10)

13. Does the certificate holder use an approved electronic recordkeeping system and/or an electronic flight bag? (Element 3.1.11)

14. Does the certificate holder operate aircraft with exit seats as defined in section 121.585? (Element 3.1.6)

15. Does the certificate holder carry cargo? (Element 3.1.8)

16. Does the certificate holder employ aircrew designated examiners (ADEs)? (Element 4.2.10)

17. Does the certificate holder use an Advanced Qualification Program (AQP)? (Element 4.3.3)

18. Does the certificate holder use flight attendants? (Elements 4.2.4 and 6.1.3)

19. Does the certificate holder use dispatchers? (Elements 4.2.5 and 6.1.4)

20. Does the certificate holder use certificated repairmen? (Element 4.4.4)

21. Does the certificate holder conduct flight training in airplane simulators or airplane flight training devices? (Element 4.2.8)

22. Does the certificate holder outsource crewmember training? (Element 4.2.9)

23. Does the certificate holder conduct Class II navigation using multiple long-range navigation systems? (and/or)

24. Does the certificate holder conduct operations in Central East Pacific (CEP) airspace? (and/or)

25. Does the certificate holder conduct operations in North Pacific (NOPAC) airspace? (and/or)

26. Does the certificate holder conduct operations in North Atlantic minimum navigation performance specifications (NAT/MNPS) airspace? (and/or)

27. Does the certificate holder conduct operations in areas of magnetic unreliability? (and/or)

28. Does the certificate holder conduct north polar operations? (Element 5.1.7)

29. Does the certificate holder conduct extended range operations with two-engine airplanes? (Element 5.1.8)

30. Does the certificate holder conduct operations in reduced vertical separation minimum (RVSM) airspace? (Element 5.1.9)

31. Does the certificate holder have maintenance performed within the United States? (Element 6.2.1)

RESERVED. Paragraphs 10-103 through 10-116.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 2  Design and Performance Assessment Planning

Figure 10‑15, Module 2: Planning

10-117   Introduction. The planning process develops a risk‑based data collection plan to support design and performance assessments.

A.     The first step in this process is to assess the air carrier’s or applicant’s systems and operating environment for indications of hazards or conditions that may create hazards. This process helps to highlight any area on which to focus special oversight attention, and is used to prioritize the elements. The 28 risk indicators contained in the Air Carrier Assessment Tool (ACAT) help determine the special focus areas. (See Figure 10‑1.)

B.     The Comprehensive Assessment Plan (CAP) is a tool for planning, documenting, and tracking design assessments and performance assessments. The principal inspector (PI) or certification project manager (CPM) uses the CAP to adjust priorities and due dates of assessments, and to record the reasons for making adjustments.

10-118 Planning Data. (See flowchart process step 2.1.) The PI or CPM assembles a design or performance planning data package from the Safety Performance Analysis System (SPAS) and other sources. The operations research analyst (ORA) assists, as required. The planning data package provides data to consider when assessing an air carrier’s or applicant’s systems and operating environment for hazards. The data may be used to update the ACAT. The data are organized by risk indicator and by system, subsystem, and element. Another important source of planning data is input from Certificate Management Team (CMT) or Certification Project Team (CPT) members.

A.     Initial Certification Planning Meeting. The CPT holds an initial planning meeting following the formal application meeting to assess the applicant’s systems and operating environment. During the meeting, the CPT completes the initial ACAT and CAP using the application package and other data.

B.     Annual Planning Meeting. The CMT holds an annual planning meeting to (1) assess the air carrier’s systems and operating environment using the ACAT, (2) review the CAP, and (3) conduct recurrent training. This team‑based approach provides checks and balances. Other important goals for this meeting include building and improving team skills, establishing team norms, communicating CMT expectations, and sharing information. All CMT members should participate in the annual planning meeting whenever circumstances permit. Some members of the CMT may have to meet more often if they need to retarget planned data collection or collaborate on other oversight issues.

C.     Quarterly Planning Review. PIs conduct a quarterly planning review to determine whether a changed condition at the air carrier may impact the ACAT or the CAP. In many cases, action by the PI is not necessary. The air carrier’s performance history, including the results of recent design and performance assessments, is accessible through ATOS data packages and the Assessment Determination and Implementation tool. Performance history should be reviewed quarterly.

10-119 Assess Air Carrier/Applicant Risk and Update the ACAT. (See flowchart process step 2.2.) PIs and CPMs use a series of risk indicators to identify conditions in an air carrier’s or applicant’s systems or operating environment that may create hazards. The risk indicators contain inspector consideration word pictures and a risk scale that guide the PI in completing the tool. The PI uses the results of the ACAT as an input to determine the priority and planned due date of design and performance assessments in the CAP.

A.     Risk Indicators. Risk indicators are groupings of safety‑ and/or performance‑related data that reflect areas of potential hazards and are used to prioritize air carrier oversight plans.

1)      Risk Indicator Categories. Risk indicators are divided into two major categories (System Stability and Operational Risks), reflecting the fact that both internal and external events impact air carrier systems. Each category is subdivided into two subject areas. These subject areas focus the indicators on the operational, performance, and environmental risks most likely to impact an air carrier’s systems.
a)      The system stability category is divided into Operational Stability and Air Carrier Dynamics.

1.      Operational Stability—Organizational and environmental factors the air carrier cannot control directly, but can effectively manage to improve system stability and safety.

2.      Air Carrier Dynamics—Organizational and environmental factors the air carrier can directly control to improve system stability and safety.

b)      The operational risks category is divided into Performance History and Environmental Criticality.

1.      Performance History—Measures the results of the air carrier’s operations over time.

2.      Environmental Criticality—Aspects of the air carrier’s operating environment that may lead to or trigger a systemic failure.

2)      Risk Indicator Format. All risk indicators use a standard format. Each risk indicator contains a condition statement and background material, inspector consideration word pictures, a risk scale, references, possible data sources, and suggestions for determining the impact of that indicator on air carrier systems, subsystems, and elements. The complete set of risk indicators is at the end of this section.
3)      Inspector Considerations. The inspector considerations are word pictures for each risk indicator to provoke thought and aid assessment; they are neither exhaustive nor conclusive. A rating scale corresponds to the word pictures. The rating scale is designed so that 1 represents the lowest level of concern and 7 represents the highest level of concern. If the indicator does not apply, the lowest level of concern is assigned.

B.     Air Carrier Assessment Tool. The ACAT is an automated tool to record the PI’s or CPM’s assessment of elements using the risk indicators and calculate a risk score. The risk score is used to prioritize elements for planning certification and oversight activities.

1)      ACAT Frequency. The ACAT is completed during initial certification and updated when the air carrier certificate is issued. After that, the ACAT undergoes an annual update, a quarterly review, and a revision whenever there are major changes to the risk indicators.
2)      ACAT Comments. CMT members may review and comment on the ACAT at any time throughout the year. During each quarterly review, the PI should consider CMT member comments.
3)      Completing the ACAT. The PI or CPM reviews the risk indicators to determine if any of the conditions apply to the air carrier or applicant. The PI or CPM selects the appropriate word picture and risk value after determining that the condition exists and that the risk indicator applies to one or more of the air carrier elements. An alternative way to complete the ACAT is to review each element and determine if any of the risk indicators apply to a particular element. Either way, the PI or CPM should explain the rationale for a selection. The PI or CPM can complete this review in one sitting or save the ACAT as a draft and return to it after reviewing and gathering more data. After completing the initial ACAT, the PI or CPM revises only those risk indicator values that have changed. After making the necessary changes, the PI or CPM finalizes and saves the ACAT.
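The prioritization pattern described above can be sketched as follows. The actual ACAT is FAA automation and its formula for combining indicator values into a risk score is not given in this section, so the sketch assumes an element's score is simply the sum of the 1‑7 values of the indicators applied to it; the element IDs and indicator names come from this volume, but the numeric values are invented.

```python
# Illustrative sketch only: the real ACAT's scoring formula is not specified
# here, so the risk score is assumed to be the sum of applied indicator values.

def element_risk_score(indicator_values):
    """indicator_values: risk-indicator name -> value on the 1-7 risk scale
    (1 = lowest level of concern, also assigned when an indicator does not apply)."""
    for name, value in indicator_values.items():
        if not 1 <= value <= 7:
            raise ValueError(f"{name}: values must be on the 1-7 scale")
    return sum(indicator_values.values())

def prioritize(elements):
    """elements: element ID -> indicator_values. Returns (element, score) pairs
    sorted highest score first, as the CAP initially displays elements."""
    scored = {eid: element_risk_score(vals) for eid, vals in elements.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Invented indicator values for two elements from this volume:
ranked = prioritize({
    "1.3.15": {"Performance History": 5, "Environmental Criticality": 3},
    "4.2.8": {"Performance History": 2, "Air Carrier Dynamics": 2},
})
# ranked == [("1.3.15", 8), ("4.2.8", 4)]
```

The PI's rationale for each selected value would accompany the score; the sort order here corresponds to the initial element ranking the CAP displays before any manual priority adjustment.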

10-120 Do the Results of the ACAT Affect Either Design or Performance? (See flowchart process step 2.3.)

A.     Design Assessment. The PI or CPM should evaluate the results of a modified ACAT to determine the impact on the air carrier’s system design. If the ACAT results affect design assessment, the PI or CPM determines whether the assessment plan baseline dates or priorities need adjustment.

Note:   The following subparagraphs pertain only to initial certification.

1)      Elements Related to FAA‑Approved Programs or Operations Specifications. Elements that pertain to FAA‑approved manuals or programs, or that are related to operations specifications, are assigned the highest priority. Instructions for these Safety Attribute Inspections (SAIs) will state that the CPT must verify the supplied data for each question.
2)      Additional High‑Priority Elements. Additional elements that were identified as high priority in the CAP may be selected by the CPM. Instructions for these SAIs will state that the CPT must verify applicant‑supplied data.
3)      Remaining Elements. Instructions for all remaining elements will require, at a minimum, that the CPT verify applicant‑supplied data for the procedure attribute questions.

B.     Performance Assessment. If the ACAT results do not affect design, the PI or CPM determines if the ACAT results affect performance. If so, the PI or CPM should follow the process for planning performance assessment. If the ACAT affects neither design nor performance, adjusting the plan baseline dates or priorities may not be necessary.

10-121 Adjust Assessment Plan Due Dates and Priorities. (See flowchart process step 2.4.) The PI or CPM can adjust the design and performance assessment plan due dates and priorities based on the results of the ACAT or other factors.

A.     The Comprehensive Assessment Plan. The CAP is a 5‑year plan that contains, for each element, a planned design assessment and a planned performance assessment, using audit‑based intervals to determine when an assessment is due.

1)      Baseline Intervals.
a)      Design Assessment. The baseline interval for all design assessments is once every 5 years. Once the initial design assessment for an element is complete, another assessment must be scheduled within the next 5 years.
b)      Performance Assessment. The baseline interval for performance assessments is once every 6 months for high‑criticality elements, once a year for medium‑criticality elements, and once every 3 years for low‑criticality elements. After completing the initial performance assessment for an element, another assessment must be scheduled within the applicable baseline intervals.
2)      Baseline Dates.
a)      Design Assessment. Baseline dates are automatically generated 5 years after the quarter in which the previous design assessment was completed. A design assessment is complete when the data have been collected and reviewed, and the bottom‑line assessment with appropriate action is documented.
b)      Performance Assessment. Depending on the element’s criticality, baseline dates are automatically generated 6 months, 1 year, or 3 years after the quarter in which the previous performance assessment was completed. A performance assessment is complete when the data have been collected and reviewed, and the bottom‑line assessment with appropriate action is documented.
3)      Due Dates. The due date and the baseline date are the same unless the PI or CPM makes a manual adjustment to the due date based on ACAT results or other factors. The PI or CPM must provide an explanation for adjusting the due date. Changing due dates may extend or reduce the interval between assessments.
4)      Priority. In the CAP, all of the applicable elements are initially sorted and displayed by their ACAT risk score value from highest to lowest. Each applicable element is also assigned a priority number starting at 1 (for the highest priority). The PI or CPM can manually adjust the priority of any element. The PI or CPM must provide an explanation for manually adjusting an element priority.
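The baseline‑interval rules in subparagraphs 1) and 2) above can be sketched as follows. This is an illustration, not the CAP software; the "after the quarter in which the previous assessment was completed" rule is modeled here by starting the interval at the last month of that quarter, which is an assumption about how the automation rounds dates.

```python
# Illustrative sketch of the CAP baseline-interval rules; not the actual CAP
# automation. Dates are normalized to the first day of a month.
from datetime import date

PERFORMANCE_INTERVAL_MONTHS = {"high": 6, "medium": 12, "low": 36}
DESIGN_INTERVAL_MONTHS = 60  # all design assessments: once every 5 years

def quarter_end_month(d):
    """First day of the last month of the calendar quarter containing d."""
    end_month = ((d.month - 1) // 3) * 3 + 3
    return date(d.year, end_month, 1)

def add_months(d, months):
    """Shift d forward by a whole number of months (day fixed to the 1st)."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, 1)

def next_baseline(completed, kind, criticality=None):
    """Baseline date for an element's next assessment.
    kind: "design" or "performance"; criticality applies to performance only."""
    if kind == "design":
        months = DESIGN_INTERVAL_MONTHS
    else:
        months = PERFORMANCE_INTERVAL_MONTHS[criticality]
    return add_months(quarter_end_month(completed), months)

# A design assessment completed in February 2008 baselines 5 years after that
# quarter; a high-criticality performance assessment baselines 6 months after it.
next_design = next_baseline(date(2008, 2, 15), "design")             # 2013-03
next_perf = next_baseline(date(2008, 2, 15), "performance", "high")  # 2008-09
```

The due date would start equal to the baseline date; a manual adjustment by the PI or CPM, with its required explanation, would then override it.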

B.     Initial Certification. The initial CAP for an applicant contains a planned design assessment and performance assessment for each applicable element. The CPM uses the following guidance to develop the initial CAP.

1)      Adjustments to Comprehensive Assessment Plan Due Dates. Initially, all design and performance assessments have a due date in the quarter in which the initial CAP is finalized. During certification, the CPM manually adjusts the dates based on the applicant’s schedule of events. The baseline assessment plan for the CMT should divide assessments equally among the four quarters of each year.
a)      Design Assessment. After the applicant has been certified, the new CMT needs to establish a plan that balances the design assessment activities over a 5‑year period. The PI should plan to complete design assessments for about one fifth of the applicable elements each year.
b)      Performance Assessment. Performance assessments are balanced over a 3‑year period. The PI should plan to complete performance assessments for high‑criticality elements within 6 months and for medium‑criticality elements within 1 year, and should plan to complete about one third of the low‑criticality elements each year.
2)      Adjustments to Comprehensive Assessment Plan Priorities. The initial CAP displays the elements sorted by the risk score derived from the ACAT starting with the element that has the highest potential risk. The CPM adjusts the design assessment priorities and enters an explanation for the adjustment, based on the following factors:
a)      Design Assessment.

1.      Air Carrier Programs Requiring FAA Approval or Acceptance. Elements that are associated with air carrier programs that specifically require documentation in FAA‑approved or accepted manuals are assigned the highest priority by the CPM and are accomplished by the CPT.

2)      Design Assessments Completed by the Applicant. The applicant must complete a self‑audit using the SAI. In addition to the design assessments required by the preceding paragraph, the CPM determines which additional applicant self‑audited elements the CPT will verify by completing a design assessment. All other elements will be reviewed and accepted.

b)      Performance Assessment. Air Carrier Programs Requiring Demonstration. The CPM assigns the highest priority to air carrier programs with elements that specifically require demonstration in tabletop exercises, simulated evacuations, and proving tests. The CPT reviews such demonstrations.

C.     Continued Operational Oversight. The CAP for an air carrier contains a planned design and performance assessment for each applicable element. The PI uses the following guidance to adjust the CAP.

D.    Adjustments to Comprehensive Assessment Plan Due Dates or Priorities.

1)      The PI can adjust the design assessment due date or priority for any element with an explanation of the reason for the adjustment.
2)      The PI may consider any of the following in determining if an adjustment is required:
a)      The results of a bottom‑line assessment for that element.
b)      The status of the element in relation to the 6‑month, 1‑year, or 3‑year performance assessment review cycle, or the 5‑year design assessment cycle.
c)      Any increase or decrease in the risk score over time.
d)      Assignment of design assessment completion dates based on the highest risk.
e)      New elements that are added as a result of major changes to air carrier operations.
f)      Work that was unassigned in the previous planning cycle and may have a higher priority in the current planning cycle.

Note:   One of the fundamental principles of ATOS is that resources should always be assigned to performance and design assessments with the highest risk-based priorities. This principle applies to the performance and/or design assessments due within the quarter currently being planned. It does not require adjusting due dates in future quarters based solely on the ranking of an element.

10-122 Are There Other Factors That Affect Due Dates? (See flowchart process step 2.5.) The PI or CPM should consider any other factor that may impact the plan due dates and priorities. Upon determining that such a factor exists, the PI or CPM may adjust either the priority or the due date and include an explanation. The PI or CPM utilizes his or her knowledge and experience in making this assessment. Examples include:

A.     The PI knows that a planned change to a program is coming in a future quarter.

B.     National guidance directs the CMTs to perform a focused design or performance assessment.

10-123 Determine the Data Collection Requirement for Each Performance Assessment. (See flowchart process step 2.6.) After considering all factors that affect performance, the PI or CPM determines the data collection requirements for each performance assessment.

A.     Element Performance Inspection. The Element Performance Inspection (EPI) Data Collection Tool (DCT) is used for performance assessments. The PI or CPM needs to determine how many EPI DCTs should be completed to obtain the data needed to assess air carrier performance for each element. Typically, an inspector is assigned only one EPI for a specific performance assessment and he or she may complete multiple activities as part of that single EPI. PIs or CPMs may assign EPIs to more than one inspector in a performance assessment. For example, PIs or CPMs may want to assign EPIs to different inspectors for each aircraft fleet, for certain geographic areas, or for different training programs.

B.     Outsource Oversight Prioritization Tool. The Outsource Oversight Prioritization Tool (OPT) is used for planning surveillance of air carrier outsource providers. Principal avionics inspectors and principal maintenance inspectors must use this tool during oversight planning. It allows for prioritization of outsource maintenance providers to help determine specific data collection requirements. The OPT assists the PI, other assigned inspectors, supervisors, and managers in identifying areas of concern or criticality regarding outsource providers and in targeting resources toward the highest-risk outsource maintenance providers.

1)      The OPT is also used as part of the Enhanced Air Carrier Outsourcing Maintenance and Repair Station Oversight System, along with the Repair Station Assessment Tool and the Repair Station Risk Management Process.
2)      The data resulting from the use of these tools resides in SPAS and may provide valuable information to help an air carrier PI plan data collection activities for outsourced maintenance.

C.     Off‑Hour Decision Aid. The PI must know when to initiate the use of the Off-Hour Surveillance Decision Aid and how to process the results. There need not be an hour-for-hour correlation between the amount of work the air carrier does off hours and the off-hour surveillance. The emphasis, however, needs to be on identifying potential issues related to the air carrier’s management and oversight of off-hour activities. Through data collection and analysis, this process and the associated decision aid assist the PI in identifying potential risk associated with off-hour activities conducted by the air carrier.

1)      The Off-Hour Decision Aid instructs the user to match the air carrier’s capability to manage its off-hour activities or programs against a series of word pictures that address several dimensions. Each dimension results in a score; taken together, the scores produce a scoring range that indicates the effectiveness of the operator’s off-hour activities. To properly complete the decision aid, the certificate management office (CMO)/CMT will need to apply its knowledge of the carrier along with its assessment of the prevalence and magnitude of the issues. The decision aid is designed to assist in these assessments and subsequent action planning.
2)      Examples of useful data sources include EPI 1.3.7 (Outsource Maintenance) and/or element 1.3.4 (Required Items of Inspection (RII)), with special instructions requiring inspectors to identify the amount and type of activities occurring off hours at the repair station (or other outsource maintenance facility). A Constructed Dynamic Observation Report (ConDOR) may be preferred when more specific information about the potential hazard or risk is needed. The PI may also contact the air carrier for information regarding its off-hour activities, including the types of activities occurring during off hours, the location of off-hour activities, the management and supervision of off-hour activities, and the interface with outsource maintenance and other contracted activities.
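The scoring pattern of the decision aid might be sketched as follows. The actual dimensions, word pictures, and scoring ranges belong to the FAA tool and are not reproduced in this section; every dimension name and threshold below is invented purely for illustration.

```python
# Hypothetical sketch of the Off-Hour Decision Aid scoring pattern. The real
# dimensions, word pictures, and ranges are defined by the FAA tool; the
# dimension names, 1-5 scale, and thresholds here are invented.

DIMENSIONS = ("management_of_off_hour_work", "supervision", "outsource_interface")

def off_hour_score(dimension_scores):
    """dimension_scores: dimension -> word-picture value (assumed 1-5 scale).
    Returns the combined score; higher indicates less effective management."""
    missing = [d for d in DIMENSIONS if d not in dimension_scores]
    if missing:
        raise ValueError(f"unscored dimensions: {missing}")
    return sum(dimension_scores[d] for d in DIMENSIONS)

def effectiveness_band(total):
    # Invented thresholds; the real decision aid defines its own scoring ranges.
    if total <= 5:
        return "effective"
    if total <= 10:
        return "monitor"
    return "consider targeted surveillance"

band = effectiveness_band(off_hour_score({
    "management_of_off_hour_work": 4,
    "supervision": 3,
    "outsource_interface": 4,
}))
# band == "consider targeted surveillance"
```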

10-124 Document Data Collection Requirements Using the Detailed Work Instruction. (See flowchart process step 2.7.) The PI or CPM documents the data collection requirements using detailed work instructions after considering all the factors affecting design or performance. For performance assessment, instructions help to ensure that timely, high‑quality data collection is focused in the areas the PI or CPM determines.

A.     Instructions Help Ensure Timely, High‑Quality Data. The CAP provides PIs with a plan that is tailored to the current oversight requirements for a specific air carrier.

B.     Instructions Prioritize Activities and Set Timelines for Start and Completion of the Activities by Certain Dates. The data collection period should be prior to the assessment due date to allow time for the PI or CPM to review the data and make an assessment.

C.     Resource Requirements. PIs and CPMs should provide specific instructions to assist the frontline manager in making assignments, including any special expertise, training, or background the inspectors assigned to the activity should have.

D.    Design Assessments. The PI or CPM determines if a team or individual inspector should complete the SAI.

E.     Performance Assessments. Instructions ensure that PIs or CPMs obtain the data they need to assess the element. Timely, high‑quality data are focused in the areas that the PI or CPM determines. PIs should provide specific instructions to ensure that activities are performed at appropriate locations and times for comprehensive and timely response to the questions. PIs or CPMs can use instructions to supplement or clarify DCT questions.

F.      Other Relevant Information That the Instructions May Include.

1)      Location of the air carrier system documentation.
2)      Focused area of assessment.
3)      Type of fleet if there are program differences.
4)      Distressed carrier, labor unrest, or other risk indicator information.
5)      Information obtained from the Assessment Determination and Implementation (ADI) tool.
6)      Any supplementary information the principal needs to make the assessment.

G.    Examples of Quality Work Instructions.

1)      Design Assessment. This SAI will start on or about (date). Perform at (airline name) headquarters and the Certificate Management Office. Complete a comprehensive review of the DCT and the applicable (airline name) policy and procedures. The team coordinator will schedule and conduct a meeting to discuss coordination of activities. The (airline name) point of contact will be determined prior to the start of the assessment. Complete and save to the master record by April 30, 2005.
2)      Performance Assessment. This EPI will start on or about (date). Complete a comprehensive review of the DCT and review the previous SAI to become familiar with the applicable (airline name) policy and procedures. Perform activities at a minimum of five locations, including ORD and MCO. Complete and save to the master record by (date).

10-125       Other Data Collection Requirements. It is possible that a design or performance assessment is not required, but that some additional data are necessary either to aid the decisionmaking process or to update the SAI question responses and comments. When appropriate, Constructed Dynamic Observation Reports (ConDOR) should be considered along with other planned oversight. The ConDOR may be appropriate in the following instances:

A.     To evaluate program, policy, or regulatory changes.

B.     To address focused or unique situations in response to local, regional, or national requirements.

C.     To collect targeted data for specific areas of immediate concern.

D.    As an action item in the RMP action plan.

E.     To document minor changes to the air carrier’s system (e.g., changes to the individual identified by the certificate holder as having responsibility and/or authority over the process).

F.      If the air carrier presents a revision to a manual that only changes a reference, a design assessment may not be necessary. The PI or CPM can generate a ConDOR instead of planning for a design assessment if a design assessment has already occurred.

10-126 Retargeted Comprehensive Assessment Plan. Throughout the year, the CMT collects, reviews, reports on, and analyzes performance data and environmental factors so PIs can make timely and consistent adjustments to the CAP.

A.     CMTs must obtain and continually monitor data for air carriers experiencing growth, financial distress, personnel reductions, labor unrest, and other organizational changes. The office manager or PI should not wait for formal notification of a problem before acting to identify potential hazards and their associated risks.

B.     Reports such as the quarterly published Commercial Aviation System Profile, accessible on the SPAS homepage, may assist with this determination.

C.     If performance data or environmental factors indicate that the air carrier is experiencing significant changes in the operating environment that may affect its ability to balance resources, size, and organizational structure with operational requirements, then the CMT must use the Financial Condition Assessment Decision Aid and the Growth/Downsizing Assessment Decision Aid located in the Inspector’s Handbook to determine if it is necessary to retarget design assessments or initiate the ATOS RMP.

D.    Retargeting oversight is an integral part of the dynamic CAP. The PI determines any oversight retargeting requirement when oversight or external data identifies problems. Retargeting does not automatically delete or remove any information contained in the current CAP. It is not negative, nor does it mean the original CAP has flaws. It is normal for a CMT to retarget oversight several times a year based on the analysis of data or changing circumstances.

1)      Retargeting oversight allows for continuous update of the plan based on the analysis of data the CMT members collect. This should not be a response to CMT internal considerations such as staffing or budget constraints.
2)      The PI can retarget CAP oversight as often as necessary, but should not retarget on a calendar basis to close out a planning cycle. Continuously retargeting oversight to the same elements within a CAP upon identifying performance deficiencies is ill‑advised. Deficiencies warrant conducting a design assessment for those elements.
3)      If oversight retargeting is appropriate to focus additional resources in an area of concern, the PI must determine which elements of the ACAT are related to the area of concern. The PI can retarget oversight for the entire air carrier or for selected systems, subsystems, and elements.

Figure 10‑16, Risk Indicator Quick Reference Guide

Risk indicators are a first step to identification of hazards that should be addressed by the air carrier.

ENVIRONMENTAL CRITICALITY (EC)—Aspects of the air carrier’s surroundings that may lead to or trigger a systemic failure with the potential of creating an unsafe condition.

EC‑01

Age of Fleet

The age of an air carrier’s fleet can have an impact on its systems.

EC‑02

Varied Fleet Mix/Configuration

A varied fleet mix and/or mixed fleet configuration can significantly alter an air carrier’s safety profile and the potential for failure in its systems.

EC‑03

Change in Aircraft Complexity

Changes to the complexity of an air carrier’s fleet can significantly affect an air carrier’s safety and the potential for failure in its systems.

EC‑04

Outsource (Maintenance, Training, Ground Handling)

The use of outsourcing programs, depending upon a number of factors, could heighten the risks associated with various air carrier operations. These programs must be effectively managed.

EC‑05

Seasonal Operations

Short-term operations may present their own unique risk(s) and may require attention and preparation by the air carrier.

EC‑06

Relocation/Closing of Facilities

Relocating or closing a facility may adversely affect operational and system stability at an air carrier.

EC‑07

Lease Arrangements

Aspects of lease arrangements may be sources of risk at air carriers and must be effectively managed.

EC‑08

Off‑hours Activity

Air carrier management of off‑hour activity (i.e., activity outside normal FAA duty hours, including weekends) can be prone to risk.

PERFORMANCE HISTORY (PH)—Results of the air carrier’s operations over time.

PH‑01

Enforcement Actions

Enforcement actions can help identify the air carrier’s safety profile and any area of risk in its systems.

PH‑02

Accidents/Incidents/Occurrences

Data regarding accidents, incidents, and occurrences may provide insights into areas of risk at an air carrier.

PH‑03

Department of Defense Audits

Department of Defense audit findings help to identify hazards and their associated risks. Audit data may provide insights into systemic problems in the design and performance of an air carrier’s systems.

PH‑04

Self Disclosures

The type and content of an air carrier’s self disclosures and the effectiveness of its corrective actions can assist in risk assessment.

PH‑05

Safety Hotline/Complaints

Excessive or repetitive safety hotline and other complaints against an air carrier may assist in identifying and isolating areas of risk. Complaints can aid the air carrier in managing and controlling corrective and followup actions.

PH‑06

Voluntary Programs Data

Air carrier voluntary program data may be useful for hazard or risk identification. Such data can aid the air carrier in managing corrective and followup actions.

PH‑07

Surveillance Indicators

Surveillance data from the Safety Performance Analysis System, Program Tracking and Reporting Subsystem, and the Air Transportation Oversight System help to identify trends in air carrier performance and can assist with identifying risks in an air carrier’s system design.

OPERATIONAL STABILITY (OS)—Organizational and environmental factors the air carrier cannot directly control, but can manage effectively to improve system stability and safety.

OS‑01

Key Management SPAS Indicators

Changes in key management personnel can significantly impact an air carrier’s system and operational stability.

OS‑02

Financial Conditions

Air carriers that experience adverse financial conditions may have higher risk.

OS‑03

Change in Air Carrier Management

Changes in management personnel other than key management can significantly impact an air carrier’s system and operational stability.

OS‑04

Turnover in Personnel

A high turnover of operations or maintenance personnel can dramatically increase the potential for risk in an air carrier’s systems.

OS‑05

Reduction in Workforce

A reduction in the air carrier’s workforce can dramatically increase the potential for failure in an air carrier’s systems.

OS‑06

Rapid Growth/Downsizing

Times of significant change such as rapid expansion or downsizing can impact air carrier operations due to the possible misalignment of resources and operational requirements.

OS‑07

Merger or Takeover

Air carriers must effectively manage mergers or takeovers to ensure continued compliance and safe operating practices.

OS‑08

Labor–Management Relations

A poor or deteriorating labor–management relationship can create risk.

AIR CARRIER DYNAMICS (CD)—Organizational and environmental factors that the air carrier can directly control to improve system stability and safety.

CD‑01

New/Major Changes to Program

Safety issues may develop from new or changed programs and may increase the potential for noncompliance with existing processes and controls.

CD‑02

Continuing Analysis and Surveillance System (AW Only)

Air carriers with a poorly functioning Continuing Analysis and Surveillance (CAS) System can overlook and improperly manage increased levels of risk.

CD‑03

Safety Management

Air carriers that do not have a safety management system may not understand or adequately control hazards to operational safety.

CD‑04

Relationship with the FAA

The air carrier’s relationship with its assigned Federal Aviation Administration personnel may provide insights into the air carrier’s compliance posture and safety culture.

CD‑05

Human Factors

Risk may exist due to human lapses in the air carrier’s design and/or performance.



Figure 10-17, EC‑01 Age of Fleet

Condition: The age of an air carrier’s fleet can have an impact on its systems.

Background: Currently, the average jet in the U.S. commercial fleet is 13 years old. From the FAA’s perspective, aging aircraft are defined as aircraft of any make or model that are 15 years or older. Many aircraft of the current U.S. commercial fleet are considered aging aircraft. The age of the fleet must be considered when developing a surveillance plan.

Risk Score

Inspector Considerations

1–2

The air carrier does not have an aging fleet. The air carrier’s procedures, controls, interfaces, and process measurements appear to adequately address the requirements of an aging fleet.

3–5

Concern exists regarding the air carrier’s ability to manage an aging fleet due to considerations such as:

·         Inadequate management of operational risk associated with aging aircraft.

·         Inadequate aging aircraft identification process.

·         Ineffective internal and/or external communications flow regarding the maintenance and operational requirements of an aging fleet.

·         Inadequate corrective action plan.

·         Air carrier maintenance program and related infrastructure inadequate to meet the enhanced requirements associated with aging aircraft.

·         Air carrier has aircraft just approaching the timeline for aging aircraft, and dealing with aging aircraft is a new or unknown component of its profile.

6–7

Concern exists that the air carrier does not have a process to identify and evaluate the aging aircraft in its fleet. The air carrier has not demonstrated appropriate or effective management of the situation. Concern exists that the air carrier’s processes to manage an aging aircraft fleet appear to be inadequate at most levels.

Guidance References: Not Applicable

Data Sources: Aircraft Evaluation Group, operations specifications, air carrier and industry fleet lists, ASIAS, SPAS

Impact on ATOS System, Subsystem, and Elements: ATOS systems contain elements that may pertain to aging aircraft, particularly in the airworthiness systems, subsystems, and elements. Focus on the area of concern and attempt to relate it to the affected elements in these systems. Remember to consider interfacing and supporting elements in the maintenance system as applicable.



Figure 10‑18, EC‑02 Varied Fleet Mix/Configuration

Condition: A varied fleet mix and/or mixed fleet configuration can significantly alter an air carrier’s safety profile and the potential for failure in its systems.

Background: A varied and/or mixed fleet exists when an air carrier fleet includes: (1) a variety of different aircraft types; (2) a mix of models and/or series of the same type within the same fleet; (3) multiple types within the same fleet; and (4) a number of different configurations of the same make and model within the same fleet.

Many established carriers have long operated a varied fleet mix and/or mixed fleet configurations. The implications for operating this type of fleet may be more significant for new entrant carriers, where resources and infrastructure may be a major consideration.

Risk Score

Inspector Considerations

1–2

The air carrier does not have a varied fleet mix and/or mixed fleet configuration. The impact of varied fleet mix and/or mixed fleet configuration appears to be effectively managed by the air carrier’s procedures, controls and process measures. Any adverse impact appears negligible.

3–5

Concern exists regarding the impact of the varied fleet mix and/or mixed fleet configuration on the air carrier due to considerations such as:

·         Inadequate management structure to handle the impact of a varied fleet mix and/or mixed fleet configuration.

·         Potential for adverse impact on air carrier’s maintenance program.

·         Potential for adverse impact on air carrier’s operations program.

·         Potential for adverse impact on air carrier’s training program.

·         Air carrier lacks necessary system controls for the varied fleet mix and/or mixed fleet configuration.

·         Air carrier has little or poor performance history with varied fleet mix and/or mixed fleet configuration.

6–7

Concern exists that the air carrier does not have the overall infrastructure in place to support a varied fleet mix and/or mixed fleet configuration. The air carrier has not demonstrated appropriate or effective management.

Guidance References: New Aircraft Process Document (NAPD).

Data Sources: Operations specifications, company manuals, air carrier and industry fleet listing, Vital Information Subsystem (VIS) data, past surveillance findings, internal audit findings, human factor indications (identified weaknesses of the controls attribute during surveillance of an element), Mechanical Interruption Summary, air carrier crewmember irregularity reports.

Impact on ATOS System, Subsystem, and Elements: Operating this type of fleet can cut across all aspects of an air carrier’s operations: training, maintenance, human factors, ground‑handling operations, etc. When considering the impact on the ATOS model, consider the operational area of the concern, and tie it to the ATOS system(s), subsystem(s), and element(s) most closely related with the concern. Consider also related training manuals and management areas, and the risk impact of the varied/mixed fleet on these areas.



Figure 10‑19, EC‑03 Change in Aircraft Complexity

Condition: Changes to the complexity of an air carrier’s fleet can significantly affect an air carrier’s safety and the potential for failure in its systems.

Background: Complex aircraft generally incorporate more sophisticated technology. Innovative technology can increase or decrease the potential for noncompliance with existing processes and controls. A change in aircraft means that the carrier may need processes and procedures to support differences in both manual and automated complexities and technology. New aircraft technology can impact the air carrier infrastructure because of additional requirements for training, operations, and maintenance.

Risk Score

Inspector Considerations

1–2

The air carrier is not currently experiencing a change in aircraft complexities or technologies. The impact of aircraft complexity and technology appears to be effectively managed by the air carrier’s procedures, controls, and process measures. Any adverse impact appears negligible to nonexistent.

3–5

Concern exists regarding the impact of change in aircraft complexity due to inadequate infrastructure considerations such as:

·         Controls and process measures may be inadequate to manage the changes in operations and maintenance.

·         Air carrier personnel and others have little experience with the new technologies and/or the operational environment.

·         Dramatic changes in operating environment due to changed aircraft complexity.

·         Ineffective or no air carrier safety audits to assess the effectiveness of the air carrier’s training programs and operation within the air carrier environment.

·         Equipment/logistical support for new technology may be lacking.

·         Air carrier has little or poor performance history with incorporating complex aircraft.

6–7

Concern exists about adverse impact on system design or performance due to the change in aircraft complexity. The air carrier has not demonstrated appropriate or effective management of the situation.

Guidance References: FAA Order 8900.1, volume 3, chapter 9; Order 8900.1, volume 3, chapters 29 and 30; 8900.1, volume 6, chapter 2, section 18, Evaluation of Air Carrier’s Management of Significant Changes.

Data Sources: Aircraft Evaluation Group, aircraft manufacturer type‑certificate data sheets, FSB reports, CAS reports, MIS reports/SDRs, reliability reports, training records, line checks, Air Carrier Delay Summaries

Impact on ATOS System, Subsystem, and Elements: Aircraft configuration control, flight operations, and personnel training and qualifications ATOS systems all contain elements that may pertain to changes in aircraft complexity. Focus on the area of concern and attempt to relate it to the affected elements in these systems. Remember to consider interfacing and supporting elements in the Technical Administration and Manual systems as applicable.


Figure 10‑20, EC‑04 Outsource (Maintenance, Training, Ground Handling)

Condition: The use of outsourcing programs, depending upon a number of factors, could heighten the risks associated with various air carrier operations. These programs must be effectively managed.

Background: The aviation industry increasingly outsources traditional air carrier functions to independent contractors. Outsourcing has developed to the point where multiple levels of contractors could be involved in providing the service. Areas for outsourcing might include the maintenance function, the ground handling function, and training programs.

Risk Score

Inspector Considerations

1–2

The air carrier does not outsource. The air carrier’s oversight staffing and audit functions appear to be adequate to include the outsourced functions. The contractor(s) effectively meet the training requirements of the air carrier and appear to be qualified for the outsource function(s). The air carrier effectively manages impacts.

3–5

Concern exists regarding the impact of outsourcing due to considerations such as:

·         The contract personnel are utilized by numerous air carriers, increasing the possibility of non‑adherence to procedures.

·         The contract personnel training records are inaccurate.

·         Adverse Department of Defense (DOD) findings against the contractor.

·         Contractor qualifications and abilities (maintenance, training, and/or ground handling) are in question.

·         The air carrier frequently changes contractors based on economics.

—and/or—

·         The use of outsourcing for all or particular functions is relatively new at the carrier; therefore, lack of historical data is a major consideration.

6–7

Concern exists about the impact of outsourcing because the air carrier does not have an effective safety audit function to monitor the performance of the contractors. Concern exists because the contractor’s performance history indicates multiple, repeated safety violations.

Guidance References: FAA Order 8900.1, volume 3, chapter 49, Supplemental Information SAI 1.3.11 and EPI 1.3.11, SAI 1.3.7 AW, EPI 1.3.7 AW, SAI 4.2.9 OP, EPI 4.2.9 OP.

Data Sources: Contractor enforcement history, training records, IOE records, SPAS Repair Station Reports, DOD audit findings, air carrier past performance including element 1.3.7 AW and/or 4.2.9 OP.

Impact on ATOS System, Subsystem, and Elements: Evaluate if the risk identified is related directly to the dedicated ATOS elements 1.3.7 AW and/or 4.2.9 OP. Compare the functions outsourced with the structure of the ATOS air carrier system, subsystem, and element model. Choose the element(s) whose performance measures and design most closely reflect the outsourced work functions where risk has been identified. Evaluate across systems for elements where associated risk may also be present.


Figure 10‑21, EC‑05 Seasonal Operations

Condition: Short-term operations may present their own unique risk(s) and may require attention and preparation by the air carrier.

Background: Air carriers perform seasonal operations during a particular season or time of year to satisfy a short‑term need. Seasonal operations, while limited in nature, require as much or more preparation and attention to the quality and safety of the services provided as regular operations. (For example, air carriers that normally operate in a warm environment may engage in seasonal operations during the winter months when flying to and from ski resort areas. They must be prepared to manage aircraft deicing, training, and all of the associated requirements. If the air carrier does not normally fly this route, deicing may not be part of its regular operations.) Understanding the type and scope of the seasonal operation, as well as the impact on the air carrier’s systems, is critical in rating this indicator.

Risk Score

Inspector Considerations

1–2

The air carrier does not conduct seasonal operations. Procedures, controls, and process measures appear to effectively manage the impact of seasonal operations on the infrastructure of the air carrier systems. Any adverse impact appears to be negligible to nonexistent.

3–5

Concern exists regarding the air carrier’s ability to conduct seasonal operations due to considerations such as:

·         Air carrier has no past experience with seasonal operations.

·         The air carrier’s procedures, controls, and process measures do not effectively manage the impact of seasonal operations on its infrastructure.

·         Air carrier has encountered problems with seasonal operations in the past.

·         Air carrier does not conduct safety audits of its seasonal operations.

·         Incomplete integration of air carrier’s core business functions to the seasonal operation.

6–7

Concern exists that the air carrier is unable to manage the risks associated with the temporary change in its operational environment due to the seasonal operation, and controls and procedures do not support the operation or do not exist at all. The air carrier has not demonstrated appropriate or effective management of the situation.

Guidance References: Not Applicable.

Data Sources: OpSpecs; SPAS data; ATOS data repository; discussion with air carrier personnel

Impact on ATOS System, Subsystem, and Elements: Based upon the scope and type of seasonal operation(s) conducted and inspector assessment of possible location(s) for risk, identify corresponding ATOS system(s), subsystem(s), and element(s) which relate to that risk. Consider other system(s) that support the seasonal operation and might be burdened. Focus also on the elements dealing with training and management support for these areas.


Figure 10‑22, EC‑06 Relocation/Closing of Facilities

Condition: Relocating or closing a facility may adversely affect operational and system stability at an air carrier.

Background: Air carrier facility changes could include adding a new facility, closing a facility, and/or moving an existing facility to another site. A misalignment of resources to the requirements surrounding the change should be of concern and may indicate safety risk.

Risk Score

Inspector Considerations

1–2

The air carrier has not experienced any changes/relocations of facilities. Air carrier changes/relocations of facilities do not appear to disrupt or adversely impact the quality and safety of ongoing operations. The air carrier’s system, process measures, and controls are effective and correctly applied. The air carrier has demonstrated appropriate and effective management of events.

3–5

Air carrier changes/relocations of facilities may not be effectively managed. Concern exists due to considerations such as:

·         Air carrier already has poor past performance history with regard to relocation and closing of facilities.

·         Air carrier has not effectively managed changes to its facilities.

·         Rate and pace of change is inappropriate.

·         Change is not steady, not implemented over time, and not accompanied by appropriate training, documentation, and manual changes.

·         Carrier may not have adequate resources and training to support the change.

·         Background and experience of personnel at new facility in question.

6–7

Concern exists due to the adverse impact of air carrier changes/relocations of facilities, which may disrupt the quality and safety of ongoing operations. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 7, chapter 1, section 1, Accident Investigations; 8900.1, volume 8, chapter 5, section 6, Process Service Difficulty Report; 8900.1, volume 7, chapter 1, section 2, Incident Investigations and Occurrences.

Data Sources: SPAS data packages, FAA Accident Investigation Records, Investigation of Pilot Deviation Reports, Accident/Incident Corrective Action Records, Aircraft Accident/Incident Preliminary Notices, National Transportation Safety Board findings/reports, MIS reports, SDR reports, DCT 1.2.4 “Mechanical Interruption Summary Reports”


Impact on ATOS System, Subsystem, and Elements: If root cause(s) and/or failure of event(s) is generally known and validated, consider the system(s), subsystem(s), and element(s) in the ATOS structure that relate to the failure or root cause. Consider also the ATOS Training and Technical Administration systems. If the air carrier has not effectively and appropriately managed the events, consider also the ATOS Manuals and Technical Administration systems.


Figure 10‑23, EC‑07 Lease Arrangements

Condition: Aspects of lease arrangements may be sources of risk at air carriers and must be effectively managed.

Background: A lease is any agreement by a person (the lessor) to provide an aircraft to another person (the lessee) who will use the aircraft for compensation or hire purposes. These arrangements are increasingly used to meet market demands and seasonal operations. The variety of leasing arrangements entered into by an air carrier (e.g., wet lease, dry lease, interchange agreements) can have a significant impact on its maintenance, training, and operations programs and its overall safety. Interchange agreements can have a major impact on normal carrier operations; therefore, special attention during surveillance may be warranted when an air carrier is a party to this type of arrangement.

Risk Score

Inspector Considerations

1–2

The air carrier is not presently involved in any lease arrangements. The impact of any lease arrangements appears to be effectively managed by the air carrier’s procedures, controls and process measures. The air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists about the impact of the lease arrangements due to considerations such as:

·         Perceived difficulty in exercising operational control.

·         Air carrier audit functions fail to account for special circumstances of lease agreement.

·         Changes to flight and ground crew training to support the lease arrangement are broad.

·         Lack of sound controls and interfaces in new procedures required by lease agreement.

6–7

Concern exists regarding high adverse impact of the lease arrangement on air carrier systems, subsystems and elements. In the event of a wet lease arrangement, air carrier loss of operational control may be a concern. The air carrier has not demonstrated appropriate or effective management of the situation.

Guidance References: FAA Order 8900.1, volume 3, chapter 13, section 1, Evaluate Aircraft Lease/Interchange Agreement; 8900.1, volume 3, chapter 46; 8900.1, volume 12, chapter 2.

Data Sources: Lease agreement contract, air carrier promotional materials, Operations Specifications

Impact on ATOS System, Subsystem, and Elements: Lease agreements can affect most operational areas. Based on the nature of the lease arrangement and the concerns assessed, associate the corresponding ATOS system(s), subsystem(s), and element(s).



Figure 10‑24, EC‑08 Off‑Hours Activity

Condition: Air carrier management of off‑hour activity (i.e., activity outside normal FAA duty hours, including weekends) can be prone to risk.

Background: Oversight of 14 CFR part 121 air carriers requires insight into the entire range of activities performed by the air carrier. Since much of that activity is conducted during times that are not normal duty hours for FAA inspectors, the certificate‑holding district office (CHDO) must make a conscious effort to ensure that the air carrier has the means and methods to ensure that procedures are being followed and that the air carrier continues to be in regulatory compliance at all times. Potential factors that could contribute to problems during off‑hour activities include the possibility of diminished FAA presence; less‑than‑effective air carrier managerial oversight; unavailability of expert advice or other supporting resources (help desks, vendor support, etc.); incomplete exchange of information during shift change; personnel fatigue; and a range of other potential problems. The Off-Hour Surveillance Decision Aid is designed to assist in identifying risk. While some components of the decision aid scoring are primarily geared toward airworthiness activities, an assessment can also be made for flight/ground operations in conjunction with the decision aid. Examples of off-hours air carrier activity in the flight/ground operations arena include training conducted during off hours (midnight shift) or flight operations conducted primarily outside of normal FAA duty hours (e.g., overnight cargo operations).

Risk Score

Inspector Considerations

1–2

Off‑hours activity appears to be effectively managed by the air carrier’s procedures, controls, and process measures. Any adverse impact appears to be negligible to nonexistent.

—and/or—

Off-Hour Surveillance Decision Aid score of 57–80.

3–5

Concern exists about air carrier management of components of its off‑hour activity for considerations such as: (1) adequacy of any training for off‑hours personnel; (2) effectiveness of change‑over procedures; and (3) amount of off‑hours activity performed.

—and/or—

Off-Hour Surveillance Decision Aid score of 41–56.

6–7

Concern exists regarding the air carrier’s ability to manage off‑hours activity on a system level.

—and/or—

Off-Hour Surveillance Decision Aid score of 8–40.

Guidance References: FAA Order 8900.1, volume 10.

Data Sources: Off-Hour Surveillance Decision Aid, Comprehensive Assessment Plan notes, interviews with air carrier management and personnel, company maintenance records, flight schedules, employee duty and rest records, CAMI Human Factors reports.

Impact on ATOS System, Subsystem, and Elements: Based upon the CMT’s notes on the type and amount of off‑hour activity accomplished at the air carrier, associate the appropriate ATOS system, subsystem, and element structure with the risk concern(s) identified. If the Off-Hour Surveillance Decision Aid was used, focus on the components of the aggregate scoring with the most adverse scores, and relate them to the ATOS system, subsystem, and element model.

 


Figure 10‑25, PH‑01 Enforcement Actions

Condition: Enforcement actions can help identify the air carrier’s safety profile and any area of risk in its systems.

Background: Enforcement actions are the reported results of any administrative and/or legal enforcement that the FAA has taken against an air carrier and/or certificated personnel in response to regulatory noncompliance. A trend analysis is an important factor in assessing this risk indicator, as it can identify deteriorating systems and programs.

Risk Score

Inspector Considerations

1–2

The air carrier has no recent enforcement activity. The air carrier’s enforcement history does not appear to indicate failure in system controls and/or an adverse change in the safety profile. Any enforcements have been of low criticality and not of a repetitive nature.

3–5

The air carrier’s enforcement history may indicate stresses in system controls and/or an adverse change in its safety profile. Concern exists due to considerations such as:

·         Enforcements in critical areas.

·         Enforcements of a repetitive nature.

·         Enforcements that appear to indicate a negative trend.

·         Air carrier failure to initiate adequate corrective action and followup processes.

6–7

The air carrier’s enforcement history appears to indicate failure in system controls and/or a rapid and large‑scale deterioration in safety profile. The air carrier has not demonstrated appropriate or effective management of the event(s). Concern exists due to enforcements of high criticality, or multiple enforcements in repeated areas, or across the air carrier’s systems.

Guidance References: FAA Order 2150.3; 8900.1, volume 11, chapter 3, section 1, Program Overview; 8900.1, volume 7, chapter 7; 8900.10, volume 14, chapter 2, section 1, Compliance and Enforcement Special Consideration—Air Carrier; 8900.1, volume 11, chapter 3; 8900.10, volume 12, chapter 2.

Data Sources: SPAS, ASIAS, EIS.

Impact on ATOS System, Subsystem, and Elements: Consider the area(s) of the air carrier’s operation where the regulatory noncompliance(s) caused the enforcement action(s). Review the ATOS system, subsystem, and element structure for corresponding loci for the risk. Consider related training and management issues that may also contribute to root cause, as well as failures in interfacing systems. If it is determined that air carrier followup and corrective action for enforcements have been ineffective, evaluate the Key Personnel and/or Manual Management subsystem(s).


Figure 10‑26, PH‑02 Accidents/Incidents/Occurrences

Condition: Data regarding accidents, incidents, and occurrences may provide insights into areas of risk at an air carrier.

Background: To be most effective, this data should be analyzed in conjunction with the air carrier’s response, corrective action planning, and ongoing followup activities. Collectively, this information may provide a point‑in‑time measurement of the air carrier’s performance. Repeated events could be an indication of management and air carrier systems’ inability to resolve issues and implement corrective actions appropriately.

Risk Score

Inspector Considerations

1–2

The air carrier has not experienced any accidents, incidents, or occurrences.

Any occurrences/incidents were minor. The air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists regarding the air carrier’s accident, incident, and occurrence history due to considerations such as:

·         Repeated events in interfacing areas.

·         Increasing number of events.

·         Criticality of event(s).

·         Root cause(s) of the event(s) or inadequate analysis of root cause.

·         Inadequate air carrier controls.

·         Inadequate air carrier followup to event(s).

6–7

Concern exists because the air carrier has experienced accident(s) or incident(s) that appear to indicate a negative trend with escalation in severity. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 7, chapter 1, section 1, Accident Investigations, and section 2, Incident Investigations and Occurrences; 8900.1, volume 8, chapter 5, section 6, Process Service Difficulty Report.

Data Sources: SPAS data packages, FAA Accident Investigation Records, Investigation of Pilot Deviation Reports, Accident/Incident Corrective Action Records, Aircraft Accident/Incident Preliminary Notices, NTSB findings/reports, MIS reports, SDR reports, DCT 1.2.4 “Mechanical Interruption Summary Reports.”

Impact on ATOS System, Subsystem, and Elements: If the root cause(s) and/or failure of the event(s) are generally known and validated, consider the system(s), subsystem(s), and element(s) in the ATOS structure that relate to the failure or root cause. Consider also the ATOS Training and Technical Administration systems. If the air carrier has not effectively and appropriately managed the events, consider also the ATOS Manuals and Technical Administration systems.



Figure 10‑27, PH‑03 Department of Defense Audits

Condition: Department of Defense (DOD) audit findings help to identify hazards and their associated risks. The audit data may provide insights into systemic problems in the design and performance of an air carrier’s systems.

Background: The DOD Air Carrier Survey and Analysis Team monitors certificate holders that do business with the DOD. The DOD regulations, directives, and Commercial Air Carrier Quality and Safety requirements form the basis for the DOD surveillance auditing process. The audit is conducted every 2 years and is documented on a structured Air Carrier Operations Survey Checklist. While the structure of the DOD surveillance auditing process differs from the FAA process, the results provide a unique view of the air carrier, as the DOD is often an airline’s largest customer and the audits are a requirement of the contract between the DOD and the certificate holder.

Risk Score

Inspector Considerations

1–2

The air carrier is not a DOD carrier. There are no significant adverse DOD findings against the air carrier.

3–5

The DOD has placed the carrier on its Close Watch program for safety issues or has indicated other concerns for safety issues, and/or the DOD has recently recertified the carrier after temporary non‑use status.

—or—

The interval since the last complete DOD survey is longer than the standard 2 years, and/or the scope of that survey was not complete.

6–7

The air carrier is currently removed from the DOD list of qualified air carriers. The air carrier is currently on temporary non‑use status for safety issues.

Guidance References: Not Applicable.

Data Sources: SPAS, DOD inspection reports, DOD liaison, air carrier personnel.

Impact on ATOS System, Subsystem, and Elements: The inspector should evaluate and focus on those ATOS elements that correspond or relate to the areas of adverse DOD findings. Determine which aspects of the systems are affected by the adverse DOD findings. Further, determine what these impacts might mean in terms of additional surveillance requirements.


Figure 10‑28, PH‑04 Self‑Disclosures

Condition: The type and content of an air carrier’s self‑disclosures and the effectiveness of the air carrier’s corrective actions can assist in risk assessment.

Background: Self‑disclosures are intended to provide the air carrier with a means to generate safety information that may not be captured through the traditional reporting mechanisms. The details of the program are documented in AC 00‑58, Voluntary Disclosure Reporting Program, current edition. The self‑disclosure process provides the air carrier and its employees with a means by which they can disclose information and identify possible violations of 14 CFR. Self‑disclosure of this type of information may be a positive indication of the air carrier’s commitment to addressing safety problems and proactively identifying potential safety hazards. It may also be a positive indication of the air carrier’s emphasis on safety and willingness to better manage its safety profile. Self‑disclosure of problems by the air carrier to the FAA can also heighten the trust that exists between the two entities and is a visible demonstration of cooperation. Trust and cooperation between air carrier and FAA personnel can have a positive impact on quality and safety.

Risk Score

Inspector Considerations

1–2

There have been no recently filed self‑disclosures involving the air carrier. Self‑disclosures appear to have been effectively handled as an integral air carrier system component and the corrective action has augmented the air carrier’s safety profile.

3–5

Concern exists due to considerations such as:

·         Ineffective self‑disclosure process at the air carrier.

·         Poorly documented procedures at the air carrier for self‑disclosure process.

·         Management does not encourage use of self‑disclosure process.

·         Self‑disclosure process has not reduced problems or violations.

·         Implementation of controls following self‑disclosure is inadequate, or the implemented controls are unacceptable.

·         Flow of safety information surrounding self‑disclosure is inadequate.

·         Company employees unaware of self‑disclosure process.

6–7

Concern exists because there are multiple repeated self‑disclosures in high criticality areas and continued failure on the part of the carrier in implementing corrective and followup action. Concern exists because the air carrier’s self‑disclosures appear to indicate a negative trend with escalation in severity.

Guidance References: AC 00‑58, current edition; FAA Order 8900.1, volume 1, chapter 5.

Data Sources: FAA records, air carrier self‑disclosure packages, SPAS.

Impact on ATOS System, Subsystem, and Elements: Focus on the subject area and nature of the self‑disclosure(s). Drill down to the root cause(s) if possible, and relate it to the system, subsystem, element structure of the ATOS model. Evaluate interfacing and supporting systems, such as training or personnel. If the risk concern includes the lack of effectiveness of the air carrier’s corrective action, consider focusing also on the Manual Management and Key Personnel subsystems.


Figure 10‑29, PH‑05 Safety Hotline/Complaints

Condition: Excessive or repetitive safety hotline and other complaints against an air carrier may assist in identifying and isolating areas of risk. Complaints can aid the air carrier in managing and controlling corrective and followup actions.

Background: This indicator considers recorded charges of dissatisfaction brought against the air carrier by consumers, employees, vendors, other air carriers, and members of Congress. Complaints received from these entities that are related to air carrier or aircraft operations, maintenance, quality, stability, compliance, or safety may affect surveillance planning. Any complaint information, and the actions taken as a result of a complaint, provides an external view of how consumers and industry peers perceive the air carrier. This perspective may be of value during risk assessment and surveillance planning.

Risk Score

Inspector Considerations

1–2

There have been no safety hotline reports or other complaints against the air carrier. Any complaints have been relatively minor and do not appear to point to system weaknesses or human factors issues.

3–5

Concern exists due to considerations such as:

·         Multiple complaint data points in same area(s).

·         Single complaint in highly critical area of the air carrier’s operation.

·         Air carrier complaint resolution history has indicated that the air carrier does not have a strong corrective action plan and process for assessing, categorizing, and handling complaints.

6–7

Concern exists because there are multiple, repeated complaints in the same area revealing risk in a system, subsystem or element that appear to indicate a negative trend with escalation in severity. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 7, chapter 7; 8900.1, volume 11, chapter 3, section 1, Program Overview, section 2, Inspector Responsibilities and Procedures, and section 3, Complaint Processing.

Data Sources: FAA safety hotline data, Congressional inquiries and/or other complaints lodged against the carrier, DOT complaint statistics, comparable surveillance data, SPAS, PTRS reporting source codes for hotline and whistleblower programs.

Impact on ATOS System, Subsystem, and Elements: Focus on the subject area and nature of the complaint(s). Drill down to the root cause(s) if possible, and relate it to the system, subsystem, element structure of the ATOS model. Also evaluate interfacing and supporting systems, such as training or personnel. If the risk concern includes the lack of effectiveness of the air carrier’s corrective action, consider focusing on the Manual Management and Key Personnel subsystems.


Figure 10‑30, PH‑06 Voluntary Programs Data

Condition: Air carrier voluntary program data may be useful for hazard or risk identification. Such data can aid the air carrier in managing corrective and followup actions.

Background: The FAA encourages air carrier participation in voluntary programs. Many of these programs provide unparalleled data opportunities, and therefore more effective opportunities for risk identification and control. The FAA will pursue harmonization of these programs and their integration with ATOS. These programs include, but are not limited to, Internal Evaluation Programs (IEP), Aviation Safety Action Programs (ASAP), Flight Operational Quality Assurance (FOQA), the Voluntary Disclosure Reporting Program (VDRP), and Independent Flight Safety programs.

Properly managed, these programs help move safety management from a reactive mode to a data‑driven, proactive mode that continuously searches for accident precursors. Data sharing, collaboration, and open communication optimize the functioning of the oversight system and leverage resources to advance safety. If the air carrier participates in one or more of these programs, consider for this indicator the actual data from any internal risk management program(s) (IEP, CAS, VDRP, ASAP, etc.) and how it relates to ongoing identification and management of risk.

Risk Score

Inspector Considerations

1–2

The air carrier participates in voluntary programs. Data derived from the air carrier’s voluntary programs indicates apparent risk is well managed by air carrier systems.

3–5

Concern exists from the data derived from the air carrier’s voluntary program(s) due to considerations such as:

·           Repeated problem area(s) and/or adverse trending, particularly regarding procedures.

·           Signs of failure in interfaces and/or controls associated with the safety data.

·           Signs of failure in process measurements and/or responsibility/authority associated with the safety data.

6–7

Concern exists because data from the air carrier’s voluntary program(s) appear to indicate a rapid degradation of the air carrier’s critical systems and an apparent air carrier failure to address the associated risks. Concern exists because the air carrier’s voluntary programs data appear to indicate a negative trend with escalation in severity.

Guidance References: FAA Order 8900.1, volume 3, chapter 44; 8900.1, volume 6, chapter 2, section 30, Monitor Continuing Analysis and Surveillance Program/Revision; SAI 1.3.11; EPI 1.3.11; AC 120‑79, as revised; AC 120‑59, as revised; AC 120‑66, as revised; AC 00‑58, as revised.

Data Sources: Data from any internal risk management program(s) (Internal Evaluation Program, Continuing Analysis and Surveillance, Voluntary Disclosure Reporting Program, Aviation Safety Action Program, etc.).

Impact on ATOS System, Subsystem, and Elements: Associate the data that causes concern with the ATOS system, subsystem, and element model. Consider element(s) that might be affected, and/or element(s) relating to root causes.


Figure 10‑31, PH‑07 Surveillance Indicators

Condition: Surveillance data from SPAS, PTRS, and ATOS helps to identify trends in air carrier performance and can assist with identifying risks in an air carrier’s system design.

Background: Data from ATOS‑ and SPAS/PTRS‑based surveillance provides inspectors with insight into ongoing and new areas of risk in the air carrier’s performance. Continued adverse findings should lead to corrective action, but may also indicate risk that has yet to be fully mitigated. Inspector experience and judgment concerning areas of repeated or ongoing difficulty should also be treated as an equivalent data source when assessing this indicator.

Risk Score

Inspector Considerations

1–2

Insufficient data exists for this risk indicator, as defined above, to make a determination. ATOS/SPAS data trends for the air carrier do not indicate any area(s) or trend(s) of apparent risk, or any indicated risk is currently being mitigated by other means.

3–5

Concern exists in ATOS/SPAS data trends for considerations such as:

·           Decline in air carrier performance on reported surveillance.

·           Repeated items of concern in the same area(s) in data.

6–7

Concern exists because ATOS/SPAS data trends indicate major deviation from the baseline in critical operational and/or maintenance area(s), and the air carrier is not addressing the associated risk(s). Concern exists because ATOS/SPAS trending appears to indicate a negative trend with escalation in severity. Concern exists because oversight in this area has been postponed due to lack of FAA resources.

Guidance References: ATOS Automation User Guide, System Data Analysis Guide.

Data Sources: ATOS database, SPAS, inspector observation/knowledge of performance history.

Impact on ATOS System, Subsystem, and Elements: For negative SPAS trends related to PTRS surveillance findings, associate the problem area(s) with the corresponding ATOS system(s), subsystem(s), and element(s). For negative ATOS data trends, focus on the element(s) that are the source of the adverse trend. Consider also the related manuals, training, and personnel systems that support the element(s) adversely affected.


Figure 10‑32, OS‑01 Key Management SPAS Indicators

Condition: Changes in key management personnel can significantly impact an air carrier’s system and operational stability.

Background: The SPAS management indicator incorporates the SPAS performance measures related to changes in the following key management personnel: chief executive officer, chief inspector, chief pilot, director of maintenance, director of operations, director of safety, and general manager. Consider the size of the air carrier. The impact of SPAS indicators on small air carriers or a new entrant may be greater than on large, established air carriers. Key management personnel at a small air carrier may play multiple roles. High key‑management turnover could significantly impact the air carrier’s operational stability if no processes are in place to manage the change. Regardless of the number of years an air carrier has been in operation, the changes reflected in the SPAS indicators should be considered in light of their potential impact on system and operational stability.

Risk Score

Inspector Considerations

1–2

The air carrier has not experienced a change in its key management personnel. There may have been changes in key management personnel, but the SPAS flag state is Expected.

3–5

Concern exists due to considerations such as:

·           SPAS flag states are at the Concern or Advisory Threshold.

·           Changes in key management personnel appear to have adversely affected the air carrier’s programs, regardless of SPAS flag state.

6–7

Concern exists because of rapid and widespread changes in key management in nearly all departments and operational areas.

Guidance References: Not Applicable.

Data Sources: SPAS, air carrier interviews, labor unions, air carrier press releases, air carrier regulatory notifications.

Impact on ATOS System, Subsystem, and Elements: When concern exists regarding change in key air carrier management positions required by regulation, focus on the corresponding and interfacing ATOS element(s) for these positions. Consider also the impact on other ATOS system(s), subsystem(s) and element(s) depending upon the position jurisdiction and safety purview related to the changed position.


Figure 10‑33, OS‑02 Financial Conditions

Condition: Air carriers that experience adverse financial conditions may have higher risk.

Background: When an air carrier experiences financial instability, the possibility for risk may increase due to a number of complex and interrelated causes, factors, and results. These may directly or indirectly impact various aspects of the air carrier’s system, subsystem, and element structure, and may overlap with other risk indicators or become a causal factor for the generation of risk in other areas. FAA Order 8300.10, Airworthiness Inspector’s Handbook, volume 3, chapter 125, discusses in detail the issues of risk management for air carriers. Figure 125‑1 of that chapter includes a decision aid for scoring risk as it relates to financial condition. This decision aid should be used to arrive at periodic assessments at the air carrier. Adverse findings on the decision aid for an ATOS carrier lead to ACAT changes or the RMP. If the inspector arrives at adverse scoring, this risk indicator should be considered, as well as the other associated risk indicators alluded to in the scoring components of the Financial Condition Assessment Decision Aid.

Risk Score

Inspector Considerations

1–2

Financial Condition Assessment Decision Aid score of 72–90.

3–5

Financial Condition Assessment Decision Aid score of 46–71.

6–7

Financial Condition Assessment Decision Aid score of 9–45.
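
The three score bands above can also be captured in a small table‑driven lookup. This sketch is illustrative only; the names are hypothetical and not part of FAA guidance:

```python
# Band boundaries taken from the Financial Condition Assessment
# Decision Aid table above; valid scores run 9-90. Illustrative only.
FINANCIAL_BANDS = [
    (72, 90, "1-2"),   # decision aid score 72-90 -> risk score 1-2
    (46, 71, "3-5"),   # decision aid score 46-71 -> risk score 3-5
    (9, 45, "6-7"),    # decision aid score 9-45  -> risk score 6-7
]

def financial_risk_band(score: int) -> str:
    """Return the risk-score band for a decision aid score."""
    for low, high, band in FINANCIAL_BANDS:
        if low <= score <= high:
            return band
    raise ValueError("decision aid score must be between 9 and 90")
```

As with the Off-Hour Surveillance Decision Aid, a lower decision aid score indicates a less favorable assessment and therefore a higher risk score.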

Guidance References: FAA Order 8900.1, volume 6, chapter 2, section 18, Evaluation of Air Carrier’s Management of Significant Changes.

Data Sources: Periodic meetings between the FAA PIs and air carrier management, conversations with knowledgeable air carrier personnel, documentation received from the air carrier or other appropriate agencies (Securities and Exchange Commission, courts, banks, creditors, etc.), press, industry publications, ASIAS, etc.

Impact on ATOS System, Subsystem, and Elements: This particular risk indicator has a potentially close association with other risk indicators. When associating this indicator with the ATOS system, subsystem, element model, keep in mind that problems identified here may also have similar impact for other risk indicators as the risk indicator evaluation continues. Direct linkage might be possible between poor financial condition and the ATOS system, subsystem, element model. Consider any element(s) where budget cuts might have adverse impact.


Figure 10‑34, OS‑03 Change in Air Carrier Management

Condition: Changes in management personnel other than key management can significantly impact an air carrier’s system and operational stability.

Background: Middle management at a small air carrier may be primarily responsible for the quality of the air carrier’s systems, and any major changes could be significant. A large air carrier may have additional resources that can be relied upon when air carrier middle management personnel change. Regardless of size, the significance of the change in air carrier management should be assessed to determine the potential impact on the air carrier’s system and operational stability. The air carrier management may include personnel in the air carrier’s safety, quality assurance, engineering, operations, and maintenance departments. Changes in middle management in any of the air carrier’s major lines of business should be considered. Changes in administrative management should also be considered though they may not have the same level of impact.

Risk Score

Inspector Considerations

1–2

There have been no changes in air carrier middle management. Changes in middle management are not of high concern as there appears to be little to no impact on the air carrier’s system and operational stability. The air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists for adverse impact at the air carrier due to changes in middle management for considerations such as:

·         Change in middle management is sudden and due to employee dissatisfaction.

·         Change in middle management does not appear to be a controlled change.

·         A high rate of change in middle management within the maintenance and/or operations organizations.

·         New or remaining staff is being retrained or cross‑trained to perform the new or expanded functions.

·         Air carrier has not experienced change in middle management in its prior history.

6–7

Concern exists for adverse impact because of rapid and widespread changes in middle management in nearly all departments and operational areas. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: Not Applicable.

Data Sources: Air carrier information, labor information. Consultation with the air carrier or use of industry data may be helpful in identifying such changes and assessing their impact.

Impact on ATOS System, Subsystem, and Elements: Evaluate the organizational areas that the changed middle management positions oversee. Consider the impact of such dislocations on air carrier stability and operations, and focus on the ATOS system(s), subsystem(s) and element(s) corresponding to the changed middle manager’s oversight responsibilities. Evaluate whether and how the changes might also adversely impact the training and technical administration system(s).


Figure 10‑35, OS‑04 Turnover in Personnel

Condition: A high turnover of operations or maintenance personnel can dramatically increase the potential for risk in an air carrier’s systems.

Background: Turnover in personnel may affect only the maintenance or operations organizations, or there may be a significant loss of key personnel throughout the entire organization. Maintenance personnel include staff members directly involved in ensuring the quality of the maintenance organization. Operations personnel include staff members directly involved in ensuring the quality of air carrier operations, including crewmembers (pilots and flight attendants), dispatch, and training staff.

Risk Score

Inspector Considerations

1–2

The air carrier has not experienced significant turnover in personnel. Any turnover in personnel is minor and/or the air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists for adverse impact at the air carrier due to turnover in personnel because of considerations such as:

·         Turnover of personnel is sudden and due to employee dissatisfaction.

·         Turnover of personnel does not appear to be a controlled change.

·         High turnover in personnel within the maintenance and/or operations organizations.

·         New or remaining staff is being retrained or cross‑trained to perform the new or expanded functions.

6–7

Concern exists for adverse impact due to widespread and rapid personnel turnover in safety sensitive areas with very high impact on critical systems. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 6, chapter 2, section 10, Operator Trip Records Inspections (PTRS Code 1628); 8900.1, volume 3, chapter 34, section 1, Air Carrier Mergers and Acquisition of Air Carrier Operational Assets.

Data Sources: Direct queries of company management, union announcements, company newsletters, bulletins, etc.

Impact on ATOS System, Subsystem, and Elements: Evaluate the areas of the company’s operation that are losing personnel. Match the diminished job function(s) to the ATOS system, subsystem, and element model to assess impact. Consider also the impact on the training, manual, and technical administration systems. If inspector concern centers on retraining or cross‑training of employees, focus on the personnel training and qualification system.


Figure 10‑36, OS‑05 Reduction in Workforce

Condition: A reduction in the air carrier’s workforce can dramatically increase the potential for failure in an air carrier’s systems.

Background: Workforce reductions, layoffs, or buyouts may or may not have an impact on safety and the potential for noncompliance; it depends on how and why they occur, and who is involved.

Risk Score

Inspector Considerations

1–2

The air carrier has not experienced reduction in workforce. There has been little reduction in workforce and/or the air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists for adverse impact at the air carrier due to reduction in workforce for considerations such as:

·         Reduction of personnel does not appear to be a controlled change; the pace and rate of reduction are abrupt, haphazard, uncoordinated, or occurring over a very short timeframe.

·         Reduction in personnel within the maintenance and/or operations organizations.

·         Reduction affected the most experienced personnel and/or quality, safety, or training personnel.

·         Air carrier has not experienced a comparable reduction in workforce in its prior history.

6–7

Concern exists for adverse impact at the air carrier due to widespread and rapid personnel reduction in safety sensitive areas with very high impact on critical systems. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 6, chapter 2, section 18, Evaluation of Air Carrier’s Management of Significant Changes.

Data Sources: Direct queries of company management, union announcements, company newsletters, bulletins, etc.

Impact on ATOS System, Subsystem, and Elements: Evaluate the areas of the company’s operation that are losing personnel. Match the diminished job function(s) to the ATOS system, subsystem, and element model to assess impact. Consider also the impact on the training, manual, and technical administration systems.


Figure 10‑37, OS‑06 Rapid Growth/Downsizing

Condition: Times of significant change such as rapid expansion or downsizing can impact air carrier operations due to the possible misalignment of resources and operational requirements.

Background: Growth may be quite apparent in the addition of new aircraft, routes, and employees. It may also be less apparent; growth may be in the form of the addition of new programs or business practices. (For example: the addition of a repair station certificate, or an increase in aircraft utilization.) Downsizing may also be apparent in the reduction of aircraft, routes, and employees or in less apparent operational areas. It is also possible for an air carrier to simultaneously experience rapid change in the areas of growth and downsizing. (For example: an air carrier’s business plan could include a dramatic reduction in workforce with a simultaneous expansion in routes.) If organizational structures and support resources do not keep pace with the tempo of operations and the changes, safety problems can occur.

Risk Score

Inspector Considerations

1–2

The air carrier is not presently experiencing rapid growth or downsizing. Rapid growth or downsizing appears to be effectively managed by the air carrier’s procedures, controls and process measures. Any adverse impact appears to be negligible to nonexistent.

—and/or—

Rapid Growth/Downsizing Assessment Decision Aid score between 43 and 60.

3–5

Air carrier may not have properly implemented its rapid growth/downsizing plan. Concern exists due to considerations such as:

·         Delay, cancellation, and reliability rates.

·         MELs carried daily.

·         Turn times between flights and/or ground time for maintenance.

·         Crew rest requirements.

·         Misallocation of resources.

·         Adequacy of training and training department personnel.

—and/or—

Rapid Growth/Downsizing Assessment Decision Aid score between 25 and 42 related to growth, downsizing or simultaneous combination.

6–7

Concern exists since the air carrier is not using any projected business plan to govern the growth or downsizing.

—and/or—

Rapid Growth/Downsizing Assessment Decision Aid score between 6 and 24 related to growth, downsizing or simultaneous combination.

Guidance References: FAA Order 8900.1, volume 6, chapter 2, section 18, Evaluation of Air Carrier’s Management of Significant Changes.

Data Sources: Air carrier programs, OpSpecs, discussion with air carrier personnel, performance history of risk management programs, DCTs.

Impact on ATOS System, Subsystem, and Elements: Since rapid expansion, growth, or downsizing affects virtually the whole operation of an air carrier, it can be difficult to pinpoint specific areas in the operation that may present a risk. If the Rapid Growth/Downsizing Assessment Decision Aid (FAA Order 8900.1, volume 6, chapter 2, section 18, Evaluation of Air Carrier’s Management of Significant Changes, Figure 6-30) was used, focus on the word pictures where the individual scoring was low and consider any corresponding ATOS element(s).
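The Decision Aid score bands in the figure above map inversely onto the 1–7 risk score ranges. As a minimal illustrative sketch only (not part of this order; the band boundaries are taken directly from Figure 10‑37):

```python
def risk_band(decision_aid_score: int) -> str:
    """Map a Rapid Growth/Downsizing Assessment Decision Aid score
    (6-60; a higher score indicates better-managed change) to the
    corresponding risk score range from Figure 10-37. Illustrative only.
    """
    if not 6 <= decision_aid_score <= 60:
        raise ValueError("Decision Aid scores range from 6 to 60")
    if decision_aid_score >= 43:
        return "1-2"  # change appears effectively managed
    if decision_aid_score >= 25:
        return "3-5"  # plan may not be properly implemented
    return "6-7"      # no projected business plan governing the change
```

Note that the Decision Aid score is only one input; the inspector considerations in each band also apply independently of the score.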


Figure 10‑38, OS‑07 Merger or Takeover

Condition: Air carriers must effectively manage mergers or takeovers to ensure continued compliance and safe operating practices.

Background: A merger or takeover may include a combination of divergent corporate and organizational structures and safety cultures. Some merger or takeover transactions may simply be a name change, or may occur at a level that does not alter or impact safety sensitive operations. In these cases, the impact on system or operational stability may be minimal.

Risk Score

Inspector Considerations

1–2

The air carrier has not experienced a merger or takeover. The air carrier appears to be effectively managing the risk associated with the merger or takeover. Adherence to new processes, procedures and programs appears to be adequate. Any adverse impact is negligible to non‑existent, or unrelated to safety sensitive issues.

3–5

Concern exists for adverse impact at the air carrier due to considerations such as:

·         The impact of a merger or takeover may not be effectively managed by the air carrier’s procedures, controls and process measures.

·         The air carrier’s key personnel and others have little to moderate experience with the new type and complexity of the operation.

·         Evidence of inadequate interface between processes, procedures, and programs across departments.

·         Evidence of inadequacies in adherence to new processes, procedures and programs.

6–7

Concern exists for adverse impact because the air carrier has no transition plan in place for accomplishing the merger or acquisition. The air carrier’s key personnel and others have no experience with the new type and complexity of the operation. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 5, chapter 3, section 1, Application Phase: ATP Applicants Engaged in Operations Under Title 14 CFR; 8900.1, volume 3, chapter 34, section 2, Major Changes in Operating Authority; 8900.1, volume 2, chapter 1, section 2, Assignment of Federal Aviation Administration (FAA) Responsibilities.

Data Sources: Office of the Secretary of Transportation, media, air carrier personnel, labor unions, DCTs, SPAS.

Impact on ATOS System, Subsystem, and Elements: Look at system(s), subsystem(s) and element(s) associated with the risk by evaluating where the largest changes/program differences or most serious lack of interface could exist. Consider focusing also on the ATOS elements dealing with training and technical administration/management support for these areas of concern.


Figure 10-39, OS‑08 Labor–Management Relations

Condition: A poor or deteriorating labor–management relationship can create risk.

Background: Good labor–management relations are critical to the system and operational stability of the air carrier. A threatened or actual shutdown in operations can have an adverse economic impact on an air carrier as well as greatly affect the stability of an air carrier’s systems. Areas to consider include the status of the bargaining agreements between air carrier labor and management, the job function(s) of employee groups in an adversarial relationship with management, and the potential effect(s) on the air carrier’s systems, subsystems, and elements.

Risk Score

Inspector Considerations

1–2

The air carrier appears to have no adverse labor–management relations issues at this time. The air carrier has demonstrated appropriate and effective management of any labor‑management issues or events.

3–5

There appear to be difficulties in the air carrier’s labor–management relations.

Concern exists due to considerations such as:

·         Informational picketing is being conducted by employees against the air carrier.

·         Newspaper advertisements, billboards, or other media describing stalled contract negotiations or bargaining have been purchased by either the employee groups or the management of the air carrier.

·         Cross‑utilization of employees in safety sensitive areas due to the unrest (e.g., management working line functions in addition to their normal managerial responsibilities).

·         Requests for employee concessions by management and/or lower than industry average compensation and benefits.

6–7

Concern exists regarding direct adverse impact of poor labor–management relations on safety. Employee group(s) are actively conducting a work stoppage due to labor unrest at the air carrier. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 7, chapter 1, section 1, Accident Investigations, and section 2, Incident Investigations and Occurrences; 8900.1, volume 8, chapter 5, section 6, Process Service Difficulty Report.

Data Sources: SPAS data packages, FAA Accident Investigation Records, Investigation of Pilot Deviation Reports, Accident/Incident Corrective Action Records, Aircraft Accident/Incident Preliminary Notices, NTSB findings/reports, MIS reports, Service Difficulty Reports, DCT 1.2.4, “Mechanical Interruption Summary Reports.”

Impact on ATOS System, Subsystem, and Elements: If the root cause(s) and/or failure of event(s) are generally known and validated, consider the system(s), subsystem(s), and element(s) in the ATOS structure that relate to the failure or root cause. Consider also the ATOS Training and Technical Administration systems. If the air carrier has not effectively and appropriately managed the events, consider also the ATOS Manuals and Technical Administration systems.


Figure 10‑40, CD‑01 New/Major Changes to Program

Condition: Safety issues may develop from new or changed programs and may increase the potential for noncompliance with existing processes and controls.

Background: New or changed programs at an air carrier should be assessed to determine if and how they affect the air carrier’s operations, training, and maintenance systems. Some program changes are significant enough to require an OpSpecs amendment, such as Extended Twin‑Engine Operations (ETOPS), but some new or changed programs are less apparent. For example, a sales and marketing initiative adding destinations might drive a program change to increase fleet utilization and cause a reduction in ground time and aircraft servicing, which may induce safety‑related risks. Any new program or program change that affects the air carrier’s systems could have a significant impact on the air carrier’s safety profile.

Risk Score

Inspector Considerations

1–2

There are no new or major changes to air carrier programs. There are well‑established and maintained system controls, with fully documented procedures, which have allowed the air carrier to absorb new programs or program changes without affecting quality or safety. The air carrier has demonstrated appropriate and effective management of the situation.

3–5

Concern exists regarding adverse impact of new/major changes at an air carrier due to considerations such as:

·         Major changes to programs are inadequately described and documented.

·         Major changes are motivated by cost cutting.

·         The system control(s) of the affected department(s) do not have a sufficient degree of strength and comprehensiveness to support the new/changed program.

·         The air carrier’s staff size and capabilities do not meet the requirements of the changed program, and/or staff are not sufficiently trained.

·         The air carrier’s past performance history with new/changed programs.

6–7

Concern exists because the air carrier infrastructure supporting the safety‑related aspects of the new or changed program does not exist. The controls in place no longer support the changed program, or no controls exist for a new program. The air carrier has not demonstrated appropriate or effective management of the event(s), and concern exists for extremely adverse impact.

Guidance References: Not applicable.

Data Sources: Notifications by carrier, OpSpecs revisions, validation flight results, training program revisions, air carrier performance history.

Impact on ATOS System, Subsystem, and Elements: Identify the operational area where the change has taken place. In some cases, the new/changed program will correspond exactly with an Air Transportation Oversight System (ATOS) element. Some examples include ETOPS, Reduced Vertical Separation Minimums authorizations, Lower Landing Minimums, exit seating. Remember to also consider the interfacing ATOS systems, such as the Training, Manual, and Technical Administration systems. In other cases, the new/major change to program will not have a direct parallel with a specific element. In these cases, focus on the operational context of the change, and look for the element(s) that would most likely be impacted. Also consider supporting Manual, Training, and Technical Administration systems.


Figure 10‑41, CD‑02 Continuing Analysis and Surveillance (CAS) System (Airworthiness only)

Condition: Air carriers with a poorly functioning CAS system can overlook or improperly manage increased levels of risk.

Background: A CAS system provides the air carrier with an internal diagnostic and evaluation tool (audit and surveillance) for continuously monitoring and correcting deficiencies in its maintenance program through a system of ongoing data collection, data analysis, and trend reporting. When implemented and maintained within an environment that includes clear definition of responsibilities; process independence; management commitment; continuity; scheduled evaluation; corrective action and followup; and clear, concise, and available documentation, a CAS system can provide the air carrier with one critical means of ensuring management control over the maintenance organization. The CAS system is an integral piece of an air carrier’s comprehensive maintenance and inspection program. As such, it requires consistent oversight not only by the carrier’s responsible personnel but also by the principal inspectors. It is incumbent on the CMT to monitor this program and to identify and request corrective action for any shortcomings in the CAS system.

Risk Score

Inspector Considerations

1–2

The air carrier appears to have an effective CAS system in place and is utilizing the data generated therein to continuously monitor and correct deficiencies in its maintenance program. The air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists regarding the air carrier’s ability to manage a comprehensive CAS system due to considerations such as:

·         Recurring findings after the issues have been mitigated by CAS.

·         CAS corrective actions result in new problem(s).

·         Recurring CAS findings are associated with the same ATOS subsystem(s).

·         CAS results are inconsistent with other outside audit results, such as those conducted by FAA and DOD.

·         There is no internal audit system of the CAS, e.g., Safety Audit Program.

·         Air carrier has no CAS performance history.

6–7

Concern exists that the carrier does not have a functioning CAS in place, as indicated by IEP, safety program, or other equivalent program(s). The air carrier has not demonstrated appropriate or effective management of the situation.

Guidance References: FAA Order 8900.1, volume 3, chapter 44; 8900.1, volume 6, chapter 2, section 30, Monitor Continuing Analysis and Surveillance Program/Revision; AC 120‑16, as revised; AC 120‑59, as revised; AC 120‑79, as revised; Supplemental Information and DCT Content, SAI 1.3.11 and EPI 1.3.11.

Data Sources: Periodic reliability reports, monthly CAS meetings, surveillance reports from both SPAS and the ATOS database, DOD findings, air carrier past performance including element 1.3.11.

Impact on ATOS System, Subsystem, and Elements: In addition to the actual ATOS element dedicated to CAS (1.3.11), consider how the deficiencies identified in the CAS may impact other ATOS system(s), subsystem(s), and element(s).


Figure 10‑42, CD‑03 Safety Management

Condition: Air carriers that do not have a safety management system may not understand or adequately control hazards to operational safety.

Background: A safety management system is essentially a business management approach to controlling risk. A risk management system provides the company’s management with a detailed roadmap for monitoring safety‑related processes. Risk management systems may incorporate one or more voluntary programs, such as an Aviation Safety Action Program (ASAP) and an Internal Evaluation Program (IEP). Both of these programs have a strong relationship to the functions of safety assurance and safety promotion. Air carriers are encouraged to consider integrating these programs into their comprehensive approach to safety management. Safety management concentrates on the control of processes rather than on extensive inspection and remedial actions on end products. A good system will include safety policy, safety risk management, safety assurance, and safety promotion.

Risk Score

Inspector Considerations

1–2

The air carrier appears to have an effective safety management system in place and is utilizing the data generated therein to continuously monitor and correct deficiencies in its critical programs and to adapt to the carrier’s changing environment. The air carrier has demonstrated appropriate and effective management of the situation.

3–5

Concern exists regarding the air carrier’s ability to manage risk due to considerations such as:

·         The air carrier does not have policies and procedures that define a method by which risk is managed.

·         The air carrier does not have a risk management process to monitor its critical programs.

·         The carrier does not have a safety assurance process (equivalent to an Internal Evaluation Program).

·         The carrier does not promote safety; a safety culture must allow for communication and provide a means for employees to report safety deficiencies without fear of reprisal.

6–7

Concern exists that the carrier does not have a safety management system or an equivalent method to assess and mitigate safety risk. It does not have an effective method of monitoring its safety critical programs to ensure it is adapting to the carrier’s changing environment.

Guidance References: FAA Order 8900.1, volume 3, chapter 44; 8900.1, volume 6, chapter 2, section 30, Monitor Continuing Analysis and Surveillance Program/Revision; SAI 1.3.11 and EPI 1.3.11; AC 120‑79, as revised; AC 120‑59, as revised; AC 120‑66, as revised; AC 00‑58, as revised.

Data Sources: Periodic Reliability reports, monthly CAS meetings, surveillance reports from SPAS and the ATOS database, DOD Audit findings, Shared information and data collected from voluntary disclosure reporting programs.

Impact on ATOS System, Subsystem, and Elements: If weakness in a critical program has been identified, concentrate on the area(s) the critical program manages. Relate those area(s) to the corresponding ATOS system(s), subsystem(s), and element(s). Consider also focusing on the Manuals and Key Personnel subsystems.


Figure 10‑43, CD‑04 Relationship with the FAA

Condition: The air carrier’s relationship with its assigned FAA personnel may provide insights into the air carrier’s compliance posture and safety culture.

Background: Strong communication, a high level of trust, and a good working relationship between air carrier personnel and FAA personnel assigned to monitor the air carrier can have a positive impact on quality and safety. Conversely, a weak communications infrastructure and a lack of trust between parties can have a negative impact on air carrier operations, quality, and safety. This, in turn, can affect the stability of the air carrier’s systems, and may be an indication of risk.

Risk Score

Inspector Considerations

1–2

The air carrier appears to have a cooperative relationship with assigned FAA personnel.

3–5

Concern exists regarding the air carrier’s relationship with the FAA due to considerations such as:

·         Little or no history of strong two‑way communication between air carrier and assigned FAA personnel

·         Recent indications of unwillingness to share data and findings with assigned FAA personnel on the part of the air carrier

·         FAA recommendations and suggestions not welcomed by the air carrier

6–7

Concern exists due to apparent failure of cooperation with assigned FAA personnel in critical air carrier areas. The air carrier has not demonstrated appropriate or effective management of this situation.

Guidance References: FAA Order 8900.1, volume 3, chapter 3.

Data Sources: FAA personnel assigned to the air carrier, meetings with air carrier personnel, written correspondence between air carrier and CHDO.

Impact on ATOS System, Subsystem, and Elements: Consider which subsystem(s) and element(s) are directly affected by the perceived lack of cooperation and/or communication. (For example, if the company department/personnel overseeing flight attendants are not cooperative with the FAA, consider which ATOS system(s), subsystem(s), and element(s) may have associated adverse impact and risk due to that lack of cooperation.)


Figure 10‑44, CD‑05 Human Factors

Condition: Risk may exist due to human factors lapses in the air carrier’s design and/or performance.

Background: Human factors is an umbrella term for the myriad effects of human interaction with a system. When humans act within established parameters, the system functions as designed. When humans deviate from established parameters, the system is degraded, it moves into areas of unknown risk, and accidents and incidents can occur. To achieve the level of safety desirable in high-consequence operations such as airlines, an organization’s systems must function in a variety of circumstances without degradation. Standard operating procedures and controls can be introduced into the system to lessen the potential for human variability or to increase the system’s tolerance for human variability. A certificate holder’s failure to develop and implement effective standard operating procedures and controls could increase the level of risk in systems and processes.

Risk Score

Inspector Considerations

1–2

There were no observed human factors lapses at the air carrier, or any human factors lapses were minor. The air carrier has demonstrated appropriate and effective management of any events.

3–5

Concern exists that the air carrier has inadequate human factors design or performance due to considerations such as:

·         Human factors not integrated into air carrier’s training program.

·         Human factors not integrated into air carrier’s safety systems.

·         Air carrier process to ascertain root cause of human factors problems is ineffective.

·         Air carrier system controls are ineffective.

·         Substantial number and/or type of human factors performance errors have occurred at the air carrier.

6–7

Concern exists due to repeated and critical human factors performance errors that remain uncorrected and appear to indicate a negative trend with escalation in severity. The air carrier has not demonstrated appropriate or effective management of the event(s).

Guidance References: FAA Order 8900.1, volume 11, chapter 4, section 1; FAA Human Factors Policy, FAA Order 9550.8, appendix 1 and appendix 2; Human Factors Guide for Aviation Maintenance and Inspection 1998 version 3.0; FAA/AAM Human Factors in Aviation Maintenance and Inspection Research Phase Reports (1991–1999); Human Factors Issues in Aircraft Maintenance and Inspection Meeting Proceedings (1989–1998).

Data Sources: ATOS Data Collection Tools, particularly SAI Controls Section and EPI Performance Measures; SPAS and PTRS data; ASIAS database; NTSB database.

Impact on ATOS System, Subsystem, and Elements: Human factors issues could potentially impact every air carrier system, as all systems involve human input and interaction with the system. Lapses in human factors design may exist in Manual Management, Maintenance Organization, and Training Program. Consider the location of the concern. Focus on elements with past negative findings in SAI controls attribute and/or EPI performance measures. Consider if training and technical administration design or performance also contributes to the risk.

RESERVED. Paragraphs 10‑127 through 10‑141


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 3  Design and Performance Assessment Resource Management

Figure 10‑45, Module 3. Resource Management

Module 3. Resource Management flowchart

10-142 Resource Management. Resource Management is an ongoing process to ensure that available resources are assigned to the highest risk priorities identified in the Comprehensive Assessment Plan (CAP) for continuing operational safety.

A.     By comparing the prioritized CAP and available resources, managers ensure that available resources are assigned to tasks with the highest safety priority for a given quarter.

B.     Funding to complete the job is allocated at the same time the individual is assigned. If resources are not available, the manager leaves the work unassigned and documents the reasons why. The principal inspector (PI) or certification project manager (CPM) is notified that the work was not assigned. When insufficient resources are available to complete all the work, the frontline manager uses the CAP to establish priority when making assignments.

C.     Prioritizing and assigning resources based on risk is a critical aspect of the Air Transportation Oversight System (ATOS). CAPs are created independent of resources. Quarterly work programs consist of design assessments and performance assessments that are assigned to inspectors. Unassigned design and performance assessments eventually are documented as work not accomplished because resources are not available.
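The assignment logic in paragraphs A through C above can be sketched as follows. This is a hypothetical illustration, not FAA software: the `Assessment` fields and inspector identifiers are invented, and in practice the frontline manager’s judgment, not an algorithm, drives assignments.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Assessment:
    element: str                          # hypothetical ATOS element ID
    risk_priority: int                    # 1 = highest safety priority in the CAP
    assigned_to: Optional[str] = None
    reason_unassigned: Optional[str] = None

def assign_quarterly_work(cap: List[Assessment], available: List[str]) -> None:
    """Assign available inspectors to the highest-priority CAP items first.

    Work that cannot be resourced is left unassigned with a documented
    reason, mirroring paragraph 10-142B. Illustrative sketch only.
    """
    pool = list(available)
    for task in sorted(cap, key=lambda t: t.risk_priority):
        if pool:
            task.assigned_to = pool.pop(0)
        else:
            task.reason_unassigned = "resources not available this quarter"
```

Because the CAP is created independent of resources, any tasks left over after the pool is exhausted are exactly the “work not accomplished because resources are not available” described above.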

10-143 Is the Appropriate Resource Available? (See flowchart process step 3.1.) The frontline manager evaluates the CAP against the roster of the Certificate Management Team (CMT) or the Certification Project Team (CPT) to determine whether the appropriate resources are available to accomplish the assessment activities. The frontline manager considers scheduled leave, scheduled training, training requirements, and other potential constraints. For a certification project, the certificate‑holding district office (CHDO) and the regional office determine adequate resource availability during the initial evaluation per the Certification Services Oversight Process for Original Organizational Certifications document. The manager should consider the availability of AFS‑900 ATOS Certificate Management Office (CMO) certification section support and of regional and national specialists (e.g., resource pilot support).

A.     Roster Maintenance. The manager ensures that the roster accurately reflects CMT or CPT membership as active qualified, active nonqualified, or inactive.

1)      Active, qualified members are assigned to the CMT or CPT and meet the baseline training requirements for their assigned position. (See paragraph 3.)
2)      Active, nonqualified members are assigned to the CMT or CPT, but have not completed baseline training requirements. (See paragraph 3.)
3)      Inactive members are no longer assigned or available to the CMT or CPT.

B.     Certificate Management Team Staffing. A dedicated CMT is assigned oversight responsibility for each air carrier. The CMT develops and executes a CAP that is tailored to that air carrier. CMT staffing may include:

1)      Certificate Management Team Manager. The CMT manager is the office, section, or unit manager who is assigned overall responsibility for air carrier certificate management. The CMT manager is an advocate for ATOS policies, processes, and their integration into the business strategies and operations of the office. The manager ensures that inspector resources are assigned to the highest safety priorities for a given quarter.
2)      Frontline Manager(s). Frontline managers directly supervise, assign, and review the work of CMT members.
3)      Principal Operations Inspector, Principal Maintenance Inspector, and Principal Avionics Inspector. A PI should not be assigned to more than one Title 14 of the Code of Federal Regulations part 121 air carrier CMT. Each PI may have one or more assistant PIs.
4)      Data Evaluation Program Manager. A data evaluation program manager (DEPM) may be assigned to the CMT. In the absence of a DEPM, frontline managers serve as data reviewers. The DEPM reports to a frontline manager above the PI. The DEPM must be qualified as an air carrier inspector. A DEPM should be assigned to no more than four CMTs. Shared DEPMs report to only one CMT manager and only one frontline manager, as determined by their regional office.
5)      Aviation Safety Inspectors. All aviation safety inspectors (ASI) assigned to the air carrier certificate are members of the CMT. ASIs are generally located at the CHDO or CMO but can be shared among more than one CMT. Assigned ASIs can include those from the following areas of expertise: flight operations, maintenance, avionics, cabin safety, and dispatch. At least one Cabin Safety inspector (CSI) is assigned to each CMT with oversight responsibility of air carriers involved in passenger carriage. If the air carrier is an all‑cargo operation, the CMT must consider cabin safety issues if the air carrier has provisions or procedures for supernumerary personnel. Dispatch ASIs (DSI) may be shared resources, and priority assignment should be considered to support CMTs with oversight responsibility of air carriers involved in passenger carriage.
a)      Shared ASIs. Shared ASIs may be approved by a division when only one certificate‑holding region (and ideally when only one CHDO) is involved. An ASI that is a shared resource should not be assigned to more than four CMTs. Shared ASIs report to only one CMT manager and only one frontline manager, as determined by their regional office.
b)      Requirements for Remotely Sited Positions. Under certain circumstances, ASIs may be based in a location other than the CHDO or CMO. Regional division managers are responsible for establishing and approving remotely sited positions.

1.      These positions are only established for situations where the air carrier has very large, noncontract training or maintenance centers located far from the CHDO.

2.      A remotely sited position also may be necessary with the expectation of an ongoing full person‑year of data collection for design and performance assessments associated with the CAP.

3.      As the focus of ATOS is on systems‑based assessments rather than event‑ or activity‑based assessments, air carrier hubs and employee domiciles are not the sole consideration in this determination.

6)      Operations Research Analyst. An operations research analyst (ORA) is assigned to each CMT. Regional or national analysts may provide analytical support.
7)      Aviation Safety Technicians and Aviation Safety Assistants. If aviation safety technicians and aviation safety assistants are assigned to the air carrier certificate, then they are members of the CMT.

C.     Certification Project Team Staffing. A CPT is assigned to each initial certification project prior to the applicant initiating formal application. The CPT develops and executes a CAP that is tailored to that applicant.

1)      General Certification Project Team Requirements. For air carriers certificated to operate under part 121, existing part 121 principal inspectors are not used for new certification activities. Other ASIs currently assigned to a part 121 CMT may be used for the new certification activities only to the extent that existing operator oversight is not compromised. Available staffing for post certification should exist or be reasonably projected to be available through reassignments or merit promotion selections. The staffing should comprise dedicated PIs who are not assigned to other 14 CFR part 121 CMTs of any complexity. Only inspectors in an air carrier position description are used for air carriers certificated for part 121 operations.
2)      CPT members include:
a)      Certification Project Manager. The CHDO manager designates one member of the certification team to serve as the CPM. The person designated as CPM should have completed the baseline training and should have previous experience in certifying an air carrier under part 121. It is desirable that a person with PI experience be designated as the CPM.
b)      Certification Team Leader. The AFS‑900 ATOS CMO Certification Section assigns a certification team leader and team members to each certification project. This person works with the CPM to communicate and coordinate all certification team activities and ensure the CPD is followed.
c)      Certification Team Members. The certification team should consist of at least an Operations inspector, a Maintenance inspector, and an Avionics inspector. At least one CSI is assigned to each certification project involving passenger carriage. If the certification is for a cargo‑only operation, the certification team must consider cabin safety issues if the applicant has provisions or procedures for supernumerary personnel. Utilizing a Dispatch inspector is recommended. For each proposed aircraft type, there should be a qualified Operations inspector assigned to the team.

10-144 Baseline Training. An inspector may be assigned to a CMT or CPT before receiving baseline training, but inspectors cannot be assigned a Safety Attribute Inspection (SAI), Element Performance Inspection (EPI), or Constructed Dynamic Observation Report (ConDOR) until they have received the baseline training. Frontline managers verify that their assigned inspectors have completed required training, including the Air Carrier‑Specific Familiarization Briefing, before assigning them to SAIs, EPIs, and ConDORs. The CMT or CPT roster should always be updated when an inspector completes baseline training. The baseline training for all inspectors assigned to a CMT or a CPT includes:

·        All courses of all phases of the initial or transition air carrier training string for the inspector’s specialty;

·        System Safety course;

·        Overview of ATOS and system safety;

·        ATOS orientation training course;

·        ATOS automation applications training course; and

·        Initial and recurrent Air Carrier‑Specific Familiarization Briefing (CMT members only).

10-145 Air Carrier‑Specific Familiarization Briefing. ASIs are provided the Air Carrier‑Specific Familiarization Briefing upon initial assignment to an ATOS CMT. (See Figure 10‑46 for recommended topics.) Inspectors who are assigned to a CPT, or who were assigned to the air carrier when it transitioned to ATOS, are considered to have already received the required initial Air Carrier‑Specific Familiarization Briefing during the certification or transition process. At the annual planning meeting, inspectors receive recurrent Air Carrier‑Specific Familiarization Briefings in applicable subjects to refresh their knowledge and to be notified of any significant changes in the air carrier’s operations. CMTs use the following policies and procedures to plan, conduct, and document initial and recurrent Air Carrier‑Specific Familiarization Briefings.

A.     Applicability. Inspectors assigned to CMTs receive briefings in the general topics and subjects that are specific for their specialty. DEPMs receive briefings in the general topics and subjects specific to operations, cabin safety, maintenance, and avionics.

B.     Methodologies. The air carrier‑specific outline of subjects may be presented by a combination of lectures, site visits, and directed self‑study. The briefings may be conducted one‑on‑one or for a group of new CMT members, at the option of the manager. Directed self‑study should be completed during normal working hours and should not account for more than 50 percent of the recommended programmed hours.

C.     Recommended Curriculum. A standard curriculum is contained in the Air Carrier‑Specific Familiarization Briefing outline located in Figure 10‑46 at the end of this section. The CMT manager determines which subjects are applicable to the air carrier’s operations and the amount of lecture and self‑study hours.

D.    Briefing Presenters. Inspectors assigned to the CMT with expertise in the covered subject conduct the lecture portions of the Air Carrier‑Specific Familiarization Briefings. Presenters who do not have prior experience as instructors should complete the Federal Aviation Administration (FAA) Briefing and Presentation Techniques correspondence course (catalog number 14010).

E.     Assessment. An open‑book, oral, or written quiz determines satisfactory completion of the briefings.

F.      Recordkeeping. Each CMT will maintain a copy of its Air Carrier‑Specific Familiarization Briefing outline and any self‑study materials. The CMT documents successful completion of the initial Air Carrier‑Specific Familiarization Briefing for each CMT member.

G.    Funding. Each CMT is responsible for the costs associated with completing the Air Carrier‑Specific Familiarization Briefings.

10-146 Other Training.

A.     All CMT Operations inspectors are programmed to receive initial training in an aircraft type operated by their assigned air carrier. CMT Operations inspectors may be programmed to receive recurrent training as required by their assigned responsibilities.

B.     All CMT Airworthiness inspectors are programmed to receive initial systems training appropriate to their avionics or maintenance specialty in an aircraft type operated by their assigned carrier.

C.     Inspectors assigned to a CPT receive briefings on the Certification Process Document.

D.    ORAs receive the following training, as required: indoctrination, Safety Performance Analysis System, ATOS baseline training, and training for data‑rich carrier programs as needed (e.g., Advanced Qualification Program, Aviation Safety/Accident Prevention, Maintenance Reliability).

10-147 Sharing of Resources. The Shared Resources Decision Aid, located on the ATOS intranet site, can assist CHDO management with decisions regarding resource management of the CMT positions above. Aviation safety inspectors may be assigned to no more than four CMTs. Employees report to only one frontline manager. Resource sharing should occur only within a single region and, ideally, only when one CHDO is involved.

Note:   Sharing of resources should be accomplished based on national guidance and directives.

10-148 Assign Individual and Allocate Funding. (See flowchart process step 3.2.) When the appropriate resource is available based on staffing, training, and funding, the frontline manager assigns the inspector to the appropriate work assignment and allocates funding.

A.     The frontline manager assigns and utilizes resources in accordance with the prioritization identified by the PI or CPM in the CAP.

B.     Frontline managers should also consider relevant certificate factors when making work assignments, particularly when CMTs share resources. Factors to consider when comparing work requests from two or more CMTs include:

·        Enplanements and departures,

·        Length of time the carrier has been certificated,

·        Fleet size, type, and age,

·        Utilization rate,

·        Route structure (number of stations, number of FAA regions),

·        Type of operation (effect on the flying public),

·        Number of approved programs (complexity),

·        Maintenance contracts,

·        Training contracts,

·        Crew domiciles,

·        Multiple certificate management responsibilities of principals, and

·        Wet and dry lease.

C.     Other Considerations for Assigning Work to CMT or CPT Inspectors.

1)      The CAP is the only part 121 assessment work program assigned. In addition to data collection activities for the assigned CMTs, inspectors may also be assigned work in accordance with FAA Order 8900.1, Flight Standards Geographic Program.
2)      The frontline manager can redirect work assignments from one CMT or CPT member to another.

10-149 Assignments for Design or Performance Assessment.

A.     PI Instructions. PIs should provide detailed instructions to assist the manager or frontline manager in identifying appropriate individuals to assign to SAIs and EPIs. The manager or frontline manager should consider inspector training, experience, qualifications, geographic location, availability, and workload.

B.     DCT‑Specific Instructions. Some DCTs may contain specific instructions for additional training, experience, or qualifications that may be helpful in determining inspector assignments. Specific instructions may also include additional references, background information, manuals, or other system documents that should be reviewed, as well as suggestions for specific types of activities and/or reporting instructions.

C.     Inspector Assignments Can Be Changed at Any Time Before Work Begins. Assignment changes may include switching from unassigned to assigned or vice versa, and reassigning an assessment from one inspector to another. Once work has begun on an EPI or SAI, inspector assignments cannot be changed.

10-150 Considerations Specific to Assigning an SAI. The frontline manager assigns SAI team coordinators (TC) and SAI team members. The frontline manager may assign an SAI to a single inspector. In that case, the inspector is also the TC. To help the frontline manager identify appropriate individuals to assign to SAI teams, PIs or CPMs should provide detailed instructions. The frontline manager should consider inspector training, experience, qualifications, geographic location, availability, and workload.

A.     The SAI Team Coordinator. The SAI TC organizes and coordinates SAI team activities. The TC ensures that activities, such as air carrier personnel interviews, are not redundant and that team members complete all activities to accurately answer the questions on the SAI. The TC position is a leadership role that should be assigned to an experienced inspector with solid knowledge of the air carrier. The TC should be based near the location where most SAI activities will take place.

B.     SAI Team Members. A team may comprise inspectors with varied backgrounds and experience from different geographic locations. SAI teams should always contain inspectors with a sufficient knowledge base to accurately assess the element. The inspector(s) designated to complete the SAI should be appropriately trained and knowledgeable on subjects related to the element.

10-151 Document Reasons Why Work Was Not Assigned. (See flowchart process step 3.3.) The frontline manager assigns work based on the CAP priorities for a given quarter until no resources remain. If appropriate resources are not available to complete the entire CAP, the frontline manager documents why the remaining work is unassigned. This ensures work that remains unassigned is documented for evaluation in a future planning cycle.

10-152 Notify PI or CPM That Work Remains Unassigned. (See flowchart process step 3.4.) The frontline manager notifies the PI or CPM of any work that remains unassigned. If necessary, the PI or CPM can initiate a Risk Management Process to address any air carrier or FAA risk(s) associated with the unassigned work.

10-153 Review Comprehensive Assessment Plan. (See flowchart process step 3.5.) Once the CAP is developed, the data collection requirements are documented using detailed work instructions, and all of the data collection activities are either assigned or identified as unassigned, the CMT or CPT manager reviews the plan.

A.     The review ensures that the CAP is risk‑based and that the work is assigned according to priorities. In the review of the design or performance assessment plan, the CMT or CPT manager must ensure that the elements are risk‑prioritized with the proper justification.

B.     A CMT or CPT manager who does not concur with the oversight requirements, priorities, or resource decisions should discuss the issue with the PI and the frontline managers. The plan may be adjusted, as required, by the PI. The PI can enter a comment in the plan that explains the reason for an adjustment.

10-154 Ongoing Resource Management. Resource management is a continual task for the CMT or CPT. Frontline managers should continue to evaluate resources for work plans, and consider the needs of special data collection and assessment activities, such as ConDORs and risk management action plans.

10-155 Incomplete Inspection Records Resulting from an Inspector Leaving the CMT or CPT. When an inspector leaves the CMT or CPT, frontline managers ensure that all inspection records are finalized before the inspector’s departure. If the inspector leaves the CMT or CPT without completing work in progress, the PI or CPM notifies the frontline manager and initiates the removal process for the incomplete record.


Figure 10‑46, Air Carrier‑Specific Familiarization Briefing Outline of Subjects

General Topics—All Specialties

(Recommended Minimum Hours—8)

1. OVERVIEW OF AIR CARRIER

a. Brief History

(1) Mergers

(2) Acquisitions

(3) Financial status (i.e., bankruptcies)

(4) Compliance attitude

(5) Corporate headquarters location

(6) Main base location

(7) Corporate philosophy

b. Air Carrier Demographics

(1) Key personnel (names/phone numbers)

(2) Organization chart

(3) Major programs

(4) Location of hubs

(5) Location of training bases

(6) Location of maintenance facilities

(7) Personnel strengths

(8) Agent for service

(9) Communications

(10) Special operations

(11) Fleet demographics

(12) Aircraft numbering system

c. Areas of Operations

(1) Type/fleet type of activity

(2) Concentrations of activity

d. Code Sharing/Wet Lease/Interchange

(1) Airline participants

(2) Foreign flight attendant supernumeraries

e. Future Plans of the Air Carrier

2. CERTIFICATE MANAGEMENT TEAM

a. Key Personnel

(1) Listing (name and phone number of all)

(2) PIs (including Principal Security inspector (PSI) and regional hazmat branch managers)

b. Policies and Procedures for CMT

Responsibility for coverage of incidents and occurrences

c. Individual Interests/Specialties

Type ratings, areas of interest, background and experience

d. Communications

(1) Types of information to be requested directly from air carrier (points of contact)

(2) Information available from the CMO

(3) Points of contact and protocol

3. BACKGROUND OF CAP

a. Special Emphasis Areas

(1) Results of Air Carrier Assessment Tool (ACAT)

(2) New and pending issues

4. COMPANY MANUALS

a. Overview of Air Carrier Manual System

(1) Manual numbering

(2) Master listing of all parts of the air carrier's manual

(3) Where to find the master listing

(4) Where certain manuals are located

b. Types and Identification of Manuals

(1) Hard copies

(2) Computerized manuals; CD‑ROM

c. Location of Manuals

(1) Required on aircraft

(2) Required software, if applicable

(3) Required for crewmembers

(4) Microfiche reader

(5) Required at stations

d. Distribution and Revision

(1) Determining current revision status

(2) Use of computer, if applicable

(3) What method is used to issue revisions?

(4) Tracking responsibilities

e. Alerts and Bulletins

(1) Method to determine current status

(2) Transmission of bulletins and revisions

5. SECURITY AND ACCESS

a. Access to Ramp and Facilities

(1) Site‑specific requirements

(2) Air carrier’s security coordinators

b. ID Badges

c. Cockpit Keys

d. Security Alerts for Travel Advisories

6. HAZARDOUS MATERIALS

a. Acceptable Shipments

b. Documentation

c. Location Verification

d. Company Material (COMAT)

7. EN ROUTE PROCEDURES

a. Jumpseat Authorization and Procedures

(1) Jumpseat operation

(2) Radio operation; headset location and use

b. Requirements for International Travel

(1) Country clearance forms

(2) Passport and visa

8. FLIGHT DECK PROCEDURES

a. Checklist Location and Use

(1) Flight deck flows

b. Quick Reference Handbook Location and Use

c. Safety Briefing

d. Crew Briefing; Communication

e. Required Paperwork/Documentation

(1) Location of logbooks (flight deck/cabin)

(2) Location of minimum equipment list (MEL)

(3) Airworthiness release

(4) Placards

f. Unique Fleet/Air Carrier Procedures

g. Airborne Communications Addressing and Reporting System (ACARS)

(1) Weight and balance

(2) Release amendments

(3) Communications

9. CABIN PROCEDURES

a. Exit Seating

b. Emergency Equipment

(1) Location

(2) Preflight, if applicable, for flight attendants

c. Markings and Placards

d. Carry‑On Baggage

e. Special Procedures

f. Medical Emergencies

(1) Medical oxygen

(2) Medlink

(3) AED (defibrillators)

g. Couriers

h. Cargo/Animal Handlers

i. Cockpit/Cabin Communications

j. Carriage of Weapons

(1) Forms and procedures


Figure 10‑46, Air Carrier‑Specific Familiarization Briefing Outline of Subjects (Continued)

Specific Topics—All Specialties

(Recommended Minimum Hours—8)

1. AIR CARRIER PROGRAMS

a. Deicing

(1) General procedures and training

(2) Paperwork

b. Fueling

(1) General procedures and training

(2) Paperwork

(3) Passenger handling during fueling

(4) Bonding and grounding

c. Pushback/Powerback Procedures

d. International Procedures

(1) Crew check‑in time

(2) Crew complement

(3) Flight/duty and rest computation

(4) General declaration

(5) Passport and visa requirements

e. Special and Ferry Flight Procedures

f. Cargo Operations

g. Security

(1) Hijack procedures

(2) Interference with crewmembers

2. RECORDS AND REPORTING

a. General

(1) Format: paper, microfiche, electronic

(2) Electronic signatures

(3) Security issues

(4) Custody and retention

3. STATION FACILITIES

a. Manuals

b. Fueling Equipment and Facilities

c. Maintenance Support

d. Contract Services

e. Passenger and Baggage Screening

f. Cargo

g. Marshalling and Ground Handling

4. OPERATIONS SPECIFICATIONS

a. Exemptions and Deviations

b. Special Areas of Operations

c. Special Authorizations and Programs

(1) Powerback procedures

(2) Single‑engine taxi

(3) Extended Operations (ETOPS)

(4) Areas of magnetic unreliability (AMU)

(5) Lower landing minimums

(6) Minimum Navigation Performance Standards (MNPS)

(7) Flight Operations Quality Assurance (FOQA)

(8) Aviation Safety Action Program (ASAP)

(9) Reduced vertical separation minimums (RVSM)

(10) Cat III procedures


Figure 10‑46, Air Carrier‑Specific Familiarization Briefing Outline of Subjects (Continued)

Operations and Cabin Safety Topics

(Recommended Minimum Hours—8 to 16)

1. FLIGHT OPERATIONS PROGRAMS

a. Flight Planning and Documentation

(1) Performance/operating limits

(2) Operational release

(3) Format of the release package

(4) Supplemental operations

(5) Passenger manifest

(6) Weather

(7) Weight and balance

(8) Documentation transmittal

b. Dispatch and Flight Following

(1) Centralized procedures

(2) Shared procedures

c. MEL/Configuration Deviation List (CDL) System/Deferral Process

2. TRAINING AND QUALIFICATIONS

a. Overview

(1) Operations specifications (OpSpecs)/specific training requirements

(2) Types of training conducted (wet lease, Advanced Qualification Program (AQP))

b. Training Facilities and Equipment

c. Key Fleet Personnel

d. Documentation of Personnel Requirements and Training

e. Outsource Training

3. REST AND DUTY TIME

a. Flight Crew

(1) Records and reporting

(2) Scheduling

b. Cabin Crew

(1) Records and reporting

(2) Scheduling

c. Dispatch

(1) Records and reporting

(2) Scheduling

4. CABIN SAFETY

a. Flight Attendant Duties/Cabin

(1) Supernumeraries

(2) Wet lease operations

(3) Reporting discrepancies

(4) Seatbelt discipline

(5) Child restraint

(6) Smoking requirements

(7) Number of required flight attendants

(8) Briefing requirements

(9) Reporting of mechanical discrepancies

(10) Sterile cockpit

b. Passenger Handling

(1) Interference with crewmember programs

(2) Passengers who may appear intoxicated

c. Carry‑On Baggage

(1) Screening

(2) Carry‑on baggage program

(3) Regional airline differences

d. Exit Seating

(1) Announcements; briefing cards

(2) Interpreters

e. Gate Agent Procedures

(1) Passenger service

(2) Supplemental operations

f. First Aid and Medical

(1) Medlink procedures

(2) CPR training

(3) Equipment required

(4) Other equipment



Figure 10‑46, Air Carrier‑Specific Familiarization Briefing Outline of Subjects (Continued)

Maintenance and Avionics Topics

(Recommended Minimum Hours—8 to 16)

1. MAINTENANCE SYSTEMS

a. Air Carrier Procedures

(1) General procedures manual

b. Suspected Unapproved Parts (SUP)/Parts and Materials

(1) Site receiving inspection

(2) Scrap parts procedures

c. Ground Handling/Taxi/Run‑Up Procedures

d. Calibrated Tools and Test Requirements

e. Maintenance Assessments

f. Required Equipment

(1) Aircraft

(2) Fly away kit

(3) Maintenance library

g. Training Programs

(1) Overview of qualifications and training

(2) OpSpecs/specific training

(3) Types conducted

(4) Training facilities/equipment

(5) Key personnel

h. Airworthiness Release

(1) Format of the release package

(2) Supplemental operations

(3) Maintenance releases

i. Weight and Balance

j. MEL/CDL

(1) Preamble; general; revision status

(2) Deferral and tracking

(3) Coordination with maintenance control

(4) Action required for inoperative items

(5) Interim actions; DENT program

k. Special Programs

(1) ETOPS

(2) AMU

(3) Lower landing minimums

(4) MNPS

(5) ASAP

(6) FOQA

(7) RVSM

(8) Reliability program

(9) Repeat maintenance items

(10) Required inspection items

(11) Continuous Analysis Surveillance

(12) Coordination Agency for Supplier’s Evaluation

(13) Corrosion Prevention Control Program (CPCP)

(14) Aging aircraft program

(15) Supplemental Inspection Document/Supplemental Structural Inspection Document

2. RECORDS AND REPORTING

a. Maintenance Logbooks/Recording

b. Aircraft Records/Aircraft Listing

c. Mechanical Interruption Summary

d. Service Difficulty Reports

3. OPERATIONS SPECIFICATIONS

4. STATION FACILITIES

a. Parts and Equipment

b. Deicing Procedures

5. MAINTENANCE ORGANIZATION

a. Maintenance Control

b. Engineering Systems and Forms

c. Internal Evaluation and Quality Assurance

d. Airworthiness Directive Management

e. Contract Maintenance and Repair Stations

RESERVED. Paragraphs 10‑156 through 10‑170.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 4  Design Assessment Data Collection

Figure 10‑48, Module 4. Data Collection

 

Module 4. Design Assessment Data Collection flowchart

10-171 Introduction. The objective of this process module is to collect design data in accordance with the Comprehensive Assessment Plan (CAP) and principal inspector (PI) or certification project manager (CPM) instructions. Data collected on the Safety Attribute Inspection (SAI) Data Collection Tools (DCT) and/or the Constructed Dynamic Observation Reports (ConDOR) are used to assess air carrier or applicant system design.

10-172 Collect Required Data. (See flowchart process step 4.1.) Design assessment data are collected using SAI DCTs by trained and qualified Federal Aviation Administration (FAA) Operations, Airworthiness, Cabin Safety, and/or Dispatch inspectors assigned to an Air Transportation Oversight System (ATOS) Certificate Management Team (CMT) or Certification Project Team (CPT). A team of inspectors or a single inspector may complete the SAI. PIs should consider the nature and complexity of the element under scrutiny, and whether the single‑inspector method is appropriate in each case. Each SAI is assigned a team coordinator (TC).

A.     Design Assessments Conducted in Partnership With the Air Carrier. The air carrier may partner with the FAA to complete the design assessment. When collaborating on the assessment, air carrier personnel are active participants and working members of the SAI team, and determine and resolve element evaluation issues with the PI.

1)      When an air carrier collaborates with the FAA in the design assessment, all key management officials of the air carrier (as defined in Title 14 of the Code of Federal Regulations (14 CFR) § 119.65) must receive a briefing on ATOS policies and procedures and on Advisory Circular (AC) 00‑58, Voluntary Disclosure Reporting Program, current edition, before the assessment begins. The air carrier’s management must understand the requirement to use the appropriate SAI to validate the comprehensive fix of any identified air carrier element deficiency that involves an apparent violation of FAA regulations.
2)      When an air carrier actively participates as a working member of the evaluation team, apparent violations of FAA regulations discovered during the assessment, and any subsequent enforcement action, are governed by the provisions of AC 00‑58.
3)      When an air carrier elects not to actively participate in the assessment as a working member of the SAI team, the provisions and protections contained in AC 00‑58 do not apply to apparent violations of FAA regulations discovered during the specified evaluation period.

B.     Coordinate SAI Team and Establish Communication Methods. The SAI TC decides how the team communicates. Coordination and communication are especially important if members are spread among different locations. After reviewing the PI instructions, the TC organizes a team meeting. This meeting can be in person, over the phone, or by other means.

C.     Distribute and Schedule Tasks. Data are collected by a team of inspectors or a single inspector. The tasks may be distributed by element, safety attribute, individual question, or some combination to allow the timely collection of accurate data. The TC also ensures that activities such as air carrier personnel interviews are not redundant and that all activities are completed to accurately answer the questions on the SAI. The TC distributes tasks among the SAI team and develops a timeline to complete the assigned data collection activities. The TC, in conjunction with the remaining SAI team members, divides and distributes the SAI activities, but is not the supervisor. If the TC encounters difficulties with a team member during an assessment, the situation is elevated through his or her frontline manager for resolution.

D.    Prepare to Perform Assigned Data Collection Activities. Inspectors prepare for design assessment data collection by reviewing, at a minimum, the following:

1)      PI or CPM instructions.
2)      Current SAI DCTs, available online, for the element to be assessed.
3)      Specific regulatory requirements (SRR) related to the element.
4)      Relevant FAA guidance, such as orders and advisory circulars.
5)      Air carrier or applicant policies and procedures (e.g., manuals, operations specifications (OpSpecs), training programs) for the element being assessed.
6)      The results of previous design and performance assessments.

E.     Perform Data Collection Using the SAI. Each SAI team member submits responses into ATOS automation after completing his or her data collection activities. Communication between team members is essential, but sharing answers is neither necessary nor desirable because of possible duplication. SAIs should be completed within the timeframes established by the PI or CPM so the data are available for timely completion of the design assessment. Inspectors follow the general instructions for using SAI DCTs.

1)      Purpose. Most elements represent processes the air carrier performs. The purpose statement defines the intent of the element and the scope of the certificate holder’s responsibility. Policies and procedures describe a certificate holder’s process. The safety attributes contained in each SAI are employed to organize the content of the SAI and to aid the PI in determining the acceptability or approvability of the air carrier’s process.
2)      Specific Instructions. Some DCTs may contain specific instructions for additional training, experience, or qualifications that may be helpful in determining inspector assignments. Specific instructions may also include additional references, background information, or manuals that should be reviewed, as well as suggestions for specific types of activities and/or reporting instructions.
3)      Specific Regulatory Requirements. An SRR is a regulation from 14 CFR that is refined to its most specific level. Each SAI includes SRRs as references for the inspector. The SRRs were used during the development of the SAI DCTs to help define the function of the element and develop many of the procedure attribute questions. Some of these regulations pertain to initial certification while others relate to ongoing operations.
a)      Questions that are based on regulatory requirements have an SRR appended to them. Therefore, answering No to such a question may require an enforcement investigation for a certificated air carrier, or may lead to rejecting a program or authorization proposed by an applicant.
b)      Questions that do not have an SRR appended to them are not tied to a literal regulatory requirement, but are based on system safety principles. A No answer to this type of question, while not a violation, may indicate a hazard with an increased level of risk that may require additional CMT or CPT action, including a decision to withhold approval or acceptance of a system or program.
4)      Related 14 CFR parts and FAA Policy/Guidance. Reference to related CFRs means 14 CFR parts other than those categorized as SRRs. Related CFRs and FAA policy/guidance are included for background information that is necessary to accomplish the inspection. The inspector should also review the related elements that are included in the associated Element Performance Inspection. The purpose of this review is to notify the inspector of any other elements that may interface with this SAI to ensure that related procedures do not conflict.

Note:   DCT users are responsible to ensure they reference the current edition of the guidance.

5)      Tasks. Each attribute section of the DCT contains the statement, “To meet this objective, the inspector must accomplish the following task(s).” Various activities comprise a single task. The following are some of the tasks that an SAI could include.
a)      Review the information listed in the Supplemental Information section of this DCT. A list of the SRRs, related CFRs, and FAA policy/guidance documents pertinent to the questions of the DCT for a given element is provided in the Supplemental Information section of the SAI. Regulatory and FAA policy/guidance references also appear at the question level. The inspector reviews the related CFRs and FAA policy and guidance documents included with each SAI.
b)      Review the duties for management and other personnel identified by the certificate holder who accomplish the (element name) process.
c)      Review the certificate holder’s system documentation to ensure that it contains policies, procedures, instructions, and information necessary for the (element name) process. The inspector should review and gain an understanding of the certificate holder’s policies, procedures, instructions, and information for the element he/she is inspecting in order to plan their inspection activities. This usually involves reviewing sections of the appropriate OpSpecs, training programs, or other documents, as well as the manuals related to the process.
d)      Review the interfaces associated with the (element name) process that have been identified along with the individual questions in the Procedures section (1) of this DCT. Some questions in the Procedures section contain references to interfaces in the Related Design Job Task Item (JTI). The inspector reviews those references to identify the interfaces in the certificate holder’s manual.
e)      Identify the person who has overall responsibility for the (element name) process. The inspector must sufficiently understand the certificate holder’s system to know who is responsible for the quality of each process.
f)        Identify the person who has overall authority for the (element name) process. The inspector must sufficiently understand the certificate holder’s system to know who has the authority to establish or modify each process.
g)      Review the duties and responsibilities of the person(s) documented in the certificate holder’s manual. The inspector must sufficiently understand the certificate holder’s system to know the duties and responsibilities of individuals assigned the responsibility for each process or authority to change each process.
h)      Review the appropriate organizational chart. The inspector must sufficiently understand the certificate holder’s organization to identify who has the authority and responsibility for certain processes. Many organizations disperse authority and responsibility. A person can be an individual, a department, a committee, or a position.
6)      Questions. Each SAI lists a series of questions for the SAI team to answer based on their observations during the various activities. Questions on each activity report are answered in response to what was observed on that single activity. The DCTs are not designed to be checklists of questions that are asked directly of the certificate holder’s personnel.
7)      Job Task Items. JTIs are included with questions for inspector reference only; the inspector is not expected to respond to each JTI individually. The JTIs listed below each question aid the inspector in determining whether a certificate holder’s written policies, procedures, instructions, and information are adequate. If a question appears to be nonspecific, review the associated JTIs to identify the specific requirements.
8)      SAI Sections. Each SAI attribute section includes the statement, “To meet this objective, the inspector must answer the following questions.” The following paragraphs describe some of the content in each section of the DCT.
a)      Procedures Attribute. To respond to the questions in this section, the SAI team must gain a thorough understanding of the certificate holder’s policies, procedures, instructions, and/or information for this specific process. The purpose is to determine the method used by the certificate holder to accomplish the process associated with the element. The team is asked to determine if written procedures exist, if the procedures contain sufficient detail, and if they comply with the CFRs. A reference to the system documentation where these procedures are located must be determined and entered into the text box that becomes available when a Yes response is entered into the ATOS database. This section of the DCT includes a list of questions concerning the procedures for this process. Many of these questions are based on SRRs for this process, although the certificate holder may have some latitude in implementing others. For this reason, a response of No to one of these questions does not necessarily mean that the company is not complying with a regulation or that any action is required.
b)      Controls Attribute. Controls are checks and restraints that must be built into the certificate holder’s processes to help ensure that the desired result is continuously achieved. While most controls are not regulatory, they are an important safety attribute with desirable features that help to reduce unacceptable levels of risk. Each SAI lists a series of controls. Some common types of controls are flags, data system backups, authorized signatures, separation of duties, or a final review. It is important to note that certificate holders must be able to show the effectiveness of their controls. Few of these controls have their basis in SRRs. For this reason, a response of No to one of these questions does not necessarily mean that the company is not complying with a regulation or that any action is required.
c)      Process Measurement Attribute. The purpose of this attribute is to ensure the certificate holder uses an internal evaluation function to detect, identify, and eliminate or control hazards and the associated risk. Each SAI lists process measurements that are specific to that element. Process measurements are designed to determine if the certificate holder’s policies, procedures, and controls are achieving the desired results or the purpose for that element. In most cases, process measures are nonregulatory. For this reason, a response of No to one of these questions, while not a violation, may indicate a hazard with an increased level of risk, which may require additional CMT or CPT action.
d)      Interfaces Attribute. This section focuses on the interactions between air carrier processes. Each SAI DCT lists some of the interfaces that are specific to that element. There may be additional interfaces the inspection team identifies that should be listed on the DCT. The first question asks if the certificate holder has recognized and addressed the interfaces identified in section 1, Procedures Attribute. The second question asks if the certificate holder’s manual documents a method for assessing the impact of any changes to the associated interfaces within the air carrier’s organization.

Note:   In the Procedures section, some questions have references to interfaces in the Related Design JTIs. The inspector will need to refer to those questions tagged with interfaces to answer the question.

e)      Management Responsibility and Authority. This section asks a series of questions about a clearly identifiable person who is responsible for the quality of the process or who has authority to establish and modify the process. The first two questions require that a name be entered. Organizations often disperse authority and responsibility. A person can be an individual, a department, a committee, or a position (such as vice president of flight operations). The intent is to identify the highest level person (at the appropriate level within the organization) who is responsible or has the authority for that particular element of the certificate holder’s system. The remaining questions for this section ask if the duties and responsibilities and qualification standards are clearly documented.

F.      Perform Data Collection Using Constructed Dynamic Observation Reports. The PI or the CPM may direct inspectors to collect design assessment data using a ConDOR to address focused or unique situations. The aviation safety inspector performs the appropriate tasks listed on the ConDOR for each inspection to accurately answer all the questions required by the PI or CPM.

10-172 Were Issues of Regulatory Noncompliance Identified? (See flowchart process step 4.2.) During data collection activities for design assessment, the inspector may identify issues of regulatory noncompliance that require an immediate response.

A.     Not all No responses to SAI DCT questions indicate regulatory noncompliance. A No answer can also mean that the safety attributes are inadequately incorporated in the area being evaluated, or that the certificate holder’s approved or accepted procedures are inadequate.

B.     A No answer does not necessarily equate to an unsafe condition or a regulatory violation, unless that particular No has a regulatory basis (including regulatory intent) and the inspector observed a possible violation or unsafe condition.

10-173 Notify the PI/CPM of Regulatory Noncompliance. (See flowchart process step 4.3.) After identifying an issue of regulatory noncompliance that requires an immediate response, the inspector notifies the PI or CPM. The PI follows the guidance outlined in Order 2150.3, Compliance and Enforcement Program, to address the issue.

10-174 Were Unrelated Safety Issues Observed? (See flowchart process step 4.4.) While collecting the required data for the design assessment, the inspector may observe unrelated safety issues, which are outside the scope of the SAI DCT questions. The inspector takes appropriate action, including communicating the safety issue to air carrier or applicant personnel and the PI or CPM. ATOS does not change an inspector’s responsibility to investigate and act on safety or regulatory concerns.

A.     Significant issues or items of immediate concern should be promptly conveyed to the appropriate PI or CPM.

B.     Significant issues or items of immediate concern regarding all aspects of the air transportation of hazardous materials must be promptly conveyed to the appropriate PI or CPM who will coordinate with the regional hazardous materials branch manager.

C.     Significant issues or items of immediate concern regarding all aspects of the air carrier’s or applicant’s drug testing program and alcohol misuse prevention program must be promptly conveyed to the appropriate PI or CPM who then coordinates with AAM‑800.

10-175 Document Unrelated Safety Issues via a Dynamic Observation Report. (See flowchart process step 4.5.) The Dynamic Observation Report (DOR) is used to document observations associated with unrelated safety issues found during data collection activities. DORs are not a substitute for the planned inspections and are not intended for routine use. Managers and inspectors use the DOR in the following situations:

A.     Safety issues unrelated to the ATOS element being assessed.

B.     Safety issues for which there is not an applicable ATOS element or DCT question.

C.     Safety issues for an air carrier to which the inspector is not assigned.

D.    Specific inspection events as directed by a handbook bulletin or other national directive.

E.     Inspections associated with incidental travel from one location to another to perform official business; or if the inspector is not a member of the CMT for the air carrier that operates the aircraft.

RESERVED. Paragraphs 10‑176 through 10‑190.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 5  Performance Assessment Data Collection

Figure 10‑49, Module 4. Performance Assessment Data Collection (flowchart)

10-191 Introduction. The objective of this process module is to collect performance data in accordance with the Comprehensive Assessment Plan (CAP) and principal inspector (PI) or certification project manager (CPM) instructions. Data collected on the Element Performance Inspection (EPI) and/or the Constructed Dynamic Observation Report (ConDOR) Data Collection Tool (DCT) are used to assess the system performance of an air carrier or applicant.

10-192 Collect Required Data. (See flowchart process step 4.1.) Performance assessment data are collected using EPIs by trained and qualified Federal Aviation Administration (FAA) Operations, Airworthiness, Cabin Safety, and/or Dispatch inspectors assigned to an Air Transportation Oversight System (ATOS) Certificate Management Team (CMT) or a Certification Project Team (CPT).

A.     Prepare to Perform Assigned Data Collection Activities. Inspectors prepare for collecting performance assessment data by reviewing, at a minimum, the following:

1)      PI or CPM instructions.
2)      Current EPI DCTs (available online) for the element to be assessed.
3)      Specific regulatory requirements (SRR) related to the element.
4)      Relevant FAA guidance, such as orders and advisory circulars.
5)      Air carrier or applicant policies and procedures (i.e., manuals, operations specifications (OpSpecs), and training programs) for the element being assessed.
6)      The results of previous design assessments and performance assessments.

B.     Perform Data Collection Using the EPI. Inspectors submit their responses to EPI DCT questions into ATOS automation after they complete their data collection activities. EPIs should be completed within the timeframes established by the PI or CPM so that the data are available to complete the performance assessment on a timely basis. Inspectors follow the general instructions for using EPIs.

C.     EPI Data Collection Tool Sections. EPIs have four sections: Element Summary Information, Supplemental Information, Performance Observables, and Management Responsibility and Authority. The objective of the Performance Observable section is to determine if the air carrier or applicant follows its procedures, controls, process measures, and interfaces for the process, and to determine if the process is functioning as designed and achieving the desired results. The objective of the Management Responsibility and Authority section is to determine if air carrier management personnel are qualified and knowledgeable, and recognize their responsibility and/or authority for the process.

1)      Element Summary Information.
a)      Purpose. Most elements represent processes performed by the air carrier. The purpose statement defines the intent of the element and the scope of the certificate holder’s responsibility. A certificate holder’s process is described by policies and procedures. The questions in each EPI are designed to aid the PI in making a judgment about the performance of the air carrier’s process.
b)      Objective. The objective tells us the scope of the inspection in general terms and identifies the FAA’s responsibility.
c)      Specific Instructions. Some DCTs may contain specific instructions for additional training, experience, or qualifications that may be helpful in determining inspector assignments. Specific instructions may also include additional references, background information, or manuals that should be reviewed, as well as suggestions for specific types of activities and/or reporting instructions.
d)      Specific Regulatory Requirements. An SRR is a regulation from Title 14 of the Code of Federal Regulations (14 CFR) that is refined to its most specific level. Each EPI includes SRRs as references for the inspector. The SRRs were used during the development of the DCTs to help define the function of the element and develop questions to measure performance. Some of these regulations pertain to initial certification while others relate to ongoing operations.
e)      Related EPIs. Some DCTs have a list of related elements that are provided primarily for reference and background information. Inspectors should review the DCTs for related elements. There may be situations when activities for one EPI may be accomplished in conjunction with activities of related EPIs.
2)      Supplemental Information.
a)      SRRs. Questions that are based on regulatory requirements have an SRR appended to the job task items. Answering No to such a question may require an enforcement investigation for a certificated air carrier, or may lead to rejection of a program or an authorization proposed by an applicant.
b)      Related 14 CFR parts and FAA Policy/Guidance. Reference to related CFRs means 14 CFR parts other than those categorized as SRRs. Related CFRs and FAA policy/guidance are included for background information that is necessary to accomplish the inspection. The inspector should also review the related elements. The purpose of this review is to notify the inspector of any other element that may interface with this EPI to ensure that related procedures do not conflict.
3)      Questions. Each EPI section lists a series of questions for the inspector to answer based on his or her observations during the various activities. Questions on each activity report are answered in response to what was observed on that single activity. Based upon the scope of the EPI and the complexity of the certificate holder’s process, inspectors should develop a plan of research, observation, inspection, and evaluation that ensures quality data are collected.
4)      Job Task Items. Job task items (JTI) are included with questions for inspector reference only. JTIs aid the inspector in determining if a certificate holder is following its written policies, procedures, instructions, and information; and if the desired results are being achieved. The inspector is not expected to respond to each JTI individually. The JTIs listed below each question are there to aid inspectors in answering the question.

Note:   DCT users are responsible to ensure they reference the current edition of the guidance.

D.    Completing the EPI Performance Observables Section. The inspector should complete the tasks identified on the DCT and answer each question in the section at least once.

1)      Tasks. Each DCT contains the statement, “To meet this objective, the inspector must accomplish the following tasks.” The DCT then lists certain tasks that should be completed during the inspection. Each task is made up of various activities. Some common tasks that may be listed on an EPI include:
a)      Review the information listed in the Supplemental Information section of this DCT. The Supplemental Information section of the DCT contains a list of the SRRs, related CFRs, and FAA policy or guidance documents that are pertinent to the questions of the DCT for a given element. Regulatory and FAA policy or guidance references will also appear at the question level. The inspector reviews the related CFRs and FAA policy and guidance included with each EPI.
b)      Review the policies, procedures, instructions and information for the (element name) process contained in the certificate holder’s manual. The inspector reviews and gains an understanding of the certificate holder’s policies and procedures for the element they are inspecting to plan inspection activities. This review usually involves sections of the appropriate OpSpecs, manuals, training programs, or other guidance. A subsequent question asks the inspector if the certificate holder follows its policies and procedures.
c)      Review the last accomplished associated Safety Attribute Inspection (SAI) for this element with emphasis on the controls, process measurements, and interface attribute section responses. A review of the associated SAI DCT and the results of any complete SAIs provide the inspector with useful information about the certificate holder’s systems, and can help the inspector to identify areas of potential risk. The Controls Attribute section of each SAI lists checks and restraints that must be built into the certificate holder’s process to help ensure that the desired results are consistently achieved. Most controls are not regulatory, but they are an important safety attribute with desirable features that help to reduce risk. The inspector is asked in a subsequent question if the controls are being followed.
d)      Observe the (element name) process to gain an understanding of the procedures, instructions, and information contained in the certificate holder’s manual. Each element defines a specific program or process that achieves certain results as described in the Purpose section of the EPI. The inspectors must plan to conduct various activities that will assist them in determining if the policies and procedures are being followed, and whether they are effective. For example, in assessing the results of a deicing EPI, the inspector may perform various activities at different locations. These activities may include inspecting the storage of deicing materials at station facilities, observing deicing in progress on various aircraft from the ramp, watching deicing procedures during cockpit or cabin en route inspections, or visiting the operations center during icing conditions.
e)      Discuss the process with air carrier or contract personnel who perform the process.
2)      Questions. The following paragraphs describe some of the typical questions in the performance observables section of the EPI.
a)      Were the following performance measures met? Each EPI lists performance measures that are specific to that element. Performance measures determine if the certificate holder’s process is achieving its purpose.
b)      Were the certificate holder’s policies and procedures, instructions, and information followed? The inspector must gain a thorough understanding of the certificate holder’s policies and procedures to answer this question.
c)      Were the (element name) process controls followed? This question refers to the controls that are identified in the associated SAI controls attributes section. Controls are checks and restraints that must be built into the certificate holder’s process to help ensure that the desired results (purpose of the element) are consistently achieved. Reviewing those controls helps the inspector answer this question. Not all the controls are observed during each activity.
d)      Did the records for the process comply with the instructions provided in the certificate holder’s manual? The inspector must sufficiently understand the air carrier’s system to know which records and reports are generated or used during the processes and procedures for the element. A representative sample of these records should be reviewed and assessed for compliance with regulations and the certificate holder’s policies, procedures, instructions, and information.
e)      Were the process measurements for the process effective in identifying problems or potential problems and providing corrective action for them? The inspector reviews the Process Measurements section of the SAI and certificate holder’s manuals to understand which measures the certificate holder has designed into the process. The inspector conducts activities to determine if the process measurements were effective in identifying and providing corrective action for problems or potential problems.
f)        Did personnel properly handle the associated interfaces? This question focuses on the interactions between the process under inspection and other processes within the certificate holder’s organization.

E.     Completing the Management Responsibility and Authority Section of the EPI. This section helps determine if the person identified by the certificate holder as having responsibility and/or authority for the process is qualified, knowledgeable, and recognizes that responsibility and/or authority.

1)      Tasks. The following are some of the tasks that may be listed in this section.
2)      Review the appropriate organizational chart and the documented duties and responsibilities for the process.
a)       Identify the person who has overall responsibility for the process. The intent is to identify the highest level person within the organization who is responsible for the quality of the process. That person may or may not have authority to change the process.
b)      Identify the person who has overall authority for the process. The intent is to identify the highest level person within the organization who has authority to change the process. That person may or may not be responsible for the quality of the process.

Note:   A person can be an individual, department, committee, or position.

c)      Discuss the process with the management personnel. In completing this task, the inspector has discussions with the persons who have responsibility and authority to determine if they understand the policies and procedures for the process.
d)      Evaluate the qualifications and work experience of the management personnel. The purpose of this task is to determine that the individual responsible for, or with the authority to establish or modify a process meets the qualifications to hold that position. In some instances, there may be regulatory requirements for those qualifications. In other instances, the qualifications may be demonstrated by reviewing the individual’s FAA certificate, training records, or particular background or expertise.
3)      Questions. The following paragraphs describe some of the typical questions in the management responsibility and authority section of the EPI.
a)      Questions 1 and 2. The purpose of these questions is to identify by name and title the person who is responsible for the quality of the process and the person who has the authority to establish and modify the process.
b)      Questions 3–10. Answer these questions if there are changes in personnel or the program that affect the responsibility and authority attributes for the process. If there have not been changes in personnel or the program that affect the responsibility and/or authority attributes, reporting inspectors can select an auto‑fill feature to mark these questions No Change.

10-193 Perform Data Collection Using Constructed Dynamic Observation Reports. The PI or the CPM may direct inspectors to collect performance assessment data using a ConDOR to address focused or unique situations. The aviation safety inspector performs the appropriate tasks listed on the ConDOR for each inspection to accurately answer all the questions required by the PI or CPM.

10-194 Were Issues of Regulatory Noncompliance Identified? (See flowchart process step 4.2.) During data collection activities for performance assessment, the inspector may identify issues of regulatory noncompliance that require an immediate response. Not all No responses to EPI DCT questions indicate regulatory noncompliance.

10-195 Notify PI or CPM of Regulatory Noncompliance. (See flowchart process step 4.3.) After identifying an issue of regulatory noncompliance that requires an immediate response, the inspector notifies the PI or CPM. The PI follows the guidance outlined in FAA Order 2150.3, Compliance and Enforcement Program, to address the issue.

10-196 Were Unrelated Safety Issues Observed? (See flowchart process step 4.4.) While collecting the required data for the performance assessment, the inspector may observe unrelated safety issues, which are outside the scope of the EPI DCT questions. The inspector takes appropriate action, including communicating the safety issue to air carrier or applicant personnel and the PI or CPM. ATOS does not change an inspector’s responsibility to investigate and act on safety or regulatory concerns.

A.     The appropriate PI or CPM should be promptly notified of significant issues or items of immediate concern.

B.     Significant issues or items of immediate concern regarding all aspects of the air transportation of hazardous materials must be promptly conveyed to the appropriate PI or CPM who will coordinate with the regional hazardous materials branch manager.

C.     Significant issues or items of immediate concern regarding all aspects of the air carrier’s or applicant’s Drug Testing Program and Alcohol Misuse Prevention Program must be promptly conveyed to the appropriate PI or CPM who then coordinates with AAM‑800.

10-197 Document Unrelated Safety Issues via a Dynamic Observation Report. (See flowchart process step 4.5.) The Dynamic Observation Report (DOR) is used to document observations associated with unrelated safety issues found during data collection activities. DORs are not a substitute for the planned inspections and are not intended for routine use. Managers and inspectors use the DOR in the following situations:

A.     Safety issues unrelated to the ATOS element being assessed.

B.     Safety issues for which there is not an applicable ATOS element or DCT question.

C.     Safety issues for an air carrier to which the inspector is not assigned.

D.    Specific inspection events as directed by a handbook bulletin or national directive.

E.     Inspections associated with incidental travel from one location to another to perform official business; or if the inspector is not a member of the CMT for the air carrier that operates the aircraft.

RESERVED. Paragraphs 10‑198 through 10‑212.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 6  Design and Performance Assessment Data Reporting

Figure 10‑50, Module 5. Data Reporting for Design and Performance Assessment

10-213 Introduction. The data reporting process module defines the method for transferring data collected by inspectors into the Air Transportation Oversight System (ATOS) database. Safety Attribute Inspection (SAI), Element Performance Inspection (EPI), and Constructed Dynamic Observation Report (ConDOR) and Dynamic Observation Report (DOR) data are used to assess air carrier or applicant system design and performance, and identify any safety issues.

10-214 Enter SAI, EPI, ConDOR, or DOR in Accordance With Data Quality Guidelines. (See flowchart process step 5.1.) SAI activity, EPI activity, ConDOR, or DOR data are reported in the ATOS database. Enter data into the ATOS database within three business days of completing an activity, or as soon as possible. (See Table 10‑1, Data Quality Guidelines.)
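
The three‑business‑day guideline can be illustrated with a small due‑date calculation. This is a sketch only: the function name is invented, it counts weekdays, and it ignores Federal holidays, which the actual data quality guidelines may treat differently.

```python
from datetime import date, timedelta

def reporting_deadline(completed: date, business_days: int = 3) -> date:
    """Illustrative due date for entering activity data: `business_days`
    weekdays after the completion date (weekends skipped; holidays
    ignored in this sketch)."""
    due = completed
    remaining = business_days
    while remaining:
        due += timedelta(days=1)
        if due.weekday() < 5:  # Monday=0 .. Friday=4 count as business days
            remaining -= 1
    return due
```

For example, an activity completed on Tuesday, April 15, 2008 would be due by Friday, April 18, while one completed on Friday, April 18 would not be due until the following Wednesday, April 23.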

A.     Activities Recorded in the ATOS Database. The Certificate Management Team (CMT) or Certification Project Team (CPT) records completed ATOS data collection activities in the ATOS database. The same data should never be entered in both the ATOS database and the Program Tracking and Reporting Subsystem (PTRS). ATOS activities are entered into the ATOS database using SAIs, EPIs, ConDORs, and DORs. Any followup reporting (e.g., enforcement investigations or self-disclosures) is reported in the appropriate system.

B.     Recording the Use of Form 8430‑13, “Request for Access to Aircraft”. Never report the use of Form 8430‑13 for the same en route inspections in both the ATOS and PTRS databases.

1)      ATOS Database. CMT members who perform en route activities that require the use of Form 8430‑13 to access aircraft cockpits or cabins for ATOS data collection should report such activities using the EPI in the ATOS database. Do not document the 8430‑13 information on more than one EPI when performing multiple EPIs concurrently.
2)      Dynamic Observation Report. Inspectors who are not members of the CMT for the air carrier operating the aircraft should record cockpit or cabin en route data using either an “Element-Based DOR” or “Other DOR”. In addition, either of these DORs may be used if the cockpit or cabin en route inspection is incidental to ATOS data collection (e.g., traveling from one location to another to perform official business).
3)      Element-Based DOR. This format is used for recording the use of Form 8430‑13 and reporting any unplanned (or negative) observations for an existing element by answering the appropriate Data Collection Tool (DCT) questions for that element.
4)      Other DOR. This format is used for recording the use of Form 8430‑13 when no negative observations are being reported or to record data that is not related to existing elements or DCT questions. Inspectors must describe their negative observations and resulting actions in the comments field and reporting inspector action taken text block.
5)      Program Tracking and Reporting Subsystem. Inspectors not assigned to a CMT subject to ATOS 1.2 policies and procedures should record the activity only in PTRS.
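
Taken together, the reporting rules in subparagraphs 1) through 5) amount to a small decision tree. The sketch below is a hypothetical illustration only; the function, its parameters, and the return strings are invented for this example and are not part of ATOS automation or any FAA system.

```python
def form_8430_13_destination(cmt_member: bool, incidental: bool,
                             negative_observations: bool,
                             atos_carrier: bool = True) -> str:
    """Illustrative routing of a Form 8430-13 en route activity record.

    A hypothetical reading of the guidance, not an FAA system interface.
    """
    if not atos_carrier:
        # Inspector's CMT is not subject to ATOS 1.2 policies and procedures.
        return "PTRS"
    if cmt_member and not incidental:
        # Planned ATOS data collection by a member of the carrier's CMT.
        return "ATOS EPI"
    # Non-CMT inspector, or an inspection incidental to other official business.
    return "Element-Based DOR" if negative_observations else "Other DOR"
```

Either DOR format carries the Form 8430‑13 record; the Element‑Based format applies only when there are unplanned or negative observations to report against an existing element.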

C.     Data Collection Activities. Data collection may involve a variety of activities over multiple dates at various locations. There should be a sufficient number of activities to answer all the questions and provide data for a thorough assessment by the principal inspector (PI) or the certification project manager (CPM).

1)      The SAI or EPI DCT master record comprises all of these individual SAI or EPI activity reports. A general rule of thumb is that any time the common data field information (date, location, aircraft, etc.) changes, a new activity begins. Most activities are completed in a day.
2)      An activity is a snapshot of the air carrier’s or applicant’s system design or performance at that moment.
3)      It is not necessary to complete an activity report for each document evaluated; a single activity report may be submitted for each set of documents at one location for one day.
4)      Inspectors should only report observed data (what they see), not their interpretation (personal opinion).
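
The rule of thumb in subparagraph 1), that a change in any common data field begins a new activity, can be sketched as a grouping step. The `CommonData` fields and the helper below are illustrative only and do not reflect the actual ATOS record layout.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CommonData:
    """A few illustrative common data fields (the real set is larger)."""
    date: str
    location: str
    aircraft: str = ""

def group_into_activities(records):
    """Group (common_data, observation) pairs into activities: a new
    activity starts whenever the common data field information changes."""
    activities = []
    previous = None
    for common, observation in records:
        if common != previous:
            activities.append((common, []))  # new activity begins
            previous = common
        activities[-1][1].append(observation)
    return activities
```

In this sketch, two observations made on the same date at the same location fall into one activity report, while an observation on a different date or at a different location opens a new one.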

D.    Dynamic Observation Report. DORs record observations outside the comprehensive assessment planning process and are not intended for routine use as a substitute for planned assessments. Managers should closely monitor the use of CMT resources for DORs.

1)      CMT members should use DORs only in the following situations:
a)      Single‑activity observations unrelated to the ATOS element being assessed.
b)      Unplanned observations when there is not an ATOS element or question that addresses the unique situation.
c)      Unplanned observations about a certificate holder the inspector is not assigned to inspect.
d)      Observations directed by a handbook bulletin or other national directive.
2)      Reporting DORs. Two options are available when recording data in a DOR. Inspectors select the most appropriate format based on the nature of the observations or information being recorded. Data quality guidelines apply to both reporting options.
a)      Element‑Based Observation. This format is used to record an unplanned observation for an existing element by answering the appropriate DCT questions for that element.
b)      Other Observations. This format is used to record data that is not related to existing elements or DCT questions. Inspectors can describe their observations and resulting actions in the common data field and text block.
3)      A DOR saved as final is immediately available to all PIs and CMT managers and supervisors of the observed air carrier. It is sent to the data reviewer of the observed air carrier for review and is available for analysis to all Operations Research Analysts (ORA) and PIs. DORs are available to query along with EPI and SAI data.

E.     Constructed Dynamic Observation Reports. The ConDOR is a special‑purpose DOR constructed by the PI or CPM using DCT questions. The frontline manager assigns ConDORs to inspectors. The ConDOR contains instructions to inspect and report on specific areas of immediate concern.

F.      Reporting Data in ConDORs. All questions must be answered using the data quality guidelines. When an inspector submits a ConDOR as final, it is immediately available to all PIs and CMT managers and supervisors. The ConDOR is sent to the data reviewer for evaluation and is available for analysis by ORAs and PIs. ConDORs are available to query along with EPI and SAI data.

G.    Entering Data Collection Activities. The inspector who conducted the data collection activity, or a designated aviation safety assistant (ASA) or aviation safety technician (AST) assigned to the CMT, enters the data into the ATOS database. Automation links all activity records to the reporting inspector. Refer to Table 4‑1, Specific Data Requirements Table, for more information on entries into specific data fields.

1)      Entering Common Data Field Information. Complete all common data fields that are relevant to the activity. Guidance for each common data field is provided in the specific data requirements table at the end of this chapter. After entries are made into the common data fields, the activity report that displays the element‑specific DCT questions for the selected SAI or EPI is accessed. Complete only those questions that can be answered from accomplishing the activity or observation.
2)      Inspector Actions Taken. Whenever a question is answered with a No response, the Inspector Action Taken field associated with that specific data reporting tool question is available and may be used as described below.
a)      Inspectors who observe an unsafe condition that could result in an accident or incident, or a violation of the regulations, must intervene by notifying the appropriate air carrier or applicant personnel and the PI or CPM. Action taken involving an enforcement investigation is tracked in the PTRS. The Enforcement Investigation Report number and the PTRS record identification number should be referenced in the Inspector Action Taken block, and the activity report is saved as final.
b)      Inspectors should not enter a description of the actions they took to complete the particular activity being reported. The intent of this field is not to capture what records were reviewed or what processes were observed.
c)      Actions entered in this field include, but are not limited to (1) notifying the appropriate air carrier or applicant personnel of a potential noncompliance, (2) initiating an enforcement investigation, (3) consulting with air carrier or applicant personnel to effect an action, or (4) notifying the PI or CPM.
3)      Record Responses to Questions. SAI DCT questions are answered with Yes, No, or Not Applicable (NA), or, in the case of an EPI, Not Observed. Inspectors who are unsure of how to respond to a question should do additional research or conduct another activity to make a definitive determination.
a)      Yes Response. The DCT questions are written so that Yes is always a favorable response.

1.      SAI Yes Comments. After selecting a Yes response to an SAI question, inspectors must enter additional information in the Yes comments field. Yes comments are optional on EPIs.

2.      A Yes response to any ConDOR SAI question also requires an explanation or comment.

3.      These comments describe where the information can be found in the system documentation of the air carrier or applicant and how the air carrier or applicant complies with the subject of the question. References should be specific so that the information can be easily located by another inspector.

4.      The comments should be complete and descriptive. The comment field is not intended to capture negative, unsatisfactory, or qualifying (i.e., yes, but) information.

5.      Any negative wording in a Yes comment is inappropriate and probably indicates that the question should have been answered No. The comment field is not intended as a catchall for describing inspection activities.

b)      No Response. The DCT questions are written so that No always indicates an unfavorable response. For each No response, the inspector must provide an explanation that describes the observations causing the negative response. Explanations must be complete and descriptive so that someone knowledgeable within the air transportation industry can understand the comment without requiring additional information. Read the question through and answer it based on just the activity that was performed. The intent is not that a single No answer necessarily equates to an unsafe condition or a regulatory violation, unless that particular DCT question has a regulatory basis.

1.      ATA Code. When reporting a No response on an SAI or EPI activity, the ASI should select the appropriate four‑digit Air Transport Association of America (ATA) code.

2.      Drop‑Down Menu. A No response requires the inspector to select one or more potential problem areas from a drop‑down menu. The inspector must include an explanation in the No comments box for each area selected. If the available choices do not adequately describe your observation, select Other and provide an explanation in the comment block.

3.      No Explanations. No answers require an explanation of the “who, what, where, when, and how” that caused the response. No responses provide valuable information that, when rolled up and analyzed with other similar data, may lead to an increase in oversight even though no regulations were violated. The explanations are captured in the database and are analyzed for trends or patterns to determine if any action is required by the CMT or CPT. Automation prevents users from saving an activity as final until an explanation has been entered for each No response. Inspectors should write explanations in clear, concise, and complete sentences using proper spelling so that other CMT or CPT members can understand the findings without requiring further information. References to the air carrier’s documentation should be entered when appropriate.

c)      Not Applicable Response. Based on the air carrier’s oversight profile, the activity reporting screen should only contain questions that are applicable for that air carrier. If the inspector finds a question that does not apply to that air carrier, each SAI question offers an NA response option. Choosing this option is appropriate only for questions that are not applicable due to the types of operations authorized for the particular air carrier or applicant.

1.      NA means not at all applicable to the air carrier’s or applicant’s operation. It does not mean the elements were not observed. If the question is not relevant to the specific activity or observation, the inspector leaves the question unanswered and completes additional activities or observations. Misuse or overuse of NA corrupts the data.

2.      If the inspector is unsure whether something observed was unsatisfactory or potentially unsatisfactory, the question should not be answered for that activity until the ASI does additional research and plans another activity to make a definitive determination.

d)      Not Observed Response. A Not Observed answer is used only on the last activity that an inspector intends to conduct before saving all EPI activities to the master record. A Not Observed answer means that while conducting the entire EPI, the inspector was not able to observe conditions required to answer the DCT question. This option is not intended to be routinely used in lieu of completing all assigned questions.
e)      Some Questions Require That a Name Be Entered. The name can be an individual, a department, a committee, or a position. Many organizations disperse authority and responsibility. The intent is to identify the highest‑level person who is responsible or has the authority for that particular element of the air carrier’s or applicant’s system.

H.    Entering Data Provided by Air Carriers or Applicants. Data collected by air carriers or applicants during self‑audits using an SAI DCT may be entered into the ATOS database by an inspector assigned to the CPT or CMT, or a designated ASA or AST. Automation provides the ability to identify the source of the data as provided by the air carrier or applicant.

10-215 Save SAI, EPI, DOR, or ConDOR as Draft. (See flowchart process step 5.2.) Inspectors may save activity reports as draft and then final, or directly as final.

A.     ASTs or ASAs who enter data for inspectors, or who enter air carrier‑provided data, must transcribe these observations completely and accurately into the ATOS database. Automation links all inspection records to the air carrier or to the reporting inspector. ASAs and ASTs can only save activity reports in draft status.

B.     SAI or EPI data that are in draft status may be modified or deleted by the reporting inspector.

C.     Data should be saved as draft within three working days of completing the activity.

10-216 Save SAI or EPI Activity, DOR, or ConDOR as Final. (See flowchart process step 5.3.) Upon deciding that the activity report is complete, accurate, and adheres to the data quality guidelines, the inspector saves it in final status.

A.     Data Entry Validation Checks. As the inspector enters the data, first‑level data entry edit and validation checks are applied to the data. Data entry validation in the common data field minimizes data entry errors.

B.     Final Status. Only an inspector can save the activity report as Final. Data can be saved directly to final without having to go through draft status. Once in the final status, no further changes can be made to the activity report.

C.     Save data as final within 5 working days of completing the activity, or as soon as possible.

10-217 Is SAI or EPI Complete? (See flowchart process step 5.4.) The reporting inspector reviews all activity reports (SAI and EPI) to ensure that data quality guidelines and the PI instructions have been followed.

A.     Automation Controls. Automation ensures that all questions on an SAI or EPI record are answered at least once before they can be submitted for data review.

B.     Team Coordinator. In the design assessment reporting process, the SAI team coordinator (TC) reviews all SAI activity reports submitted by the SAI team members. The TC ensures that each activity includes, at a minimum, the activity start date, end date, and location.

10-218 Save SAI or EPI to Master Record. (See flowchart process step 5.5.)

A.     Saving an SAI. After deciding that the SAI record is complete, the TC saves the data to the SAI master record.

B.     Saving an EPI. After deciding the EPI record is complete, the reporting inspector saves the EPI to the master record.

C.     SAI or EPI Saved. Once an SAI or EPI has been saved to the master record, the entire record is then available to the data reviewer.


Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

System

 

(DOR)

DO enter the appropriate system applicable to the observation from the drop‑down list provided for the field.

If the observation that occurred can be related to an ATOS system, select the appropriate system from the drop‑down list.

Example: “1.0 Aircraft Configuration Control.”

Subsystem

 

(DOR)

DO enter the appropriate subsystem applicable to the observation from the drop‑down list provided for the field.

If the observation that occurred can be related to an ATOS subsystem, select the appropriate subsystem from the drop‑down list.

Example: “1.3 Maintenance Organization.”

Element

 

(DOR*)

 

*Applies only to Element‑Based Observation DOR

DO enter the element applicable to the observation from the drop‑down list provided for the field.

If the observation that occurred can be related to an ATOS element, select the appropriate element from the drop‑down list.

Example: “1.3.1 Maintenance Program.”

Air Carrier

 

(DOR)

DO enter the air carrier applicable to the observation from the drop‑down list provided for the field.

The report must be directed at a specific air carrier.

The default value for the air carrier is the air carrier to which you are assigned; however, inspectors can submit a DOR for any air carrier.

 

Select the air carrier’s name from the drop‑down list provided.

 

Only ATOS air carriers are available in the drop‑down list.

PTRS Activity Code

(DOR*)

 

*Applies only to Other Observation DOR

DO enter the appropriate PTRS activity code applicable to the observation from the drop‑down list provided for the field.

If the observation that occurred can be related to a PTRS activity code, select the appropriate code from the drop‑down list. Note: Only 16XX, 36XX, and 56XX data collection codes are available. This field is for analytical purposes only.

Activity Start Date

 

(SAI, EPI, DOR, ConDOR)

DO select today’s date (default) or select month, day, and year from the drop‑down menu, or open the pop‑up calendar from which a date can be selected.

“May 5, 2004”

The appropriate date may be the default date (today’s date) or may be selected from the drop‑down menu or from the pop‑up calendar.

Activity End Date

 

(SAI, EPI, DOR, ConDOR)

DO select today’s date (default) or select month, day, and year from the drop‑down menu or the pop‑up calendar.

“May 5, 2004”

The appropriate date may be the default date (today’s date) or may be selected from the drop‑down menu or from the pop‑up calendar.


Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table (Continued)

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

Departure Point/Location

 

(SAI, EPI, DOR, ConDOR)

DO enter an airport identifier in the departure point/location field for all data collection activities.

If the data collection activity was not conducted on an airport, enter the airport identifier that is closest to the site of the data collection in the Departure Point/Location field.

Departure Point/Location and Arrival Point

 (SAI, EPI, DOR, ConDOR)

DO enter the three‑letter FAA airport identifier for airports within the 50 United States using all capital letters.

SFO for San Francisco International Airport.

 

 

DO enter the four‑letter International Civil Aviation Organization (ICAO) airport identifier for airports outside of the 50 United States using all capital letters.

Use EGLL for the London‑Heathrow airport instead of the LHR Official Airline Guide (OAG) identifier.

 

DO NOT use OAG or carrier‑created identifiers.

This normally applies only outside of the 50 United States. Use MMMX for Mexico City instead of the MEX OAG identifier.

Certified Repair Stations Number

(SAI, EPI, DOR, ConDOR)

DO enter the full Flight Standards designated certificate number of the repair station.

An example of a foreign repair station number is OXEY097L for Aeroelectronica. A domestic repair station number example is XE5R213O for Texas Aero Engine Services.

Aircraft

Registration

Number

 

(SAI, EPI, DOR, ConDOR)

DO enter an aircraft’s full registration number using the drop‑down table if an individual aircraft was involved in the data collection observation.

 

DO include the registration prefix as part of the entry.

N123DL

Some U.S. air carriers may use foreign registered aircraft. For statistical analysis reasons, it could be important to be able to discern what country holds the aircraft’s registration. Valid examples include:

N123DL, United States.

N123AA, United States.

G4321, Great Britain.

Make, Model,

Series

 

(SAI, EPI, DOR, ConDOR)

DO select a Make‑Model‑Series or a Make‑Model from the drop‑down list provided for the field if the activity involved aircraft.

If a particular aircraft was involved as the subject of the data collection or directly involved in the data collection, enter a Make‑Model‑Series from the drop‑down list.

DO NOT enter a Make‑Model‑Series or a Make‑Model if the activity did not involve aircraft.

If the activity was oriented to a fleet of aircraft that include several series of like makes and models, enter just the Make‑Model from the drop‑down list.



Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table (Continued)

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

Flight Number

 

(SAI, EPI, DOR, ConDOR)

DO enter the flight number if a revenue flight was involved in the observation and the reporting inspector was onboard the flight.

Maintenance, training, and administrative nonrevenue flight numbers may be entered if they are known; however, they are not mandatory.

 

DO NOT enter a prefix to the flight number.

A valid flight number entry for an American Airlines flight could be 1247.

An invalid flight number entry for the same American Airlines flight would be AA1247.

The automation knows the carrier was American Airlines because the record is associated with the American Airlines CAP.

Simulator Device

ID

 

(SAI, EPI, DOR, ConDOR)

DO enter the correct “Simulator ID” when a simulator was involved in the data collection.

The correct Simulator ID can be verified by the simulator certificate or by the SIMULATR.DB Paradox table in the FSAS folder located on your local area network.

FAA 8430‑13

Number

 

(SAI, EPI, DOR, ConDOR)

DO enter the 8430‑13 number on an EPI DCT when the purpose of being in the airplane is to collect data for an ATOS assessment.

 

If either the inspector or the air carrier CMT is not subject to ATOS 1.2 policies and procedures, then the activity is recorded in PTRS. The same Form 8430‑13 en route activity should never be reported in both the ATOS and PTRS databases.

DO record the data using a DOR if the cockpit or cabin en route activity generates a negative observation unrelated to planned ATOS data collection, if the activity is incidental to ATOS data collection, or if the inspector is not a member of the CMT for the air carrier operating the aircraft (e.g., traveling from one location to another to perform official business).

Local/Regional/

National Use field

 

(SAI, EPI, DOR,

ConDOR)

DO enter, when applicable or directed, data in the Local/Regional/National Use field.

Use this field to record additional information specific to the inspection activity for tracking and analysis of specific requirements.

Off‑hour Button

(SAI, EPI, DOR,

ConDOR)

DO use the Off‑hour button to indicate an inspection activity performed after normal duty hours, including weekends.

 


Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table (Continued)

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

Response Not

Answered

(Left Blank)

 

(SAI, EPI)

DO schedule another SAI or EPI activity to observe the element question at a later time, if the question’s subject was not observed during the activity and is applicable to the carrier.

If the element question asked, “Were the written procedures adhered to for the Airworthiness Directive Management process?” and no procedures were observed, the response should not be selected and the explanation should be left blank.

Not Observed

(EPI)

DO use Not Observed when you have not been able to observe the situation described by a particular question.

 

For example, an inspector may not observe an intoxicated passenger during an entire passenger handling EPI.

Not Applicable

NA

 

(SAI, EPI, DOR*, ConDOR)

 

*Applies only to Element‑Based Observation DOR

DO enter NA when a particular question does not apply to the air carrier’s operation being evaluated.

NA is an appropriate response if the question does not apply to the air carrier’s type of operation, type of aircraft, or area of operation.

NA

Explanations

 

(SAI, EPI, DOR*, ConDOR)

 

*Applies only to “Element‑Based Observation” DOR

DO explain the reasons for your NA response.

NA is an appropriate response if the question does not apply because of the air carrier’s type of operation, type of aircraft, or area of operation. Enter a factual statement as to why the response was NA (e.g., ABC Airlines is not approved in its OpSpecs to conduct reduced vertical separation minimum operations).

Response Yes

 

(SAI, EPI, ConDOR)

 

 

DO enter Yes to indicate the requirements were met based on what was observed during the activity.

A Yes answer always indicates a positive response. Take great care when determining if the response is positive. If the inspector indicates a positive answer using a qualifier (e.g., “Yes, but…”), this may drive the answer to actually be a No.

Response “Yes”

 

(SAI, ConDOR)

 

A Yes response indicates the operator complies with observed specific regulatory requirements (SRR) and applicable FAA guidance for that element.

 

A Yes response also indicates the applicable safety attributes are incorporated into the operator’s system.

 

A Yes response to any ConDOR SAI question requires an explanation or comment.


Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table (Continued)

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

Response Yes

 

(EPI, ConDOR)

 

A Yes response indicates that the observed system performance measures are accomplished.

A Yes response indicates the observed procedures approved/accepted for the air carrier are being followed.

Yes Comments

 

(SAI, EPI, ConDOR)

 

 

Yes comments are required for SAI questions and SAI questions in a ConDOR.

 

Yes comments are associated with each specific question only, not the entire activity.

 

Yes comments must meet all current data quality guideline dimensions.

Describe how the air carrier complies with the regulatory requirement or FAA guidance.

 

Describe how and where the air carrier documents this requirement in their system.

Response No

 

(SAI, EPI, DOR*, ConDOR)

 

*Applies only to Element‑Based Observation DORs

DO enter No to indicate that the requirements were not met.

The questions are written so that No always indicates a negative response to the question. The significance of a No response depends on the specific DCT question that is being asked.

 

The intent was never that a single No answer would equate to an unsafe condition or a regulatory violation, unless that particular No has a regulatory basis and the inspector observed a possible violation or an unsafe condition.

Response No

 

(SAI, ConDOR)

 

A No response may indicate that the operator does not comply with observed SRRs and/or applicable FAA guidance for that element, or that the operator’s procedures do not incorporate the applicable safety attribute.

 

A No response can also mean that system safety procedures are weak in the area being evaluated or that the operator’s approved or accepted procedures are inadequate.



Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table (Continued)

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

No Explanations

 

(SAI, EPI, DOR*, ConDOR)

 

*Applies only to Element‑Based Observation DOR

DO enter an explanation for all No responses.

An explanation of the “who”, “what”, “where”, “when”, “how”, and “why” that caused the “No” response must be entered.

DO write your explanation so that it answers the question in a responsive way.

The explanation must be pertinent to the question’s intent. The explanation should have a logical, precise relevance to the matter at hand.

DO write your explanation so it is understandable.

The explanation should be plain and comprehensible; written in clear, concise language.

Abbreviations and acronyms used should be commonly understood within the aviation industry.

DO write your explanation so that it is technically correct, reliable, and free of error.

Explanations should be complete and descriptive so that someone knowledgeable within the air transportation industry can understand without requiring further information.

DO include references where appropriate.

The explanation should be grammatically correct, without spelling errors, and written in complete sentences that are punctuated and capitalized correctly. Code of Federal Regulations citations and other references should be included in explanations.

DO make each explanation stand alone.

There is no direct link between the explanation for one question and another. Each explanation must stand alone for effective analysis and reader understanding.

No Explanations

 

(SAI, EPI, DOR*, ConDOR)

 

*Applies only to Element‑Based Observation DOR

DO NOT refer to the explanation for another question.

“See above,” “Same as question 3,” or “Refer to the Tulsa Main Base Report” are all examples of references to avoid.

DO NOT use the explanation field to critique the ATOS process.

The Problem Reporting and Feedback hyperlink is the proper avenue to use for improvement suggestions or critiques.

DO NOT enter opinions in the explanation.

The explanation should be statements of fact or fact‑based conclusions.

DO NOT enter the word None by itself in the explanation field.

Use of spaces, periods, or other characters by themselves to circumvent the requirement for an explanation is not acceptable.



Table 10‑1, Data Quality Guidelines—Specific Data Requirements Table (Continued)

Specific Data Requirements Table

Field

DOs and DO NOTs

Examples and Explanations

ATA Codes (EPI)

 

DO select appropriate Air Transport Association (ATA) codes.

ATA codes should reflect the known primary and secondary aircraft systems that were identified as being related to the principal cause of the No response. Otherwise, leave the codes blank.

Comments field

 

(DOR*, ConDOR)

 

*Applies only to Other Observation DOR

DO enter actual observations.

 

Describe in detail what was observed and include all relevant facts (i.e., who, what, where, when, why, and how, as applicable).

DO NOT enter what actions the inspector conducted during the course of the observation.

Entries must be statements of fact, or fact‑based conclusions from actual observations. Inspectors should not enter a description of what they did to complete the particular inspection activity being reported.

Inspector Action Taken field (this is an optional field)

 

(SAI, EPI, DOR, ConDOR)

DO record actions taken by reporting inspectors as a result of the deficiencies observed.

Actions may include notifying appropriate air carrier personnel of a potential noncompliance, consulting with air carrier or other FAA officials to obtain additional information, or initiating an enforcement investigation.

DO NOT enter a description of what was done during the observation.

Inspectors should not enter a description of what they did to complete the particular inspection activity being reported.

RESERVED. Paragraphs 10‑219 through 10‑233.


Volume 10
Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 7  Design and Performance Assessment Data Review

Figure 10‑51, Module 6. Data Review for Design and Performance Assessment

Module 6. Data Review flowchart

10-234 Introduction. The data review process ensures that quality data is entered into the Air Transportation Oversight System (ATOS) database for decisionmaking. The inspector provides the first level of validation by submitting complete, accurate, and quality data that complies with the data quality guidelines. The second level of data quality validation is supplied by automation. The data reviewer provides final validation and reviews activity records for the Certificate Management Team (CMT) or Certification Project Team (CPT).

A.     Definition of Quality Data. Data are a set of facts that, when compiled, provide information for decisionmaking. Data represent real‑world objects. Data quality is acceptable when the data conform to a defined specification and the specification correctly reflects the intended use.

B.     Benefits of Quality Data. Quality data provide reliable measurements to assess the design and performance of an air carrier’s or applicant’s systems. Quality data help close the gap between the views of the real‑world air carrier or applicant system obtained by direct observation, and the view of the air carrier or applicant system obtained through data in the information system.

C.     Impact of Poor Quality Data. Poor quality data are incoherent and do not reflect real‑world conditions. Even accurate data that are redundant or not interpretable by the user are of little value. Deficient data will be mostly unusable. Poor quality data are costly; their impacts may include increased operational cost, difficulty in setting and executing strategy, and less effective decisionmaking.

D.    Inspector Responsibility for Quality Data. Each reporting inspector is responsible for submitting complete, accurate, and quality data.

E.     Measuring Data Quality. Commonly used characteristics or dimensions to measure data quality include: accuracy, appropriate amount of data, completeness, consistency, ease of understanding, objectivity, relevancy, timeliness, and validity. As with the attributes in ATOS, interdependencies exist between data dimensions.

1)      ATOS controls some data quality dimensions through automation.
2)      Inspectors play an important role by incorporating data dimensions in their reporting. Before submitting an activity record, inspectors review the data to reduce the possibility of the record being returned for corrections.

10-235 Review SAI, EPI, ConDOR, or DOR Data. (See flowchart process step 6.1.) The data quality review occurs after Safety Attribute Inspection (SAI) activity, Element Performance Inspection (EPI), Constructed Dynamic Observation Report (ConDOR), or Dynamic Observation Report (DOR) data have been saved as final.

A.     Data Reviewer. In most cases, data are reviewed by a data evaluation program manager (DEPM) or an alternate DEPM if that position is available for the CMT. For offices without a DEPM, the inspector’s frontline manager conducts the review. During an initial certification of an applicant, if a DEPM has not been assigned, the certification project manager (CPM) reviews the data collected by CPT members.

1)      Data reviewers must be qualified inspectors who are members of the CMT or CPT and have completed baseline training.
2)      Data reviewers should not review their own data.

B.     The data reviewer should evaluate any data saved to the master record within 7 days of its initial availability.

C.     Activity reports are reviewed to determine if the data meet the data quality guidelines. The specific requirements for ensuring that the highest quality data are reported to the ATOS database can be found in the Specific Data Requirements Table (Table 10‑1) at the end of the Data Reporting section. The standards for data review are located in the Data Dimensions Table (Table 10‑2) at the end of this section.

D.    The data reviewer determines the level of review. The reviewer may lessen the depth of review if the data quality level is sufficient. To minimize the depth of data review, greater focus should be placed on quality assurance (sampling records) than on quality control (reviewing every record).

10-236 Does Data Content Meet Data Quality Guidelines? (See flowchart process step 6.2.) The data reviewer determines if data meet the data quality guidelines. Data reviewers work with the CMT or CPT, especially principal inspectors (PI) or CPM, to develop, implement, and evaluate office processes to ensure that the activity records meet the ATOS data quality guidelines. Data reviewers should alert the respective PI or CPM immediately if any critical or time‑sensitive information is found during the data review.

10-237 Save Data to the ATOS Database. (See flowchart process step 6.3.) If the data meet the defined data dimensions and specific data requirements, the data reviewer saves the data to the ATOS database. The data are then ready for analysis and assessment.

10-238 Return and Provide Feedback on Data. (See flowchart process step 6.4.) If the data reviewer determines that the data do not meet the ATOS data quality guidelines, he or she records specific feedback citing reasons. The data revert to draft status and are returned to the reporting inspector. The data reviewer coordinates a resolution of any data discrepancies in the inspection record with the reporting inspector.

10-239 Is There Inspector Agreement? (See flowchart process step 6.5.) The reporting inspector evaluates the returned inspection data and reviewer feedback and decides if he or she agrees with the data reviewer. If the inspector agrees with the feedback, he or she must decide on the appropriate action (e.g., editing the record, conducting additional observations, or taking no action) based on the nature of the issue.

10-240 Is it a Data Collection or a Data Reporting Issue? (See flowchart process step 6.6.) It is important to identify the cause of the discrepancies in the data to determine whether the issue is insufficient data collection or inaccurate data reporting.

A.     If the discrepancy is a data collection issue, the inspector collects more data using the appropriate Data Collection Tool, records the additional data, and saves the record as final using the process described in the Data Collection and Data Reporting sections.

B.     If the discrepancy is a data reporting issue, the inspector corrects the data record and saves the record as final using the process described in the Data Reporting section.

10-241 Initiate and Follow Process to Resolve Differences. (See flowchart process step 6.7.) If, after reading the data reviewer’s feedback and conferring with the data reviewer, the inspector still believes that the data conform to the applicable data dimensions, the activity record is retained in its original form and the issue is elevated to the inspector’s frontline manager.

A.     The frontline manager considers the opinion and judgment of the employee and the technical guidance provided.

B.     When management changes or edits an employee’s data due to a professional difference of opinion, the file will be annotated to reflect who made the change and when. The provisions of the collective bargaining agreement will be followed.

10-242 Saving Data to the Master Record. (See flowchart process step 6.8.) After the issue is resolved, the data reviewer saves the SAI or EPI data to the master record. The master record is saved to the ATOS database. Design and performance assessment data that have been saved to the ATOS database are available for analysis and assessment by all CMT or CPT members.
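The data review flow in paragraphs 10-236 through 10-242 can be summarized as a simple decision procedure. The sketch below is purely illustrative; the function and record field names are hypothetical and are not part of any actual ATOS software.

```python
# Illustrative sketch of the data review decision flow (process steps 6.2-6.8).
# All names are hypothetical; this is not ATOS software.

def review_activity_record(record, meets_guidelines, inspector_agrees,
                           is_collection_issue):
    """Walk one activity record through the data review decision flow."""
    if meets_guidelines(record):                  # step 6.2: data quality check
        return "saved_to_atos_database"           # step 6.3
    record["status"] = "draft"                    # step 6.4: return with feedback
    if inspector_agrees(record):                  # step 6.5
        if is_collection_issue(record):           # step 6.6
            return "collect_more_data"
        return "correct_and_resave"
    return "elevate_to_frontline_manager"         # step 6.7

# Example: a record that fails review, where the inspector agrees the
# problem was a reporting (not collection) issue.
outcome = review_activity_record(
    {"status": "final"},
    meets_guidelines=lambda r: False,
    inspector_agrees=lambda r: True,
    is_collection_issue=lambda r: False,
)
print(outcome)  # correct_and_resave
```

The callables stand in for the human judgments described in the text; the point is only that each outcome in the flowchart maps to exactly one branch.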



Table 10‑2, Data Quality Guidelines—Data Dimensions Table

Accuracy (SAI, EPI, DOR, ConDOR)
Definition: Data must be technically correct, reliable, and error free.
Measurement Examples:
- All explanations and comments should be in complete sentences and spelled correctly.
- Code of Federal Regulations citations and other references should be included, where appropriate.

Appropriate Amount of Data (EPI)
Definition: PIs may recommend a minimum number, location, and scope of inspection activities. The number of activities required to properly assess a given element may vary considerably. Perform enough activities to accurately answer the questions on the DCT. It may not be reasonable to perform enough activities to ensure a specific statistical level of confidence. The activities conducted should vary across time and location to obtain a sufficient number of quality observations to reflect the performance of the system element.
Measurement Examples:
- The number of activities required to answer all EPI questions varies depending on the complexity of the air carrier system, the size of the air carrier, and other factors.
- The reporting inspector should follow the PI’s instructions that pertain to the minimum number of activities and the scope (time, location, etc.) of the inspection.

Appropriate Amount of Data (SAI)
Definition: Answer each SAI question once to evaluate the adequacy of the system design.
Measurement Examples:
- SAI team coordinators should work with team members to plan inspection activities and ensure that each DCT question is answered once during the course of the inspection.
- Although multiple activities may be required to complete an SAI, team members should avoid multiple responses to individual SAI questions.

Appropriate Amount of Data (DOR)
Definition: Each DOR consists of a single activity observation. If an observation consists of multiple findings related to the same system, subsystem, or element, complete a single DOR. If an observation consists of multiple findings relating to several different systems, subsystems, or elements, complete a new DOR for each separate finding.
Measurement Examples:
- Record a single‑activity unplanned observation that is unrelated to the ATOS element being inspected.
- Report a single‑activity unplanned observation where there is no ATOS element or question that addresses the unique situation.
- Report a single‑activity unplanned observation on specific inspection events as directed by handbook bulletin or other national directive.


Table 10‑2. Data Quality Guidelines—Data Dimensions Table (Continued)

Appropriate Amount of Data (ConDOR)
Definition: The ConDOR is a special‑purpose DOR constructed by PIs with instructions to inspect and report on specific areas of immediate concern.
Measurement Examples:
- Follow PI instructions to inspect and report on a specific area of immediate concern.
- All questions must be answered.

Completeness (SAI, EPI, DOR, ConDOR)
Definition: Data must be of sufficient breadth, depth, and scope for the task at hand. All necessary and relevant data are captured to show a complete picture of the situation.
Measurement Examples:
- Enter all applicable common data field information.
- At a minimum, every activity must include the activity start date, activity end date, and departure point/location.
- If the activity involved an individual aircraft, the registration number and the make, model, and series must be entered.
- If the activity involved an aircraft fleet, the make and model must be entered.
- If the activity involved an aircraft flight, the arrival point, departure point, flight number, and 8430‑13 number must be entered.
- Explanations must include the “who”, “what”, “where”, “when”, and “how” to describe the observation.
- Observations on an SAI, EPI, DOR, or ConDOR that result in a No response due to an unsafe condition or possible regulatory noncompliance require action by the observing inspector, which must be reported in the Reporting Inspector Action Taken text block.
- Element‑based observation DORs must include a response to at least one question with an explanation or comment, if applicable.
- Other observation DORs, other than those identified as “en route DOR only, no findings,” must include a complete description of the observed condition in the Comment block.
- ConDORs must include a response to all questions. An explanation or comment is required for both Yes and No responses to ConDOR SAI questions.
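The completeness requirements above lend themselves to a simple field check. The sketch below is illustrative only; the field names are hypothetical, chosen to mirror the minimum items in the table (activity start date, activity end date, departure point/location), and do not reflect an actual ATOS record layout.

```python
# Illustrative completeness check mirroring Table 10-2's minimum data fields.
# Field names are hypothetical, not an actual ATOS record layout.

REQUIRED_ALWAYS = ("activity_start_date", "activity_end_date", "departure_point")

def missing_fields(record: dict) -> list:
    """Return required fields that are absent or empty in an activity record."""
    missing = [f for f in REQUIRED_ALWAYS if not record.get(f)]
    # Per the table, an individual-aircraft activity also needs the
    # registration number and the make, model, and series.
    if record.get("involves_individual_aircraft"):
        for f in ("registration_number", "make_model_series"):
            if not record.get(f):
                missing.append(f)
    return missing

record = {"activity_start_date": "2008-04-15", "departure_point": "DCA",
          "involves_individual_aircraft": True,
          "registration_number": "N123AB"}
print(missing_fields(record))  # ['activity_end_date', 'make_model_series']
```

A real implementation would extend the conditional checks to fleets and flights in the same pattern.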



Table 10‑2. Data Quality Guidelines—Data Dimensions Table (Continued)

Consistency (SAI, EPI, DOR, ConDOR)
Definition: The data should be presented in the same format and be compatible with previous data.
Measurement Examples:
- EPI/DOR/ConDOR: Responses, explanations, and comments within the activity report should not conflict with other responses, explanations, and comments within the same activity report.
- SAI: Responses, explanations, and comments within the activity report should not conflict with other responses, explanations, and comments within the same activity report or any other activity report within the same inspection record.

Ease of Understanding (SAI, EPI, DOR, ConDOR)
Definition: Data must be clear and comprehensible.
Measurement Examples:
- All explanations and comments should be written in clear, concise language.
- Any abbreviations or acronyms left undefined should be commonly understood within the aviation industry.
- The reviewer must be able to read and understand what the explanation or comment means.
- Explanations and comments must be complete and descriptive, so that someone knowledgeable within the air transport industry can understand them without requiring further information.

Objectivity (SAI, EPI, DOR, ConDOR)
Definition: Data must be unbiased.
Measurement Examples:
- Explanations must be statements of fact or fact‑based conclusions drawn from actual observations, rather than inspector opinions.

Relevancy (SAI, EPI, DOR, ConDOR)
Definition: The data should be valid and applicable to the observation or question being answered.
Measurement Examples:
- The response, explanation, or comment should directly relate to the specific question asked and the Yes, No, or NA response selected for that question.
- The methodology used to collect the data was appropriate.
- Explanations and comments should not include administrative information (e.g., “James Doe completed initial operating experience satisfactorily.”).


Table 10‑2. Data Quality Guidelines—Data Dimensions Table (Continued)

Timeliness (SAI, EPI, DOR, ConDOR)
Definition: The age of the data must be appropriate for the task at hand. The inspection record should not be left open as a means to collect information that may present itself in the future.
Measurement Examples:
- Most activities should normally open and close in a single day.
- The inspection data should be entered into the activity report and saved to final status as soon as practical after the activity is complete.
- SAI and EPI data collection activities should be completed within the timeframes specified by the PI or CPM in the comprehensive assessment plan.
- Since DORs record single‑activity observations, they should generally be completed in one day.
- The reporting inspector should adhere to the timelines in any SAI or EPI instructions provided by the principal inspector.

Value Added (SAI, EPI, DOR, and ConDOR)
Definition: Data should be beneficial and provide advantages from their use.
Measurement Examples:
- The word “None” must not be entered as an explanation.
- Each explanation and comment must stand alone and not refer to the response for another question (e.g., “See above” or “Same as question 3”).
- Inspectors should not enter a description of what they did to complete the particular inspection activity being reported.
- DORs and ConDORs should be used only to report an observation that the inspector makes. DORs are also used for ATOS data collection (e.g., while traveling from one location to another on official business) or when the inspector is not a member of the CMT for the air carrier operating the aircraft.

RESERVED. Paragraphs 10‑243 through 10‑257.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 8  Design Assessment Analysis

Figure 10‑52, Section 8 Design Assessment Analysis

Module 7. Design Assessment Analysis flowchart

10-258 Introduction. The analysis and assessment process module for design assessment determines if the air carrier’s or applicant’s system design meets the standards for acceptance or approval. The process uses data collected by Certificate Management Team (CMT) or Certification Project Team (CPT) members. The principal inspector (PI) or certification project manager (CPM) may use data from other sources to help make this bottom‑line design assessment.

10-259 Analyze SAI Data by Element. (See flowchart process step 7.1.)

A.     After the collected Safety Attribute Inspection (SAI) data have undergone a data quality review, the PI or CPM, with assistance from the Operations Research Analyst (ORA), analyzes the SAI data by element to determine whether the air carrier’s or applicant’s system element design meets the requirements for initial or continued FAA approval or acceptance.

B.     This analysis involves looking at responses to SAI questions for that element, including No responses and explanations, Yes responses and comments, responses by question category and drop‑down menu subjects, questions responded to as not applicable, and text entered in the inspector action taken box.

1)      A Yes response means that the inspector observed conditions at the air carrier or applicant, or in its documentation, that meet the criteria specified by the Data Collection Tool (DCT) question.
2)      A No response means that the inspector observed conditions at the air carrier or applicant, or in its documentation, that do not meet the criteria specified by the DCT question.
3)      Drop‑down menus are available when DCT questions are answered No. Selection of one or more options should be used to narrow down the suspected cause of the identified deficiency.
4)      A Not Applicable answer means that the DCT question does not apply to the evaluated air carrier because of the type of operation, aircraft, area of operation, etc.

10-260 Assess Data in the ATOS Data Analysis Package Data by Element (Conduct Risk Assessment Meeting, If Required). (See flowchart process step 7.2.) SAI data that have been analyzed and assessed for the current design assessment are compared to all historical SAI and other data for that element.

A.     The PI or CPM uses the Air Transportation Oversight System (ATOS) data analysis package in the Safety Performance Analysis System (SPAS) to consolidate and summarize other relevant data needed to properly analyze and assess that element. By methodically assessing the data, the PI or CPM can make a bottom‑line assessment of system design.

1)      For existing air carriers that desire to add or change a program, historical SAI data provide a useful source of information to consider prior to approving or accepting the program.
2)      For new applicants, no historical SAI data may exist to aid the CPT in determining program adequacy. Initial SAI data submission, along with the information from the ATOS data analysis package may be the only sources of information available for analysis.

B.     Other reports, provided by the ORA or through SPAS, can be constructed in various ways to include the specific data necessary to more effectively complete the analysis and assessment process. The PI or CPM may request that the ORA generate additional analyses and reports to allow for adequate assessment of the element.

C.     The PI or CPM may convene a design assessment meeting with other CMT or CPT members, as required, to review data related to the element and aid in making the bottom‑line design assessment. This meeting may be held in conjunction with a performance risk assessment meeting. The PI or CPM can use the results of the design assessment meeting in decisionmaking, planning, and evaluating air carrier or applicant actions.

10-261 Does the Element Meet the Requirement for Acceptance or Approval? (See flowchart process step 7.3.) After assessing the ATOS data analysis package with input from other CMT or CPT members, as required, the PI or CPM determines whether the air carrier system design for that element meets the requirements for either continued approval or acceptance, or initial approval or acceptance.

10-262 Document Accepted/Approved Design Assessment. (See flowchart process step 7.4.) After completing the bottom‑line assessment, the PI or CPM documents the decision to accept or approve the air carrier system design. (See Table 10‑3.) The PI or CPM includes the rationale for the decision and also notes any issues or concerns.

10-263 Document Design Assessment Rejected. (See flowchart process step 7.5.) If the PI or CPM determines that the applicant’s or air carrier’s system design does not meet the requirements for approval or acceptance, the PI or CPM documents the decision to reject the system. The PI or CPM includes the rationale for the decision and also notes any issues or concerns.



Table 10‑3, Bottom‑line Design Assessment Categories

1. Design Accepted/Approved: No issues observed; no action required.
2. Design Accepted/Approved: Minor issues observed; no action required.
3. Design Accepted/Approved: Minor issues observed; mitigation required.
4. Design Accepted/Approved: Major issues observed; mitigation required.
5. Design Accepted/Approved: Safety and/or regulatory issues observed; mitigation required.
6. Design Rejected: Persistent, systemic safety and/or regulatory issues observed; system reconfiguration by the air carrier or applicant required.

Legend: Green, categories 1–2; Yellow, categories 3–5; Red, category 6.
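The six categories and their color bands can be captured in a small lookup. The sketch below only illustrates the structure of Table 10‑3; the names are hypothetical and it is not part of any FAA tool.

```python
# Illustrative encoding of Table 10-3 (bottom-line design assessment).
# Hypothetical helper; not part of any FAA system.

DESIGN_ASSESSMENT = {
    1: ("Design Accepted/Approved", "No issues observed", "No action required"),
    2: ("Design Accepted/Approved", "Minor issues observed", "No action required"),
    3: ("Design Accepted/Approved", "Minor issues observed", "Mitigation required"),
    4: ("Design Accepted/Approved", "Major issues observed", "Mitigation required"),
    5: ("Design Accepted/Approved",
        "Safety and/or regulatory issues observed", "Mitigation required"),
    6: ("Design Rejected",
        "Persistent, systemic safety and/or regulatory issues observed",
        "System reconfiguration required"),
}

def color_band(category: int) -> str:
    """Map a category number to its legend color (green 1-2, yellow 3-5, red 6)."""
    if category in (1, 2):
        return "green"
    if category in (3, 4, 5):
        return "yellow"
    if category == 6:
        return "red"
    raise ValueError(f"unknown category: {category}")

print(color_band(4))  # yellow
```

The performance assessment categories in Table 10‑4 follow the same one‑to‑six structure and could be encoded the same way.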

RESERVED. Paragraphs 10‑264 through 10‑278.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 9  Design Assessment Action Determination and Implementation

Figure 10‑53, Module 8. Design Assessment Action Determination and Implementation

Module 8. Design Assessment Action Determination and Implementation flowchart

10-279 Introduction. The action determination and implementation process module requires the principal inspector (PI) or certification project manager (CPM) to determine and document one or more appropriate courses of action in response to the bottom‑line design assessment. Actions may be required even if the design of the air carrier’s system meets the requirements for approval or acceptance. The PI or CPM must take action when the air carrier system design does not meet the requirements for approval or acceptance and the system is rejected.

10-280 System Design Accepted. If the design of an air carrier’s system meets all the regulatory and quality requirements and safety standards, the PI or CPM may accept or approve it, as applicable, and take no further action (other than issuing applicable operations specifications and approving system documentation).

A.     Is Mitigation Required? (See flowchart process step 8.1.) The PI or CPM determines whether the program or system design can be accepted or approved without mitigation based on the program or system design meeting regulatory requirements. The air carrier or applicant system may be approvable or acceptable but still have some weakness that concerns the PI or CPM. Acceptance or approval of the system with mitigation might also be appropriate when the system meets the explicit requirements of the regulations, but not their intent. The PI or CPM may determine that additional data collection, monitoring, or other mitigation is needed.

B.     Document That Action Is Not Required. (See flowchart process step 8.2.) Once the PI or CPM determines that the air carrier’s system design meets all the regulatory requirements and safety standards, and decides that no action is required, the decision is documented. No explanation is required.

C.     Is Enforcement Action Required? (See flowchart process step 8.3.) The PI determines if a potential violation of an FAA regulation is involved that may lead to an enforcement action. Enforcement action is required if an air carrier is, or has been, conducting operations contrary to applicable FAA regulations. The PI follows the procedures outlined in FAA Order 2150.3, Compliance and Enforcement Program, if enforcement action is required.

D.    Has a Systemic Hazard Been Identified? (See flowchart process step 8.4.) Before accepting or approving an air carrier or applicant system, with mitigation, the PI or CPM must determine the extent of the deficiencies and identify hazards that may exist.

1)      Some hazards are isolated incidents that can typically be attributed to performance and do not necessarily require system‑level changes, but they may be indications of systemic hazards.
2)      Systemic hazards are those that indicate defects in the design of the air carrier’s processes (e.g., missing procedures, poor controls, lack of attention to interfaces), patterns of repeated noncompliance with procedures, or significant changes in the operating environment. Controlling or eliminating systemic hazards requires modifications to the system design.

E.     Is a SAT Required? (See flowchart process step 8.5.) If a systemic hazard has been identified, the PI or CPM determines whether a System Analysis Team (SAT) would be beneficial. The SAT is formed at the discretion of the PI or CPM when further analysis is required to determine the cause of a systemic problem. The SAT can include participants from the CMT or CPT, other Federal Aviation Administration (FAA) personnel, airline and/or manufacturer representatives, and industry personnel to perform further analysis and determine cause. The SAT can be appropriate when the PI or CPM chooses to collaborate with the carrier or industry in identifying system deficiencies.

F.      Convene SAT. (See flowchart process step 8.6.)

1)      Composition of SAT. The PI or CPM determines the composition of the SAT depending on the nature of the issue. The PI or CPM should request input from the certificate holder regarding SAT composition.
2)      Request for Participation. The PI or CPM contacts personnel from the certificate holder and the FAA to request their participation on the SAT. The certificate holder coordinates the participation with non‑FAA participants, such as manufacturer representatives or other industry personnel.

G.    Is Risk Management Process Required? (See flowchart process step 8.7.) The PI determines whether it is necessary to initiate the ATOS risk management process (RMP) to address the specific systemic hazard. The RMP may be used to address any hazard that the PI or CPM decides is significant enough to justify more extensive analysis and tracking. Other possible considerations are the need for a formal action plan, participation of air carrier personnel, timeliness of required actions, regional or national significance, or the output of other tools such as the decision aid used to evaluate air carrier changes.

H.    Document Need for Risk Management Process. (See flowchart process step 8.8.) If the PI determines that the RMP is required to address the hazard, the decision and explanation is documented.

I.       Document Need for Performance Assessment or ConDOR. (See flowchart process step 8.9.) If a systemic hazard has not been identified, or if a decision is made to not initiate the RMP, but issues exist, the PI or CPM documents the need for a performance assessment or Constructed Dynamic Observation Report (ConDOR) to monitor the air carrier system or gather additional information. The decision to take this approach is documented and explained.

10-281 System Design Rejected. The PI or CPM must take action when the air carrier’s or applicant’s system design does not meet the requirements for approval or acceptance and the system is rejected. The PI or CPM documents the action taken and the rationale.

A.     Are Operations Specifications Modifications Required? (See flowchart process step 8.10.) The air carrier may be required to modify its system, or the FAA may modify its authorizations. If changes are required to the operations specifications (OpSpecs) issued to the air carrier or applicant, the PI or CPM follows the procedures in Title 14 of the Code of Federal Regulations part 119.

B.     Is Enforcement Action Required? (See flowchart process step 8.11.) The PI determines if a potential violation of an FAA regulation is involved that may lead to an enforcement action. Enforcement action is required if an air carrier is, or has been, conducting operations contrary to applicable FAA regulations. The PI follows the procedures outlined in FAA Order 2150.3, Compliance and Enforcement Program, if enforcement action is required.

C.     Document Need for System Reconfiguration. (See flowchart process step 8.12.) If the PI or CPM determines that the air carrier’s or applicant’s system design does not meet the requirements for approval or acceptance, the PI or CPM requires the air carrier or applicant to reconfigure its system design and resubmit its request for a new or updated scope of operation. The action and explanation are documented, and the air carrier or applicant is notified.

10-282 Completion of Design Assessment. Once the PI or CPM determines and documents the action he or she intends to implement, the design assessment is complete. If this occurs within 30 days of the end of the quarter in which the design assessment is due, the assessment is considered to have been completed by the due date. The PI or CPM notifies the air carrier or applicant of the results of the design assessment verbally or in a closeout letter. The PI may also use this letter to notify the air carrier of ongoing design assessment results that occur in the normal planning cycle.

RESERVED. Paragraphs 10‑283 through 10‑297.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 10  Performance Assessment Analysis and Assessment

Figure 10‑54, Module 7. Performance Analysis and Assessment

 

Module 7. Performance Assessment Analysis flowchart

10-298 Introduction. The analysis and assessment process module for performance assessment determines if the air carrier’s or applicant’s system performs as intended by regulations in such a way that it controls environmental hazards. The process uses data collected by Certificate Management Team (CMT) or Certification Project Team (CPT) members. The principal inspector (PI) or certification project manager (CPM) may use data from other sources to make this bottom‑line performance assessment.

10-299 Analyze EPI Data by Element. (See flowchart process step 7.1.) After the collected Element Performance Inspection (EPI) data have undergone a data quality review, the PI or CPM, with assistance from the operations research analyst (ORA), analyzes the EPI data by element to determine if the air carrier is following its system and meeting the established performance measures.

A.     This analysis involves looking at responses to EPI questions for that element, including No responses and explanations, Yes responses and comments, responses by question category and drop‑down menu subjects, questions responded to as Not Applicable or Not Observed, and text entered in the inspector action taken box.

B.     Before analyzing or assessing element performance, it is essential to understand the meaning of the various possible responses to each element’s EPI questions.

1)      A Yes response means that the inspector observed conditions at the air carrier or applicant, or in its documentation, that meet the criteria specified by the Data Collection Tool (DCT) question.
2)      A No response means that the inspector observed conditions at the air carrier or applicant, or in its documentation, that do not meet the criteria specified by the DCT question.
3)      Drop‑down menus are available when DCT questions are answered No. Selection of one or more options is used to narrow down the suspected cause of the identified deficiency.
4)      A Not Applicable answer means that the DCT question does not apply to the air carrier being evaluated because of the type of operation, type of aircraft, area of operation, etc.
5)      A Not Observed answer means that while conducting the entire EPI, the inspector was not able to observe conditions required to answer the DCT question.

10-300 Assess ATOS Analysis Data by Element (Conduct Risk Assessment if Required). (See flowchart process step 7.2.) Once the EPI data for the current performance assessment have been analyzed by element, an analysis of these data in relation to other historical EPI and other data available for that element is conducted.

A.     This detailed analysis requires the use of the Air Transportation Oversight System (ATOS) data analysis package that consolidates and summarizes all other relevant data needed to properly analyze and assess that element. By methodically assessing the data, the PI or CPM can make a bottom‑line assessment of system performance.

1)      When conducting the analysis and assessment of the information contained within the ATOS data analysis package, it is beneficial to compare these findings to any pertinent information gathered while performing the EPI analysis.
2)      For existing air carriers who desire to add or change a program, historical EPI data also help provide a useful source of information for consideration prior to assessing system design for the added or changed program.
3)      For new applicants, no historical EPI data may exist to aid the CPT in assessing system performance.

B.     Other reports, provided by the ORA or through the Safety Performance Analysis System, can be constructed in various ways to include the specific data necessary to more effectively complete the analysis and assessment process. The PI or CPM may request that the ORA generate additional analyses and reports to allow for adequate assessment of the element.

C.     The PI or CPM may convene a performance risk assessment meeting with other CMT or CPT members, as required, to review the performance data and to aid in making the bottom‑line assessment of the system performance. This may be held in conjunction with the design risk assessment meeting. The PI or CPM can use the results of the performance risk assessment meeting in decisionmaking, action planning, completion of the next Air Carrier Assessment Tool, and evaluating air carrier or applicant actions.

10-301 Is Element Performance Affirmed? (See flowchart process step 7.3.) After assessing the ATOS data analysis package with input from other CMT or CPT members, the PI or CPM determines whether the air carrier system performance for that element is affirmed. The PI or CPM considers the data collected that are specific to the element as well as other applicable information from the ATOS data analysis package and design or performance risk assessment meeting results.

10-302 Document Affirmed Performance Assessment. (See flowchart process step 7.4.) Once the PI or CPM completes the bottom‑line assessment, he or she documents the decision to affirm performance. (See Table 10‑4.) The PI or CPM includes the rationale for the decision and also notes any issues or concerns.

10-303 Document Performance Not Affirmed. (See flowchart process step 7.5.) In some cases, the performance data indicate that the air carrier’s system is not performing as intended. If the PI or CPM determines that the applicant’s or air carrier’s system performance is not affirmed, the PI or CPM documents the decision. The PI or CPM includes the rationale for the decision and also notes any issues or concerns.


Table 10‑4, Bottom‑line Performance Assessment Categories

1. Performance Affirmed: No issues observed; no action required.
2. Performance Affirmed: Minor issues observed; no action required.
3. Performance Affirmed: Minor issues observed; action required.
4. Performance Affirmed: Issues of concern observed; action required.
5. Performance Not Affirmed: Safety and/or regulatory issues observed; action required.
6. Performance Not Affirmed: Persistent, systemic safety and/or regulatory issues observed; system reconfiguration by the air carrier or applicant required.

Legend: Green, categories 1–2; Yellow, categories 3–5; Red, category 6.

RESERVED. Paragraphs 10‑304 through 10‑318.


Volume 10 Air Transportation Oversight System

CHAPTER 2 Procedures for Design and Performance Assessment

Section 11  Performance Assessment Action Determination and Implementation

Figure 10‑55, Module 8. Performance Action Determination and Implementation

 

10-319 Introduction. The action determination and implementation process requires the principal inspector (PI) or certification project manager (CPM) to determine and document one or more appropriate courses of action in response to the bottom‑line performance assessment. Actions may be required even if the air carrier’s system performance has been affirmed. When the air carrier system performance is not affirmed, the PI or CPM must take action.

10-320 System Performance Affirmed. If the air carrier’s system meets all the performance standards, it is affirmed and the PI or CPM takes no further action.

A.     Is Additional Action Required? (See flowchart process step 8.1.) The air carrier’s or applicant’s system performance may be affirmed overall, but still indicate some isolated or minor problems. The PI may determine that additional data collection or monitoring or other action is needed.

B.     Document That Action is Not Required. (See flowchart process step 8.2.) Once the PI or CPM determines that the air carrier’s or applicant’s system performance is affirmed, and decides that no action is required, the decision is documented.

C.     Is Enforcement Action Required? (See flowchart process step 8.3.) The PI or CPM determines if a potential violation of a Federal Aviation Administration (FAA) regulation is involved that may lead to an enforcement action. Enforcement action is required if an air carrier is, or has been, conducting operations contrary to applicable FAA regulations. The PI follows the procedures outlined in FAA Order 2150.3, Compliance and Enforcement Program, if enforcement action is required.

D.    Has a Systemic Hazard Been Identified? (See flowchart process step 8.4.) The PI or CPM identifies any systemic hazard in the program that may need to be addressed. Hazards are conditions, events, or circumstances that could lead or contribute to an unplanned or undesired outcome.

1)      Some hazards are isolated incidents that can typically be attributed to performance and do not require system‑level changes, but they may be indications of systemic hazards.
2)      Systemic hazards are those that indicate defects in the design of the air carrier’s processes (e.g., missing procedures, poor controls, lack of attention to interfaces), patterns of repeated noncompliance with procedures, or significant changes in the operating environment. Controlling or eliminating systemic hazards requires modifications to the system design.

E.     Is a SAT Required? (See flowchart process step 8.5.) If a systemic hazard has been identified, the PI or CPM determines whether a System Analysis Team (SAT) would be beneficial. The SAT is formed at the discretion of the PI or CPM when further analysis is required to determine the root cause of a systemic problem. The SAT can include participants from the Certificate Management Team (CMT) or Certification Project Team (CPT), other FAA personnel, airline and/or manufacturer’s representatives, and other industry personnel. The SAT can be appropriate when the PI or CPM chooses to collaborate with the air carrier or industry in identifying system deficiencies.

F.      Convene System Analysis Team. (See flowchart process step 8.6.)

1)      Composition of SAT. The PI or CPM determines the composition of the SAT depending on the nature of the issue. The PI or CPM should request input from the certificate holder regarding SAT composition.
2)      Request for Participation. The PI or CPM contacts personnel from the certificate holder and the FAA to request their participation on the SAT. The certificate holder coordinates the participation with non‑FAA participants, such as manufacturers’ representatives or other industry personnel.

G.    Is Risk Management Process Required? (See flowchart process step 8.7.) The PI or CPM determines whether it is necessary to initiate the Air Transportation Oversight System risk management process (RMP) to address the systemic hazard. The RMP may be used to address any hazard that the PI or CPM decides is significant enough to justify more extensive analysis and tracking. Other possible considerations are the need for a formal action plan, participation of air carrier personnel, timeliness of required actions, regional or national significance, or the output of other tools such as the decision aid used to evaluate air carrier changes.

H.    Document Need for Risk Management Process. (See flowchart process step 8.8.) If the PI determines that the RMP is required to address the hazard, the decision and explanation are documented.

I.       Document Need for Design Assessment, Performance Assessment, or Constructed Dynamic Observation Reports (ConDOR). (See flowchart process step 8.9.) If a systemic hazard has not been identified, or if a decision is made to not initiate the RMP, the PI or CPM documents the need for a design assessment, performance assessment, or ConDOR to monitor the air carrier system or gather additional information. The decision to take this approach is documented.
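The branching in steps 8.3 through 8.9 can be outlined as a short sketch. This is a hypothetical illustration of the decision sequence only; the actual determinations rest on PI or CPM judgment, and the function name and parameters are assumptions for this example.

```python
# Hypothetical sketch of the decision flow in steps 8.3-8.9; the real
# determination rests on inspector judgment, not automation.
def determine_actions(potential_violation: bool,
                      systemic_hazard: bool,
                      sat_beneficial: bool,
                      rmp_warranted: bool) -> list[str]:
    actions = []
    if potential_violation:                      # step 8.3
        actions.append("Initiate enforcement action per FAA Order 2150.3")
    if systemic_hazard:                          # step 8.4
        if sat_beneficial:                       # steps 8.5-8.6
            actions.append("Convene System Analysis Team (SAT)")
        if rmp_warranted:                        # steps 8.7-8.8
            actions.append("Initiate risk management process (RMP)")
        else:                                    # step 8.9
            actions.append("Document need for design/performance assessment or ConDOR")
    else:                                        # step 8.9
        actions.append("Document need for design/performance assessment or ConDOR")
    return actions
```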

10-321 System Performance is Not Affirmed. If the air carrier’s or applicant’s system performance is not affirmed, the PI or CPM must determine if the deficiency is due to a systemic problem.

A.     Is System Reconfiguration Required? (See flowchart process step 8.10.) When the air carrier’s or applicant’s system performance is not affirmed due to a systemic problem, the PI or CPM must take action. The air carrier may be required to modify their system or the FAA may modify their authorizations. The PI or CPM documents the action taken and the rationale.

B.     Are Operations Specifications Modifications Required? (See flowchart process step 8.11.) If changes are required to the Operations Specifications issued to the air carrier or applicant, the PI or CPM follows the procedures in Title 14 of the Code of Federal Regulations (14 CFR) part 119.

C.     Is Enforcement Action Required? (See flowchart process step 8.12.) The PI or CPM determines if a potential violation of an FAA regulation is involved that may lead to an enforcement action. Enforcement action is required if an air carrier is, or has been, conducting operations contrary to applicable FAA regulations. The PI follows the procedures outlined in FAA Order 2150.3 if enforcement action is required.

D.    Document Need for System Reconfiguration. (See flowchart process step 8.13.) When the air carrier’s or applicant’s system performance is not affirmed due to deficiencies in the system design, the air carrier may be required to modify their system or the FAA may modify their authorizations. Either way system reconfiguration will be involved and the PI or CPM documents the nature of the required change.

10-322 Completion of Performance Assessment. Once the PI or CPM determines and documents the action he or she intends to implement, the performance assessment is complete. If this occurs within 30 days of the end of the quarter in which the performance assessment is due, the assessment is considered to have been completed by the due date. The PI or CPM notifies the air carrier or applicant of the results of the performance assessment verbally or in a performance assessment closeout letter. The PI may also use this letter to notify the air carrier of ongoing performance assessment results that occur in the normal planning cycle.
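The 30‑day completion rule above can be illustrated with a small date calculation. This is a hypothetical sketch (the function names are assumptions, and quarter boundaries are taken to be ordinary calendar quarters):

```python
from datetime import date, timedelta

# Illustrative check of the 30-day completion rule in paragraph 10-322;
# quarter boundaries here are ordinary calendar quarters (an assumption).
def quarter_end(d: date) -> date:
    """Return the last day of the calendar quarter containing d."""
    last_month = ((d.month - 1) // 3 + 1) * 3        # 3, 6, 9, or 12
    if last_month == 12:
        return date(d.year, 12, 31)
    return date(d.year, last_month + 1, 1) - timedelta(days=1)

def completed_by_due_date(due: date, completed: date) -> bool:
    """True if completion falls within 30 days of the end of the due quarter."""
    return completed <= quarter_end(due) + timedelta(days=30)
```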

RESERVED. Paragraphs 10-323 through 10-337.


Volume 10 Air Transportation Oversight System

CHAPTER 3 Risk Management Process

Section 1  Risk Management Process

Figure 10-56, Risk Management Process

Risk Management Process flowchart

10-338 Introduction. The risk management process (RMP) allows Certificate Management Teams (CMT) to effectively oversee the air carrier’s management of identified hazards and the associated risks.

A.     The CMT begins the RMP when hazards are identified during design assessment, performance assessment, or from other sources that require the initiation of the RMP.

B.     With the RMP, the CMT can document and track hazards, and ensure that the air carrier identifies, eliminates, and controls hazards and manages the associated risk.

10-339 When to Use the Risk Management Process. Principal inspectors (PI) use available data to identify hazards that may be present in the air carrier’s operating environment and systems. A CMT may use the RMP to address any hazard that the PI or certification project manager (CPM) decides is significant enough to justify intensive analysis and tracking. The PI may designate another CMT member to develop the risk management action plan. The following should be considered when deciding if an RMP is appropriate:

A.     Systemic hazards rather than isolated incidents.

B.     The output of the tools in Order 8900.1, volume 6, chapter 2, section 18, Evaluation of Air Carrier’s Management of Significant Changes.

C.     Local, regional, or national considerations.

D.    Timeliness of required actions.

E.     Any other unique factors.

10-340 Identify Hazards. (See flowchart process step 1.1.) A hazard is a condition, event, or circumstance that could lead or contribute to an unplanned or undesired event. The PI or designee identifies any hazard in the air carrier’s operating environment or systems. Data are analyzed from many sources to determine if hazards are isolated incidents or systemic problems. The Operations Research Analyst (ORA) continually monitors available data sources to identify events, trends, or patterns that indicate potential safety issues and reports them to the PI. The ORA also reviews issues that are already being tracked using an RMP to avoid duplication and identify any issues that might be related. Systemic hazards and their potential consequences are analyzed and assessed to determine the level of risk associated with the hazard. Without conducting a complete analysis, the PI may notify the air carrier of any isolated incidents that do not require a complete RMP. If an isolated incident leads to noncompliance, then national guidance must be followed for processing any enforcement action.

A.     Name and Describe the Identified Hazard. All members of the CMT should be alert for potential hazards and notify the PI if any are discovered. Once the hazard is identified, the PI or designee prepares a summary that describes the identified hazard, and includes relevant facts such as who, what, why, how often, and where.

B.     Determine and Document Potential Consequences. The PI or designee determines and documents the potential consequences that could result if the hazard is not addressed or corrected. These consequences could be any one of the following:

·        Equipment failure,

·        Human error,

·        Damage to equipment,

·        Procedural nonconformance,

·        Process breakdown,

·        Personal injury or death,

·        Regulatory noncompliance,

·        Decreased quality or efficiency, or

·        Other.

10-341 Analyze and Assess Risk. (See flowchart process step 1.2.) The next step in the RMP is risk analysis. The PI analyzes hazards to identify factors that affect the severity of the potential consequences and their likelihood of occurring. Identifying risk factors assists in risk analysis and provides specific targets for action plans. Risk factors identify what must later be mitigated to reduce the overall level of risk. An effective action plan should address risk factors by eliminating them or by reducing their impact.

A.     Risk Factors Are Known. The PI or designee determines whether there are known risk factors associated with the severity of the consequences and the likelihood of their occurrence. Risk factors are typically situational factors (e.g., operating conditions that promote corrosion, aging aircraft, or high‑cycle use of aircraft) or deficiencies in design or performance related to safety attributes (e.g., missing attributes or failure to adhere to procedures).

B.     Perform Further Analysis. When risk factors are unknown, the PI or designee must suspend the RMP and conduct additional research on the risk factors before assessing the risk. The PI or designee may use Constructed Dynamic Observation Reports (ConDOR), System Architecture and Interfaces (SAI), Element Performance Inspections (EPI), and the System Analysis Team (SAT) to obtain more information about the factors affecting the level of risk.

C.     Identify and Document Risk Factors. After identifying the risk factors associated with a hazard, the PI or designee provides a description of each risk factor selected. Identifying the risk factors facilitates the risk assessment process and provides a specific direction for subsequent action plans.

D.    Determine the Severity Value. The PI or designee determines the appropriate value related to the severity of the potential consequences, should they occur. Severity is assessed using the standard risk matrix status of high, medium, or low. Severity assessments are produced using a combination of available data and expert judgment. Severity is defined using the following scale:

1)      High—Potential loss (or breakdown) of an entire system or subsystem; an accident or incident.
2)      Medium—Potential moderate damage to an aircraft, partial breakdown of an air carrier system, or violation of regulations or company rules.
3)      Low—Potential poor air carrier performance or disruption to the air carrier.

E.     Determine the Likelihood Value. The PI or designee determines the appropriate value related to the likelihood of the consequences actually occurring. Likelihood is assessed using a combination of available data and expert judgment. Standard likelihood values are defined as follows:

·        Frequent—Continuously experienced,

·        Probable—Occurs often,

·        Occasional—Occurs several times, and

·        Remote—Unlikely, but could occur.

F.      Overall Risk Assessment Value. The PI considers the overall level of risk to determine the priority in ensuring that the air carrier addresses the hazard and its associated level of risk. This assessment assists the PI or designee in decisionmaking, action planning, and evaluating air carrier actions. The PI uses the information from the risk analysis to determine the overall level of risk using the following matrix:

Table 10‑5, Risk Matrix
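Because the body of Table 10‑5 is not reproduced in this text, the cell assignments in the sketch below are hypothetical, chosen only to be consistent with the red, yellow, and blue areas described later in this section; the severity and likelihood values come from paragraphs D and E above.

```python
# Hypothetical risk matrix lookup. The cell assignments below are an
# illustration only, NOT the official Table 10-5: red = unacceptable,
# yellow = acceptable with mitigation, blue = acceptable.
RISK_MATRIX = {
    ("High",   "Frequent"):   "Red",
    ("High",   "Probable"):   "Red",
    ("High",   "Occasional"): "Red",
    ("High",   "Remote"):     "Yellow",
    ("Medium", "Frequent"):   "Red",
    ("Medium", "Probable"):   "Yellow",
    ("Medium", "Occasional"): "Yellow",
    ("Medium", "Remote"):     "Blue",
    ("Low",    "Frequent"):   "Yellow",
    ("Low",    "Probable"):   "Yellow",
    ("Low",    "Occasional"): "Blue",
    ("Low",    "Remote"):     "Blue",
}

def overall_risk(severity: str, likelihood: str) -> str:
    """Combine a severity value and a likelihood value into an overall risk level."""
    return RISK_MATRIX[(severity, likelihood)]
```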

10-342 Perform Decisionmaking. (See flowchart process step 1.3.) Based on the results of the risk assessment, the PI or designee determines the most appropriate course of action to address the hazard and associated level of risk. The PI or designee must decide (1) if action should be taken to ensure the air carrier eliminates the hazard and/or reduces the level of risk; (2) if the certificate holder should be monitored; and (3) if the responsibility for eliminating the hazard or reducing the level of risk should be transferred to another Flight Standards or Federal Aviation Administration (FAA) organization.

A.     Determine if Action is Within the Scope of the CMT’s Authority. The PI or designee should determine if ensuring that the air carrier eliminates the hazard or reduces the level of risk is within the scope of the CMT’s authority.

B.     Document that Action is Outside the CMT’s Authority. The PI or designee documents if the responsibility for ensuring that the air carrier eliminates the hazard or reduces the level of risk is not within the scope of the CMT’s authority.

C.     Close the RMP and Transfer the Risk to the Appropriate FAA Organization. When corrective action is beyond the CMT’s authority, the PI can allocate the authority, responsibility, and accountability for taking corrective action for the identified hazard to the appropriate FAA organization. This approach is used to address risks that may require actions such as rule changes, new or revised airworthiness directives, policy changes, and FAA safety recommendations. The information package sent to the receiving organization should include the risk management action plan. Once the responsibility is transferred, the PI or designee closes the RMP. The PI or designee might decide to follow up on the status of the issue that was transferred.

D.    Acceptable Levels of Risk. Where the overall level of risk falls into the blue area of the risk matrix, it may be accepted without further action. If the PI or designee determines that the risk level is within normally acceptable limits, no additional design or performance assessments are required beyond the normal oversight planning.

E.     Document Rationale for Acceptable Levels of Risk. The PI or designee documents the rationale for determining that the risk level is within acceptable levels.

F.      Close the RMP and Monitor Through Normal Oversight Planning. The PI closes the RMP and monitors the hazard through the Air Transportation Oversight System (ATOS).

G.    Document Mitigation Rationale for Unacceptable Risk Levels. If the overall level of risk is found to be unacceptable, the PI or designee documents the mitigation rationale.

1)      Where combinations of severity and likelihood cause the overall level of risk to fall into the red area, the risk would be assessed as unacceptable, and further work would be required to eliminate the associated hazard or control the factors that lead to higher risk likelihood or severity.
2)      Where the risk assessment falls into the yellow area, the risk may be accepted under defined conditions of mitigation. An example of this situation would be an assessment of the impact of an inoperative aircraft component that is deferred in accordance with a minimum equipment list (MEL). Defining an operational or maintenance procedure in the MEL would constitute a mitigating action that could make an otherwise unacceptable risk acceptable, as long as the defined procedure was implemented. These situations may also require continued special emphasis in the safety assurance function.

10-343 Implement the Decision. (See flowchart process step 1.4.) The PI or designee implements mitigation strategies to ensure that the air carrier addresses the identified hazard and unacceptable levels of risk. The air carrier, with CMT oversight, usually carries out mitigation. Sometimes the CMT may use mitigation strategies that do not involve the participation of the air carrier (e.g., reevaluating air carrier program approvals, authorizations, deviations, and exemptions; amending or revoking the air carrier’s authority to conduct all or part of an operation; or initiating an enforcement action). The PI or designee must identify the necessary actions to effectively oversee the air carrier’s mitigation of the hazard and associated levels of risk.

A.     Develop and Document Action Items. Action items describe what, how, where, and when an action should be done. The PI or designee develops and documents action items that address the risk factors. Mitigation strategies may include:

1)      Reevaluating the air carrier’s programs, approvals, authorizations, deviations, and exemptions.
2)      Amending or revoking the air carrier’s authority to conduct all or part of its operation.
3)      Initiating an enforcement investigation.
4)      Suspending the certification process.
5)      Convening the SAT.

B.     Select Personnel Resources. The PI selects individuals to perform action items through coordination with the individual’s frontline manager.

C.     Work Assignments Are Approved. The frontline manager determines whether to approve the work assignment. If the frontline manager does not approve the work assignment, the PI or designee selects another individual to perform the work assignment. If the manager approves the work assignment, the work may begin on those action items.

D.    Implement the Action Plan by Completing All Action Items. The assigned inspector completes their assigned action items. Throughout the course of the RMP, the PI or designee monitors the progress of the action items. The PI or designee ensures that either (1) all action items are complete, or (2) the current data indicates that the action plan has eliminated the hazard or reduced the associated risk to acceptable levels.

10-344 Validate the Effectiveness of the Decision. (See flowchart process step 1.5.) After all action items are complete, or data indicates that the action plan has eliminated the hazard or reduced the associated risk to acceptable levels, the PI or designee validates the effectiveness of the selected approach. The PI or designee reviews the status of the hazard and verifies that the air carrier has eliminated the hazard or mitigated the level of risk associated with the hazard to an acceptable level. After evaluating the results of the mitigation strategies, the PI or designee decides whether to close the RMP or to require the development and implementation of additional action items.

A.     Risk Factors Identified and Addressed. If all of the risk factors were not initially identified, the PI or designee returns to Analyze and Assess Risk (paragraph 10-341) and adds them to the list of risk factors. The PI or designee determines if the action items have addressed each risk factor to the extent possible and describes any changes that have occurred to the risk factors because of the action taken. If any of the identified risk factors are still present and contributing to an unacceptable level of risk, the PI or designee repeats Implement the Decision (paragraph 10-343) to add action items as necessary.

B.     Update Risk Assessment. After determining that all risk factors have been addressed to the extent possible, the PI or designee reviews the hazard and its consequence descriptions, and determines if the severity and likelihood values can be revised based on the completed action plan. The PI uses this risk analysis information and the risk matrix to determine if the overall level of risk is affected.

C.     Level of Risk Acceptable. The PI or designee determines whether the risk level is within normal limits. If so, no additional action is required beyond the normal oversight planning. If the level of risk is unacceptable, the PI or designee returns to Implement the Decision to add additional action items, as necessary.

D.    Close the RMP. After determining that the risk level is acceptable, the PI or designee closes the RMP and monitors the hazard through design assessment and performance assessment. If the level of risk is acceptable based on mitigation, then the air carrier and FAA must continually monitor the mitigating strategy (risk control). This helps ensure that the action plan to control the risk continues to be effective for as long as the hazard and associated risk factors exist.
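The close-out decision in paragraphs 10-343 and 10-344 can be outlined as a short sketch. This is a hypothetical illustration only; the disposition strings and risk-level names (taken from the risk matrix colors) are assumptions for the example.

```python
# Hypothetical sketch of the RMP close-out logic; the real decision
# rests on PI or designee judgment. Risk levels use the matrix colors.
def rmp_disposition(residual_risk: str, mitigation_in_place: bool) -> str:
    """Disposition of the RMP after all action items are complete."""
    if residual_risk == "Blue":
        return "Close RMP; monitor through design and performance assessment"
    if residual_risk == "Yellow" and mitigation_in_place:
        # Acceptable only under defined mitigation; the risk control must be
        # continually monitored for as long as the hazard exists.
        return "Close RMP; continually monitor the mitigating strategy"
    return "Return to Implement the Decision; add action items"
```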

RESERVED. Paragraphs 10-345 through 10‑359.


Volume 10 Air Transportation Oversight System

CHAPTER 4 Air Carrier Evaluation Process

Section 1  The Air Carrier Evaluation Process

Figure 10-57, Air Carrier Evaluation Process

Air Carrier Evaluation Process flowchart

10-360 Introduction. The Air Carrier Evaluation Process (ACEP) provides the Flight Standards Service (AFS) with standard policies and procedures to evaluate Title 14 of the Code of Federal Regulations (14 CFR) part 121 air carriers at the national, regional, and district office or certificate management office levels. Evaluations are an extension of Air Transportation Oversight System (ATOS) design and performance assessments, and by invitation, could include members from the air carrier. ACEP allows for an in‑depth look at one or more air carrier systems and has four primary goals:

A.     Verify that the air carrier complies with applicable regulations.

B.     Promote a positive safety culture by reinforcing how system safety principles and concepts directly apply to air carrier oversight.

C.     Identify hazards and mitigate associated risks.

D.    Identify program strengths (e.g., potential best practices that other air carriers could emulate).

Note:   Any identified program strength(s) should be documented in the comment box of the applicable question of the Safety Attribute Inspection (SAI), Element Performance Inspection (EPI), or Constructed Dynamic Observation Report (ConDOR) per the Automation User Guide (AUG). Please note that an air carrier’s best practice(s) may be proprietary information that cannot be shared without the permission of the air carrier.

10-361 Determine the Need for Evaluation. (See flowchart process step 1.1.) The following conditions may indicate the need to conduct an air carrier evaluation.

·        The results of design or performance assessments,

·        Substantial change in air carrier management,

·        Substantial turnover in personnel or reduction in force,

·        Labor dispute,

·        Rapid expansion or growth,

·        Merger, takeover, or change in ownership,

·        Enforcement actions,

·        Noncompliant attitude,

·        Accidents/incidents/occurrences,

·        Department of Defense reviews,

·        Department of Transportation/Office of the Secretary of Transportation economic authority/insurance requirements,

·        Change in fleet type,

·        Substantial change in outsourcing,

·        Financial distress,

·        Substantial passenger or employee complaints, and

·        Hotline complaints.

10-362 Identify and Select Air Carriers for Evaluations. (See flowchart process step 1.2.) There are several ways that an air carrier might be selected for evaluation. These include:

A.     National Level. The Director of the Flight Standards Service (AFS‑1) may identify and select a part 121 air carrier for an evaluation using this process. AFS managers may identify and request an evaluation of an air carrier through their chain of command. AFS‑1 notifies the appropriate regional division manager and the certificate‑holding district office (CHDO) manager of the air carrier selected for an evaluation.

B.     Regional Level. Under this process, the AFS regional division manager responsible for the air carrier’s oversight may identify and select a part 121 air carrier for evaluation. The AFS regional division manager notifies the appropriate CHDO manager of the air carrier selected for an evaluation. AFS managers within a region may request a region‑level evaluation.

C.     Office Level. Under this process, a CHDO manager responsible for the air carrier’s part 121 oversight may identify and select the air carrier for an evaluation.

10-363 Determine the Type of Evaluation. (See flowchart process step 1.3.) ACEP includes four types of evaluations:

A.     Focused. This is an evaluation of some of the air carrier elements by completing design and/or performance assessments using the appropriate ATOS Data Collection Tools (DCT). This evaluation is performed during a specified time period.

B.     Comprehensive. This is an evaluation of all applicable air carrier elements by completing design assessments and/or performance assessments using the appropriate ATOS DCTs. This evaluation is performed during a specified time period.

C.     Program Review. A program review is an evaluation that focuses on one or more of the air carrier elements and is conducted on several, if not all, part 121 air carriers within all of AFS, a region, or a CHDO’s area of responsibility. Reports that use data from a program review must be deidentified (by not including the tracking number, the air carrier’s designator, or personnel names) and analyzed to determine problems, issues, concerns, trends, and program strengths.

Note:   Program review reports can be generated by an Operations Research Analyst (ORA) upon request of the appropriate manager.

D.    System Process Audit. A system process audit is an evaluation that focuses on validating the effectiveness of ATOS as the Federal Aviation Administration’s (FAA) oversight system. Reports that use data from a system process audit must be deidentified (by not including the tracking number, the air carrier’s designator, or personnel names) and analyzed to determine problems, issues, concerns, trends, and program strengths.

Note:   System process audit reports can be generated by an ORA upon request of the appropriate manager.

10-364 Decide on Evaluation Team Composition. (See flowchart process step 1.4.) The level and type of evaluation, along with the complexity of the air carrier, determine the composition of the evaluation team. The team must include:

A.     Evaluation Team Leader. AFS‑1, the regional AFS division manager, or the CHDO manager, depending on the level of evaluation, designates a team leader. The team leader is responsible for ensuring that the evaluation is conducted in accordance with ACEP and guides the evaluation team members on a daily basis during the evaluation period. The team leader is also responsible for ensuring that team members receive air carrier‑specific training applicable to the scope of the evaluation.

B.     Evaluation Team Members. Evaluation teams may consist of local CHDO personnel or personnel from another CHDO, regional or national specialists, and air carrier personnel.

1)      FAA Team Members. These members must be selected based on individual qualifications and experience to ensure that a quality evaluation is accomplished. Team members are selected in accordance with current FAA guidance, including the Collective Bargaining Agreement, as amended.
2)      Principal Inspectors. Principal inspectors (PI) responsible for the certificate management of the air carrier being evaluated must be part of the evaluation team. The PIs must not be assigned any air carrier elements, but they may assist team members in evaluating elements. PIs must be available to evaluation team members to provide clarification pertaining to such items as program approvals, authorizations, and exemptions that apply to their assigned air carrier. PIs must help resolve issues that are identified during the evaluation.
3)      Air Carrier Team Members. Evaluations conducted in partnership with the air carrier must include air carrier personnel as active participants of the evaluation team. When participating in the evaluation, air carrier personnel collaborate in determining and resolving element evaluation issues. The FAA, however, cannot delegate its responsibilities and final decisionmaking with regard to issues involving compliance with FAA statutes, regulations, and orders.

10-365 Complete Evaluation Agreement. (See flowchart process step 1.5.) AFS‑1, the regional AFS division manager, or the CHDO manager, depending on the level of evaluation, must develop an evaluation agreement. The agreement will document the:

Note:   An evaluation agreement template is included at the end of this section (see Figure 10‑58).

A.     Level and type of evaluation.

B.     Air carrier elements to be evaluated.

C.     DCTs to be used in the evaluation (SAI, EPI, and/or ConDOR).

D.    Evaluation period, including tentative start and completion dates.

E.     Designated team leader.

F.      Composition of evaluation team.

G.    Participation of air carrier personnel.

10-366 Notify Air Carrier of the Evaluation. (See flowchart process step 1.6.) The air carrier should be informed of the planned evaluation and, at the discretion of the FAA, be given the opportunity to participate. The initial notification can be verbal, but should be followed up in writing. The written notification should be sent at least two weeks prior to the evaluation by the appropriate FAA manager and should contain the same information as the evaluation agreement.

Note:   A notification letter template is included at the end of this section (see Figure 10‑59).

10-367 Partnership With the Air Carrier. The air carrier may participate in the evaluation in partnership with the FAA.

A.     When an air carrier participates in the evaluation, the team leader will ensure that all key management officials (as defined in 14 CFR § 119.65) of the air carrier receive a briefing on ACEP and the provisions of Advisory Circular (AC) 00‑58, Voluntary Disclosure Reporting Program, current edition, prior to beginning the evaluation.

Note:   A PowerPoint briefing resides on the News & Documentation page of the ATOS V1.2 automation.

B.     The air carrier’s management must understand the requirement to use the appropriate SAI to assess and document the comprehensive fix of any identified air carrier element deficiency that involves an apparent violation of FAA regulations.

10-368 Voluntary Disclosure. When an air carrier is an active participant of the evaluation team, any apparent violation of FAA regulations discovered during the specified evaluation period (as specified in the evaluation agreement) and subsequent enforcement action is governed by the provisions of AC 00‑58.

10-369 Evaluations Not Conducted in Partnership With the Air Carrier. If an air carrier elects not to collaborate with the FAA in the evaluation, or when the FAA decides not to include air carrier personnel as team members, the provisions and protections contained in AC 00‑58 do not apply to apparent violations of FAA regulations discovered during the specified evaluation period.

10-370 Revise the Comprehensive Assessment Plan. (See flowchart process step 1.7.) The PI(s) for the air carrier revises the Comprehensive Assessment Plan (CAP) in accordance with ATOS planning policies and procedures to include the elements targeted for evaluation. The PI(s) documents the data collection requirements.

Note:   Update the Air Carrier Assessment Tool (ACAT) with the appropriate evaluation element(s) per the Automation User Guide (AUG), Updating an Existing ACAT. Comments should be included to indicate that these element(s) are part of an Air Carrier Evaluation.

Note:   Update the CAP per the AUG, Changing the Schedule of a Particular Performance Assessment or Design Assessment. Move the evaluation element(s) into the quarter in which the evaluation is to be conducted. Comments should be included to indicate that these element(s) are part of an Air Carrier Evaluation.

A.     When it is not necessary to complete an entire SAI or EPI, or if those tools do not focus on the specific issues, the PI(s) may be asked to create a ConDOR.

Note:   Refer to AUG, Creating a ConDOR.

B.     The evaluation may require the use of other specialized job aids, tools, documents, or guidance.

10-371 Assign Resources to Complete the Evaluation. (See flowchart process step 1.8.) Frontline managers assign team members to data collection activities to support the evaluation.

Note:   The Data Evaluation Program Manager (DEPM)/Data Reviewer/Frontline Manager (FLM) adds evaluation team members to the CMT roster in accordance with the AUG, Adding an Existing ATOS User to Carrier Roster.

10-372 Collecting Evaluation Data. (See flowchart process step 1.9.) Team members perform activities to collect data in accordance with the evaluation agreement, CAP, and evaluation team leader instructions.

A.     Coordinate Team and Establish Communication Methods. The team leader decides how the team communicates. Coordination and communication are especially important if members are spread among different locations.

B.     Team Meeting. The team leader organizes a team meeting after reviewing the ATOS automation instructions. This meeting can be in person, over the phone, or by other means. During this meeting team members are briefed on ACEP and the contents of the evaluation agreement.

C.     Distribute and Schedule Tasks. The tasks may be distributed by element, safety attribute, individual questions, or some combination to one or more team members to allow the timely collection of accurate data.

10-373 Report Evaluation Data. (See flowchart process step 1.10.) After collecting evaluation data, each FAA team member submits their responses into the ATOS database in accordance with the data quality guidelines.

A.     Evaluation teams enter data collected by air carrier personnel and indicate the source in the ATOS database.

B.     Communication between team members is essential, but sharing answers is neither necessary nor desirable because it could lead to duplicate data.

10-374 Reviewing Evaluation Data (Team Leader). (See flowchart process step 1.11.)

A.     The evaluation team leader is the data reviewer. The data reviewer determines whether the data meet the data quality guidelines. The PI should be notified immediately if any critical or time‑sensitive information is found during the data review.

Note:   The DEPM/Data Reviewer/FLM must designate the TL as the data reviewer for the data collected for the evaluation element(s) per the AUG, Editing a CMT Member's Job Function.

B.     The data review process ensures that quality data are available for decision-making. Automation provides initial validation to ensure that the data fields contain air carrier‑specific data. The data reviewer provides secondary validation and reviews activity records submitted by the evaluation team.

10-375 Analyze and Assess Evaluation Data. (See flowchart process step 1.12.) After all data has been collected, reported, and reviewed, the evaluation team completes the evaluation using the ATOS analysis and assessment procedures and tools (ATOS Modules 7 and 8).

A.     The objective of the analysis and assessment process is to determine whether the air carrier’s system is designed, and performs, as intended by the regulations and in such a way that it controls the conditions that led to the decision to conduct an evaluation.

B.     The evaluation team leader and the operations research analyst (ORA) analyze the data by element to determine if the air carrier’s system design or performance meets the standards for acceptance or approval. Data may be collected from other sources to assist the team leader in making a bottom‑line assessment.

C.     The evaluation team leader documents the bottom‑line assessment, including the rationale for the decision, and notes any issues or concerns.

Note:   The DEPM/data reviewer/FLM can give additional automation functionality to the TL to add and update the Assessment Determination and Implementation Tool (ADI) per the AUG, Adding Additional Functionality to a CMT Member.

10-376 Finalize Evaluation Report. (See flowchart process step 1.13.) Evaluation team leaders will generate an evaluation report when requested by AFS‑1, the regional AFS division manager, or the CHDO manager.

Note:   Upon request by the appropriate manager, the evaluation team leader must have the assigned ORA generate a report of the data collected during the evaluation of the element(s).

10-377 Determine and Implement Followup Actions. (See flowchart process step 1.14.) PIs are responsible for determining and implementing followup actions in response to the design and performance assessments completed during the evaluation. PIs should consider whether they need to initiate enforcement actions; reevaluate air carrier approvals, authorizations, deviations, or exemptions; recommend an FAA policy or regulation change; recommend the issuance of an airworthiness directive; or schedule a followup evaluation. The PI documents one or more appropriate courses of action.

A.     The PI need not take any further action if the air carrier’s system design is acceptable or approvable, or if the performance of the air carrier’s system is affirmed.

B.     The design of the air carrier’s system may meet the requirements for acceptance or approval, and/or performance of the system may be affirmed, but the evaluation may have identified some weakness that concerns the evaluation team. In those cases, the PI should document additional data collection, monitoring, or other action.

C.     The PI must take additional action when the evaluation team determines that air carrier system design does not meet the requirements for approval or acceptance, or that performance was not affirmed.

1)      Risk Management Process. The PI determines whether it is necessary to initiate the Risk Management Process (RMP) to address a systemic hazard. The RMP may be used to address any hazard that the PI decides can justify more extensive analysis and tracking. Factors that may influence the decision to use the RMP are:

·        Hazards that can justify more extensive analysis and tracking;

·        The need for a formal action plan;

·        Participation of air carrier personnel;

·        Timeliness of required actions;

·        Regional or national significance; and

·        The output of tools such as the decision aid used to evaluate air carrier changes.

Note:   See AUG, entitled Risk Management.

2)      System Analysis Team. If a systemic hazard is identified, the PI determines whether the System Analysis Team (SAT) would be beneficial. The SAT is formed at the PI’s discretion when further analysis is required to determine the cause of a systemic problem. The SAT can include participants from the CMT, other FAA personnel, airline and/or manufacturer representatives, and other industry personnel to perform further analysis to determine root cause. The SAT can be appropriate when the PI chooses to work collaboratively with the carrier or industry in identifying system deficiencies.
3)      Enforcement Action. Enforcement action is required if an air carrier is, or has been, conducting operations contrary to applicable FAA regulations. If enforcement action is required, the PI follows the procedures outlined in FAA Order 2150.3, Compliance and Enforcement Program, and documents that action with explanation on the ADI tool.
4)      System Reconfiguration. The air carrier may be required to modify its system, or the FAA may modify its authorizations. If changes are required to the operations specifications issued to the air carrier or applicant, the PI or certification project manager follows the procedures of part 119. If changes are made to the air carrier’s system configuration, the air carrier may need to submit a request for a new or updated scope of operation. The action and explanation are documented in the ADI tool.

10-378 Brief the Air Carrier and Other Stakeholders. (See flowchart process step 1.15.) The PI briefs the air carrier on the results of the evaluation.


Figure 10-58, Evaluation Agreement

 

Evaluation Agreement

Note:   This document is a generic sample of a written agreement between the certificate-holding district office (CHDO)/certificate management office (CMO) and the regional staff and/or ATOS CMO addressing the process and procedures to be used during the performance of an Air Carrier Evaluation. This document must be prepared jointly by the CHDO/CMO and the regional staff or ATOS CMO and must be modified to fit the specific needs of the evaluation.

There are four types of air carrier evaluations for ATOS air carriers: Focused, Comprehensive, Program Review, and System Process Audit.

A.     Focused. This evaluation of some air carrier elements is executed by completing design and/or performance assessments using the appropriate ATOS Data Collection Tools (DCT). This evaluation is performed during a specified time period.

B.     Comprehensive. This evaluation of all applicable air carrier elements is executed by completing design assessments and/or performance assessments using the appropriate ATOS DCTs. This evaluation is performed during a specified time period.

C.     Program Review. A program review is an evaluation that focuses on one or more air carrier elements and is conducted upon several, if not all, part 121 air carriers within all of AFS, one particular region, or a CHDO area of responsibility. Reports that use data from a program review must be de-identified (by not including the tracking number, the air carrier’s designator, or personnel names) and analyzed to determine problems, issues, concerns, trends, and program strengths.

D.    System Process Audit. A system process audit is an evaluation that focuses on validating the effectiveness of ATOS as the Federal Aviation Administration’s (FAA) oversight system. Reports that use data from a system process audit must be de-identified (by not including the tracking number, the air carrier’s designator, or personnel names) and analyzed to determine problems, issues, concerns, trends, and program strengths.

In accordance with FAA Order 8900.1, Volume 10, Chapter 4, Section 1, The Air Carrier Evaluation Process, and under the authority of Title 49 of the United States Code (49 U.S.C.) section 44709, as amended, and Title 14 of the Code of Federal Regulations (14 CFR) part 119, § 119.59, a (__type__) Evaluation (or Program Review) of (__name__) Airlines will be conducted by the (__name__) CHDO/CMO and ATOS CMO and/or Regional personnel.

(__name__) Airlines is the holder of Air Carrier Certificate (__number__) and is currently conducting (__type__) operations. The company’s main base of operation is located at (__where__).  The company’s fleet of aircraft consists of (__number and type of aircraft__).

The (Evaluation or Program Review) will be conducted at the (__national, regional, local__) level.

The (Evaluation or Program Review) will commence on (__date__) at the (__location__) with an Air Carrier Evaluation Program briefing, an introduction of FAA personnel, the company’s personnel, a tour of the company’s facilities, and an overview of their operations.

The (Evaluation or Program Review) Objective: The objective of the (Evaluation or Program Review) is to:

·        Verify that the air carrier complies with applicable regulations.

·        Promote a positive safety culture by reinforcing how system safety principles and concepts directly apply to air carrier oversight.

·        Identify hazards and mitigate associated risks.

·        Identify program strengths (e.g., potential best practices that other air carriers could emulate).

This (Evaluation or Program Review) has no set completion date, but will end when the objectives of the evaluation have been met.

For the purpose of the Air Carrier Evaluation Program, the provisions of Advisory Circular 00-58, Voluntary Disclosure Reporting Program, paragraph 7(C) may be used.

1)      At a minimum, the CHDO/CMO must provide, in addition to the Certificate Management Team (CMT), (__number__) Operations Inspectors, (__number__) Maintenance Inspectors, (__number__) Avionics Inspectors, (__number__) Dispatcher Inspectors (if applicable), a Cabin Safety Inspector (if applicable), a Data Evaluation Program Manager (DEPM), and administrative support if necessary.
2)      The appropriate office must designate an Evaluation Team Leader (______Name_______) as follows:

 

·        An Evaluation Team member from the CHDO/CMO shall be designated as the Evaluation team leader if the evaluation is conducted at the local level.

·        An Evaluation team member from the Region shall be designated as the Evaluation team leader if the evaluation is conducted at the regional level.

·        An Evaluation team member from the ATOS CMO shall be designated as the Evaluation team leader if the evaluation is conducted at the national level.

3)      If requested, the ATOS CMO shall provide (__number__) Operations Inspectors, (__number__) Maintenance Inspectors, (__number__) Avionics Inspectors, (__number__) Dispatcher Inspectors (if applicable), and a Cabin Safety Inspector (if applicable).
4)      If requested, the (_________) Region shall provide (__number__) Operations Inspectors, (__number__) Maintenance Inspectors, (__number__) Avionics Inspectors, (__number__) Dispatcher Inspectors (if applicable), and a Cabin Safety Inspector (if applicable).
5)      Any specialized personnel who will be participating in the evaluation should be listed in this paragraph (________________).
6)      If the evaluation is conducted in partnership with the air carrier, personnel from the air carrier will participate in the evaluation as active participants and working members of the evaluation team.
7)      The personnel assigned by CHDO/CMO, the ATOS CMO, and/or the Region, and the air carrier personnel, plus any specialized personnel, shall constitute the Evaluation Team.
8)      The (__type__) Air Carrier (Evaluation or Program Review) will consist of a review of the air carrier elements listed below:

Elements

2.1.1 Manual Currency

2.1.2 Manual Content Consistency

Evaluation Team Roles and Responsibilities.

The Evaluation Team Members, except for the principal inspectors (PIs) assigned to the air carrier, shall not discuss any comprehensive fix or corrective action plans with the operator during the evaluation.
9)      Any significant safety concerns discovered during the evaluation shall be immediately brought to the attention of the PIs, and the Evaluation Team Leader. Upon concurrence that there is an actual safety concern, the appropriate PI shall notify the air carrier.
10)  Any safety issue that involves the immediate safety of passengers or crew shall be addressed immediately by the discovering inspector.
11)  The evaluation team member who discovers a compliance issue requiring enforcement action will complete Section B, obtain all necessary items of proof, and provide this information to the appropriate PI.
12)  Any written communication between the evaluation team and the air carrier will be from the assigned Principals. During the evaluation, the Evaluation Team Leader shall be made aware of any written communication pertinent to the Evaluation, prior to it being sent.

________________________                                    _________________________

Evaluation Team Leader (ETL)                         CHDO / CMO Manager

_________________________                                  _________________________

Concurrence Date                                                        Concurrence Date                   

Figure 10-59, Notification Letter

Notification Letter

This document is a generic sample of a notification letter between the CHDO/CMO and the air carrier.

 

(Date)

CERTIFIED-RETURN RECEIPT

(Name)

(Title)

(Address)

(Address)

(Address)

Dear Mr. (Name):

The (Name) (Flight Standards District Office or Certificate Management Office), in coordination with the (Insert appropriate FAA offices), will be conducting an (Evaluation or Program Review) of (Name of airline), operating under part 121 (Flag, Domestic, Supplemental) Air Carrier Certificate No. (Insert number).

The objectives of the (Evaluation or Program Review) are to:

·        Verify that the air carrier complies with applicable regulations.

·        Promote a positive safety culture by reinforcing how system safety principles and concepts directly apply to air carrier oversight.

·        Identify hazards and mitigate associated risks.

·        Identify program strengths (e.g., potential best practices that other air carriers could emulate).

This (Evaluation or Program Review) will be conducted under the inspection authority of 49 U.S.C. section 44709. As we agreed on (Date), this evaluation (will or will not) be conducted in partnership. Any air carrier employee assigned to an air carrier element must possess a thorough knowledge and understanding of the company’s policies and procedures pertaining to that element.

The (Evaluation or Program Review) is scheduled to begin on (Date), at (Name of airline) corporate headquarters in (Location).  An integral part of the (Evaluation or Program Review) process is a briefing for your company’s personnel who will be participating in the evaluation. Personnel in the management positions listed in § 119.65 are expected to attend. We respectfully request that you make the necessary arrangements so that all these participants may attend. Please call our office so that we can schedule this session for a mutually acceptable date, time and location.

The evaluation period is scheduled to last approximately (_____) weeks; however, the team will continue their evaluation until all scheduled activities are completed. The (Evaluation or Program Review) is focused on the following carrier elements:

Element Number           Name of Element

         1.3.11                 Continuing Analysis and Surveillance

         3.1.8                   Carriage of Cargo

         4.2.3                   Training of Flight Crewmembers

(Name) will be assigned as the (Insert appropriate FAA office) team leader during the evaluation. Should you have any additional questions or comments, please feel free to contact me at 123-456-7890.

Sincerely,

(Name)

Manager, (CHDO/CMO Name)

RESERVED. Paragraphs 10‑379 through 10‑392.


Volume 10 Air Transportation Oversight System

CHAPTER 5 Off‑Hour Surveillance Assessment Decision Aid

Section 1  Off-Hour Surveillance Air Carrier Oversight Process

Figure 10‑60, Off-Hour Surveillance Air Carrier Oversight Process

10-393 Introduction.

A.     It is essential to identify and record how much, and what kind of, activity the air carrier performs during off hours. Based on this information, the certificate management office (CMO)/Certificate Management Team (CMT) will evaluate an air carrier’s ability to adequately manage its off-hour activities. The CMO/CMT must take appropriate action to address any identified hazards, including retargeting or adjusting the Comprehensive Assessment Plan (CAP) or taking other actions designed to address a specific, significant risk. This section describes a process that can be used to prepare this evaluation.

B.     The following conditions or events may indicate a need for additional off-hour surveillance. Particularly where multiple indicators, or multiple examples of a single indicator, are observed, inspectors should consider more in-depth inquiries with air carrier management or targeted off-hour surveillance to determine possible impacts on affected programs or air carrier systems. The Off-Hour Surveillance Assessment Decision Aid helps the CMO/CMT evaluate the effectiveness of air carrier activities conducted during off hours.

10-394 Process Participants.

A.     CMO/CMT Principal Inspector (PI) and Certification Project Manager (CPM). The key participants in the Off-Hour Surveillance Assessment Process include the PI/CPM assigned the oversight or initial certification of an air carrier. These people are responsible for deciding how to anticipate or respond to air carrier risks, and for identifying what information is needed to make these decisions. When faced with a potential problem associated with off-hour activities, these participants must decide whether a critical problem exists that must be handled immediately, and whether these problems warrant adjusting the CAP to allow the CMO/CMT to effectively evaluate and manage potential risks, entering the Risk Management Process (RMP), or collecting additional data through the use of additional Design Assessments, Performance Assessments, or ConDORs.

B.     Aviation Safety Inspector(s) (ASI). The ASI will participate through the collection and reporting of data assigned to them through the CAP.

C.     Air Carrier. The air carrier is a participant in this process as the overseen entity, but also as a potential source of information for the evaluation process.

D.    Evaluate Available Off-Hour Information. The PI must review the Off-Hour Surveillance Decision Aid to identify what information is needed to use the tool. PIs are also encouraged to use their experience with the air carrier and other data sources to evaluate the adequacy of information about off-hour activities. If there is insufficient information about the amount and type of activities being conducted off hours by, or for, the carrier, that information must be collected before an informed decision can be made about whether a significant problem exists.

Note:   Inspections that are performed off hours should have the term “OFFHOUR” in the National use block. This will allow a national look at the amount and type of inspections being conducted off hours.

10-395 Recognize and Communicate Concerns. If a significant concern is discovered in the off-hour activities being performed by, or for, the air carrier, it must be communicated to the CHDO manager or PI immediately. Once sufficient information is gathered to make the assessment, proceed with the process.

10-396 Required Actions Based on Analysis of the Off-Hour Decision Aid.

A.     Initiate RMP. A low decision aid score (8-40) reflects an inadequate capability to manage off-hour activities and requires the initiation of an RMP that targets the specific off-hour hazards and creates an action plan to address the related risks. The action plan generated by the RMP will be initiated and closed by the PI.

B.     Retarget Surveillance. A moderate decision aid score (41-56) indicates that the air carrier has only a moderate ability to manage off-hour activities, and assessment plans should be retargeted to closely monitor this condition. Completing an Air Carrier Assessment Tool (ACAT) will aid in developing a surveillance plan that concentrates on the elevated risk areas.

C.     Continue Current Surveillance Program. A high decision aid score (57-80) indicates that the air carrier’s ability to manage off-hour activities is strong and the existing surveillance program should be continued. However, if particular issues of concern exist, they must be addressed.

10-397 Instructions. Rate each of the eight questions below based on the information available and your knowledge of the certificate holder. Once all questions have been answered, use the table on the last page to determine the results of this assessment.

A.     Amount, Complexity and Type of In-House Activities (includes operations, maintenance and ground).

1)      Amount of activities conducted at off hours.
2)      Complexity of activities conducted at off hours.
3)      Type of activities conducted at off hours.

Table 10‑6, Amount, Complexity and Type of In-House Activities

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder’s off-hour activities are acceptable.

B.     Amount, Complexity and Type of Outsourced Activities (includes operations, maintenance and ground).

1)      Amount of outsourced activities conducted at off hours.
2)      Complexity of outsourced activities conducted at off hours.
3)      Type of outsourced activities conducted at off hours.

Table 10‑7, Amount, Complexity and Type of Outsourced Activities

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder’s off-hour outsourced activities are acceptable.

C.     Facilities.

1)      Adequacy of off-hour, in-house maintenance facilities (e.g., lighting, HVAC, working on ramps, etc.).
2)      Adequacy of off-hour ground handling and servicing facilities.
3)      Adequacy of off-hour outsource maintenance facilities (e.g., lighting, HVAC, working on ramps, etc.).

Table 10‑8, Facilities

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder has adequate facilities (both in-house and outsource).

D.    Supervision and Maintainers (In-House).

1)      Reduction of off-hour supervisors.
2)      Qualifications and expertise of the off-hour supervisors.
3)      Reduction of non-supervisory off-hour personnel.
4)      Qualifications and expertise of non-supervisory off-hour personnel.

Table 10‑9, Supervision and Maintainers (In-House)

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three or more of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder has a very stable and qualified off-hour workforce.

E.     Supervision and Maintainers (Outsourcing).

1)      Effective oversight of off-hour outsourced activities.
2)      Adequate oversight of off-hour outsourced activities.
3)      Adequacy of the number of off-hour contracted personnel.
4)      Qualifications and expertise of off-hour contracted personnel.

Table 10‑10, Supervision and Maintainers (Outsourcing)

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three or more of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder has stable and qualified off-hour contracted personnel. Additionally, the certificate holder has adequate and effective oversight of outsourced activities.

F.      Air Carrier Management and Oversight.

1)      Adequacy of the operator's off-hour maintenance inspection department/system.
2)      Adequacy of the operator's maintenance of its Continuing Analysis and Surveillance System (CASS) Audit and Performance Monitoring System.
3)      Effectiveness of changeover procedures.
4)      Effective management of off-hour maintenance controlled through supervision, training, shift changeover, and RII.

Table 10‑11, Air Carrier Management and Oversight


Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three or more of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The air carrier management and oversight processes are stable.

G.    Current Compliance Status.

1)      The level of operator's cooperative relationship with the FAA certificate management team.
2)      Compliance culture of the operator.
3)      Number of airworthiness regulatory enforcement actions.
4)      The results of Safety Performance Analysis System (SPAS) indicators.

Table 10‑12, Current Compliance Status

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three or more of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder is compliant.

H.    Training.

1)      Adequacy of air carrier’s training provided to off-hour maintenance, operations and ground personnel.
2)      Effectiveness of air carrier’s training provided to off-hour maintenance, operations and ground personnel.
3)      Adequacy of air carrier’s outsourced training provided to off-hour maintenance, operations and ground personnel.
4)      Effectiveness of air carrier’s outsourced training provided to off-hour maintenance, operations and ground personnel.

Table 10‑13, Training

Score     Word Picture

1-2       Concerns exist about the certificate holder regarding three of the above issues.
3-5       Concerns exist about the certificate holder regarding two of the above issues.
6-7       A concern exists about the certificate holder regarding one of the above issues.
8-9       A minor concern exists about the certificate holder regarding one of the above issues.
10        The certificate holder has an adequate and effective off-hour training program.

I.       Overall Score. After all the questions have been answered, add all the scores to obtain the overall score. Using the table below, determine what actions are necessary to ensure adequate surveillance is being planned for the operator.

Table 10‑14, Overall Score

Overall Score     Actions

8-40      The operator seems to have major issues with off-hour activities. Begin a Risk Management Process immediately and closely track all issues of concern.
41-56     The operator seems to have some issues with off-hour activities. Use an ACAT or SEAT to further determine a course of action.
57-80     The operator does not seem to have any issues with off-hour activities. However, if particular areas of concern exist, those must be addressed.
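The scoring arithmetic described above can be sketched in a few lines of code. The following is a minimal illustration only, not part of the order: the function name, input format, and action strings are hypothetical; the bands (8-40, 41-56, 57-80) are taken from Table 10-14.

```python
def assess_off_hour_capability(scores):
    """Sum the eight decision aid question scores (areas A-H, each 1-10)
    and map the total to the action band from Table 10-14."""
    if len(scores) != 8 or not all(1 <= s <= 10 for s in scores):
        raise ValueError("Expected eight scores, each between 1 and 10")
    total = sum(scores)
    if total <= 40:        # 8-40: inadequate capability
        action = "Initiate RMP"
    elif total <= 56:      # 41-56: moderate capability
        action = "Retarget surveillance (complete ACAT)"
    else:                  # 57-80: strong capability
        action = "Continue current surveillance program"
    return total, action

# Example: eight question scores totaling 52, which falls in the
# moderate (41-56) band and calls for retargeted surveillance.
total, action = assess_off_hour_capability([6, 7, 5, 8, 6, 7, 5, 8])
```

Note that the minimum possible total is 8 and the maximum is 80, which is why the three bands in Table 10-14 cover exactly 8-80.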

RESERVED. Paragraphs 10‑398 through 10‑412.


Volume 10 Air Transportation Oversight System

CHAPTER 6  The Certification Process of Part 121 Air Carriers

Section 1  General Information

10-413 PURPOSE. This chapter provides detailed guidance on the certification process of Title 14 of the Code of Federal Regulations (14 CFR) part 121 air carriers. During the certification process, the certificate-holding district office (CHDO) and the AFS-900 certification team will form a Certification Project Team (CPT). The CPT will follow the Certification Process Document (CPD) found in 8900.1 Volume 10, Chapter 6, Section 2, Certification Process Document. Under no circumstances will an applicant be certificated until the CHDO, regional Flight Standards division (RFSD) offices, and AFS-900 are confident that the prospective certificate holder is able to provide service at the highest degree of safety in the public interest.

10-414 INITIAL INQUIRIES OR REQUESTS. Initial inquiries about certification or requests for application may come in various formats from individuals or organizations. These inquiries may be in writing or in the form of meetings with CHDO personnel. Requests for applications may come from inexperienced and poorly prepared individuals, from well‑prepared and financially sound organizations, or from individuals and organizations ranging between these extremes. Upon initial contact, CHDO personnel should provide the applicant with a Preapplication Statement of Intent (PASI) and request that they return the completed form to the CHDO. CHDO personnel should also provide the applicant with the address of the certification process page, http://www.faa.gov/safety/programs_initiatives/oversight/atos/air_carrier/, and advise them that the information found on this website will assist them during the certification process.

Note:   CHDO personnel should become familiar with at least the information in Phase I of the CPD.

10-415 AFS‑900 NOTIFICATION.

A.     Certification Services Oversight Process (CSOP) Initiation. When the PASI is acceptable, the CHDO manager will initiate the CSOP found in FAA Order 8000.92, Certification Services Oversight Process for Original Organizational Certification.

B.     E‑Mail. Per the CSOP process, the CHDO manager shall notify AFS-900 Managers via e‑mail. The completed PASI should be attached to the e‑mail.

10-389 CERTIFICATION PROCESS DOCUMENT.

A.     Background. The CPD provides step-by-step work instructions to the CPT. Any deviations from this process must be requested per the Quality Management System AFS‑900‑001 deviation process.

B.     Process. The CPD contains work instructions organized into four phases and three gates. A phase separates the certification process into related activities supporting a specific function. A gate is a set of prerequisites that must be met before proceeding to the next step. Within each phase, the CPD provides detailed guidance in the form of action statements that must be accomplished to fulfill its function. This document identifies the person responsible for each action.
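The phase-and-gate structure described above can be pictured as a sequence of phases, each closed by a gate whose prerequisites must all be met before the next phase opens. The sketch below is purely illustrative: the phase names come from this chapter, but the data structure, function name, and one-line gate prerequisites are hypothetical stand-ins, not the actual Gate I through Gate III requirements.

```python
# Illustrative model of the CPD's four phases and three gates.
# Gate prerequisites here are placeholders, not the real gate checklists.
PHASES = [
    ("Application",              ["formal application package accepted"]),  # Gate I
    ("Design Assessment",        ["all manuals accepted or approved"]),     # Gate II
    ("Performance Assessment",   ["proving tests completed"]),              # Gate III
    ("Administrative Functions", []),  # final phase; no gate follows
]

def next_phase(current, completed):
    """Advance only when every prerequisite of the current gate is met;
    otherwise remain in the current phase. Returns None after the last phase."""
    names = [name for name, _ in PHASES]
    idx = names.index(current)
    _, prereqs = PHASES[idx]
    if all(p in completed for p in prereqs):
        return names[idx + 1] if idx + 1 < len(names) else None
    return current
```

The key design point the sketch captures is that a gate is a hard checkpoint: no partial credit, and no advancing until every prerequisite is satisfied.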

10-416 Phase 1: Application. This phase begins when the applicant submits a request for a formal application meeting to the CHDO. During this phase, the AFS‑900 certification team briefs the CHDO on the certification process. A formal application meeting is tentatively scheduled after the FAA receives all submissions required in the Preapplication Checklist (PAC). The CPT reviews the applicant’s PAC submissions for completeness and accuracy before confirming the formal application meeting date. During the formal application meeting, the applicant’s management personnel must demonstrate knowledge of their air carrier’s system design. Phase 1 ends when the CPT accepts the formal application package and all Gate I requirements are met.

10-417 Phase 2: Design Assessment. The CPT evaluates the design of the applicant’s operating systems to ensure their compliance with regulations and safety standards, including the obligation to provide service at the highest level of safety in the public interest. This phase uses Safety Attribute Inspections (SAIs) to collect data that will be used to determine if the air carrier’s system design meets all regulatory requirements. Phase 2 ends when all manuals have been accepted or approved, and all Gate II requirements have been met.

10-418 Phase 3: Performance Assessment. Inspectors use Element Performance Inspections (EPIs) during this phase to collect data that will be used to determine if the applicant’s systems are performing as intended and producing the desired results. This phase requires the operation of an aircraft to aid in the assessment of the applicant’s system design. Proving tests begin only after all Gate III requirements are met. Phase 3 ends after the successful completion of the proving tests.

10-419 Phase 4: Administrative Functions. This phase provides for completion of all administrative functions (e.g., issuance of the air carrier certificate and operations specifications).

RESERVED. Paragraphs 10‑420 through 10‑434.


Volume 10 Air Transportation Oversight System

CHAPTER 6  The Certification Process of Part 121 Air Carriers

Section 2  Certification Process Document

10-435 GENERAL. This section contains the text of the Certification Process Document (CPD) in its entirety. This section also provides direct links to reference material including briefing guides, meeting agendas, training requirements, and other guidance material used during the process.

10-436 PHASE 1—APPLICATION.

1.1       Applicant Requests Formal Application Meeting

1.1.1    Applicant‑Contact the certificate‑holding district office (CHDO) to schedule a formal application meeting date. Make this request at least 45 calendar days prior to the proposed formal application meeting to allow the FAA to prepare resources. The items listed below from the Pre-application Checklist (PAC) (refer to “Pre-application Checklist”) must be submitted at this time:

·        Formal application letter

·        Completed Management Qualification Summary Form and Quality Audit Form

·        List of proposed operations specifications (OpSpecs)

·        An up-to-date Pre-application Statement of Intent (PASI) if there have been any changes to the original PASI

·        A proposed schedule of events (Refer to “Schedule of Events”)

1.1.2    CHDO‑Advise the ATOS leadership team that an applicant has requested a formal application meeting, and e-mail a copy of the submitted documents to ‘AVS-AFS900-ATOS-Leadership Team’.
1.1.3    CPM‑Request a new labor distribution reporting (LDR) project code by completing the LDR Form for New Project Code and Revisions (MS Word) at: http://intranet.faa.gov/FAAemployees/Org/LineBusiness/AVS/Offices/AFS/LDR/

1.2       Establish Certification Project Team

1.2.1    AFS‑900 Certification Section Manager‑Assign the AFS-900 certification team.
1.2.2    CTL‑Review the applicant’s submissions to become familiar with the applicant’s operation.
1.2.3    CHDO Manager and CTL‑Identify certification project team (CPT) members.
1.2.4    CPM‑If a data evaluation program manager is not assigned to the project, ensure a CPT member(s) is assigned to act as the data reviewer(s).
1.2.5    CTL‑Create a Project Management Tool (PMT) record per AFS‑900‑001‑WI‑02 process.

1.3       For